I was wondering about the watts for a different reason...
From Wikipedia:
"Although the long term effects due to ultrasound exposure at diagnostic intensity are still unknown,[24] currently most doctors feel that the benefits to patients outweigh the risks.[25] The ALARA (As Low As Reasonably Achievable) principle has been advocated for an ultrasound examination — that is, keeping the scanning time and power settings as low as possible but consistent with diagnostic imaging — and that by that principle non-medical uses, which by definition are not necessary, are actively discouraged."
The difference here is that the 100 dB limit isn't some fundamental law of physics. It's one of the currently accepted standards for safe use of ultrasound. Taking a look through the literature, though, you can find studies reporting that ultrasound several orders of magnitude stronger has had no observable long-term adverse effects. Basically, there's an intensity level that we're pretty sure is harmful, and an intensity level that we're pretty sure is safe, and, as it happens, the intensity level necessary to make uBeam useful is somewhere in between.
Investing in uBeam requires a calculated wager that the threshold for ultrasound safety is a fair bit higher than 100 dB. It's a risky position to take, and I'm not sure how to calculate it, but it's not pure lunacy.
A giant 0.1 m^2 tablet charging at 12 W (120 W/m^2) needs only a ~140 dB field. A watch, which might need 1 W but with a cross section of more like 0.001 m^2 (for 1000 W/m^2), needs a 150 dB field.
If something intercepts the beam at 1 m instead of the 3 m where the device is charging, the beam's cross section is about 1/3 by 1/3 of its final size. Since the same energy (actually more, to compensate for attenuation through the air) passes through 1/9 the area, that's nearly a 10 dB increase.
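A quick back-of-envelope check of those numbers (a Python sketch, assuming the standard 10^-12 W/m^2 reference intensity for 0 dB):

    import math

    # Back-of-envelope check of the dB figures above, assuming the
    # standard reference intensity of 1e-12 W/m^2 for 0 dB.
    I_REF = 1e-12  # W/m^2

    def intensity_to_db(intensity):
        """Acoustic intensity (W/m^2) to dB re 1e-12 W/m^2."""
        return 10 * math.log10(intensity / I_REF)

    print(intensity_to_db(12 / 0.1))   # tablet: 12 W / 0.1 m^2  -> ~140.8 dB
    print(intensity_to_db(1 / 0.001))  # watch:   1 W / 0.001 m^2 -> 150.0 dB
    print(10 * math.log10(9))          # 9x intensity at 1/3 distance -> ~9.5 dB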
Is 160 dB of ultrasound (at 40 kHz or 110 kHz) safe... for a few seconds? For a few minutes? For a few seconds of exposure, daily? To your eardrum? To your eyes?
Higher energies are needed if the phone isn't perfectly oriented. Worst case is an edge-on orientation to the transmitter, presenting only a small cross section.
Higher energies are also needed for anything less than 100% transducer efficiency. I don't know what kind of engineering magic they've done for the transducer, but what percentage of the energy could a thin skin over a device possibly convert? 80%? 50%?
The beam could be 170 dB, or more.
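Putting rough numbers on those losses (the efficiency figures are pure guesses on my part, not anything uBeam has published):

    import math

    # Extra dB the field must carry so the same power survives a lossy
    # conversion step; the efficiencies here are guesses, as noted above.
    def extra_db(efficiency):
        return 10 * math.log10(1 / efficiency)

    print(extra_db(0.8))  # ~1.0 dB
    print(extra_db(0.5))  # ~3.0 dB
    print(extra_db(0.2))  # ~7.0 dB
    # 160 dB near the transmitter + ~3 dB for a 50%-efficient skin,
    # plus a few more dB for off-axis orientation, approaches 170 dB.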
I want to believe, but this is too sketchy without more information. Large companies have been conned out of millions by small teams peddling snake oil. I'd first want to see it demonstrated with nothing but the transmitter plugged into a socket (through a power meter), a phone at a known, low battery level, and nothing else with wires or metal in the room. Then I'd like to see a test of the transmitter aimed (from above) at a glass of water with a visible thermometer, to see what it does to water.
Agreed. You could substitute electromagnetic waves for ultrasound waves in this same conversation. We use EM waves every day for diagnostics, in the 380-740 nm range, because that's visible light. Go just a bit higher in frequency and you're into ultraviolet, which is harmful. And as for resonant frequencies, it doesn't seem unreasonable to me that certain frequencies could denature certain proteins or have other effects.
I wish the article had a reference to the actual power used. It SOUNDS like it's quite minimal, in which case it'd be hard to imagine you're dealing with biologically devastating energy.
That said, the unidirectional nature of this approach also means the amount of life that could be affected is far smaller than with sonar, where the signal is blasted in all directions at very high power.
I am not a doctor nor an expert in this area, but I would presume that would be a valid risk if the frequency were not tuned specifically to resonate the tumor before increasing the amplitude. And then there is the matter of focus and accuracy. Perhaps nih.gov has a study with numbers showing the attainable accuracy and what feedback the equipment gives the doctor to compensate and tune the signal. My understanding is that the inverse is generally true: there is too much heat, and the surrounding area is scarred.
I'm no physicist, but I think that while it may be theoretically possible to distribute power wirelessly, doing so in practice seems likely to be so dangerous as to be impractical. To quote a previous comment I made on HN:
To distribute sound over long distances 'wirelessly' you need to make it loud. That typically means cranking up the power. And ultrasound can be harmful at high power [1].
[1] "Occupational exposure to ultrasound in excess of 120 dB may lead to hearing loss. Exposure in excess of 155 dB may produce heating effects that are harmful to the human body..." https://en.wikipedia.org/wiki/Ultrasound
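For a sense of how "loud at the source" translates to levels nearby, here's a rough falloff sketch (the 40 kHz atmospheric absorption coefficient is an assumed ballpark; the real value varies with humidity and frequency):

    import math

    ABSORPTION_DB_PER_M = 1.3  # assumed ballpark for ~40 kHz in air

    def level_at(source_db_at_1m, r):
        """Approximate level at r metres, given the level at 1 m."""
        spreading = 20 * math.log10(r)            # spherical spreading
        absorption = ABSORPTION_DB_PER_M * (r - 1)
        return source_db_at_1m - spreading - absorption

    # A source that measures 155 dB at 1 m:
    for r in (1, 3, 10, 30):
        print(r, "m:", round(level_at(155, r), 1), "dB")
    # Anything near the source sits above the 120 dB occupational figure;
    # the level only drops below it a few tens of metres out.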
At present, we don't have enough evidence to be concerned about low-power EM radiation. However, this doesn't rule out the possibility that we might one day find a causal link between that kind of radiation and adverse health outcomes. What we do know is that this type of radiation can heat tissue. What are the long-term biological consequences of that? We don't really know. My advice: minimize your exposure out of caution, but don't get too worried about it.
Hmmm. It depends on the power, frequency, and distance to tissue. I remember a coworker at a GPS mfgr accidentally turned on a 440 MHz radio in high-power (25 W) mode with only a whip antenna, and he got a nasty radio burn when he absentmindedly wrapped his hand around it. The burn looked like a sunburn, or maybe 20 seconds in the microwave. Increased risk of cancer, for sure.
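Crude far-field math shows why that hurt. Note that a hand wrapped around a whip is really in the near field, so treat this as an order-of-magnitude illustration only (the ~1.6x linear gain for a whip is my assumption):

    import math

    def power_density(p_watts, gain_linear, r_m):
        """Far-field power density (W/m^2) at distance r_m."""
        return p_watts * gain_linear / (4 * math.pi * r_m ** 2)

    # 25 W into a whip (~1.6x linear gain assumed), hand ~2 cm away
    print(power_density(25, 1.6, 0.02))  # ~8000 W/m^2
    # Common RF exposure limits at these frequencies are on the order
    # of a few W/m^2, so a burn at contact distance is unsurprising.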
A fraction of a Watt every now and then might be okay, as long as it’s not bursting at multiple Watts or near the resonant frequencies of purine or water.
> uncertainty about the harmful effects of that radiation
Personally I don't think there is much there to be concerned about. We know these waves penetrate less than 1mm into your skin, and that the only way they can cause damage is by thermal effects. We know it's not ionizing radiation, not by a long shot. I believe getting a sunburn at the beach is far worse.
I think the privacy concerns, and the frequently low efficacy / high false-positive rate of these machines, are far more cause for concern.
Will they also recommend to opt out of millimeter wave scanners? If we're going to talk about risks of one type of nonionizing radiation, why not that one too?
Unfortunately none of this is actually safe for humans.
Limiting exposures by reference to watts per kilogram is fundamentally flawed, as it ignores the various effects that different frequencies of both electromagnetic and electrostatic influence have on the body.
E.g., very low power levels can do things like stop embryonic development in its tracks by preventing divided cells from sliding over each other to reach their correct relative positions.
Reducing this all to wattage is a mistake. Radio waves penetrate different materials, and impart their energy to them, differently (i.e. 80 W diffused vs 80 W concentrated into specific structures/compounds/reactions). Being inside a strong alternating field for prolonged periods could also tip the bias in favor of one biochemical reaction over another, which may have other health consequences over the long term.
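To put trivial numbers on the diffused-vs-concentrated point:

    # The same 80 W gives wildly different intensities depending on the
    # area it lands on:
    power_w = 80.0
    for area_m2 in (1.0, 0.01, 0.0001):  # 1 m^2, 10x10 cm, 1x1 cm
        print(area_m2, "m^2 ->", power_w / area_m2, "W/m^2")
    # 80 W/m^2 vs 8,000 W/m^2 vs 800,000 W/m^2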
It was a worthwhile experiment, and the results aren't particularly surprising.
I opted out of backscatter back in the day (and looking at the facts retrospectively I feel validated in doing so, for example the miscalibrated units found to be spitting out 20x more radiation than designed).
I don't opt out of millimeter wave today, as I've yet to see convincing scientific arguments or evidence that they're unsafe, or what the mechanism would be beyond mild tissue heating. I do choose to close my eyes when scanned, though; take that as you will.
I think the argument is that while the total power over the full 4π steradians equals the input power, constructive/destructive interference causes higher power output within some solid angle. But my understanding is that the limits are there to avoid interference with other devices, not for human safety, and that any reasonable amount of non-ionizing radiation is incredibly safe for humans. Congestion wouldn't be impacted by the beamforming, because on average any increase in intensity at one angle is countered by a decrease at some other angle.
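A toy sketch of that intensity redistribution (idealized isotropic elements, so the numbers are illustrative only):

    import numpy as np

    # N coherent emitters, each radiating unit power. Total power is N,
    # but at the beam peak the fields add in phase, so intensity there
    # scales as N^2: an N-fold concentration relative to one element.
    # Power at other angles drops correspondingly, conserving the total.
    N = 16
    field_peak = abs(np.sum(np.exp(1j * np.zeros(N))))  # in-phase sum = N
    peak_intensity = field_peak ** 2                    # = N^2 = 256
    total_power = N                                     # = 16
    print("array gain at peak:", peak_intensity / total_power)  # 16x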
Why would I weigh one more heavily? Maybe it's harmless, maybe it has bad effects (it isn't unprecedented for non-ionizing waves to have harmful effects).
It's just energy, which can, like any other electromagnetic energy source, be harvested by the right resonant mechanism. The chances of a particular frequency band at such low field strengths having an impact on a structure not expressly designed for it are vanishingly small. It would be like the evolutionary equivalent of finding a 'u-beam[1]'-ready receiver in something made in the 1920s.
[1] The fraudulent wireless ultrasound charging company.
Agreed. The primary concern is tissue heating that cannot be sufficiently dissipated. This is nearly impossible at "wifi" power levels (below 1 watt) without carefully contrived situations, such as placing your eyeball (the organ least equipped to dissipate heat) at the exact focal point of a specially designed parabolic dish.
I have been safely working around thousands of watts of RF power for 20+ years and have maintained my 20/15 vision.