
It would also fry your electronics and be incredibly noticeable and very easy to prove.

Also, 10 000 W wouldn't be enough, because that power isn't going into a single resonant cavity; we'd be talking about pulses with instantaneous power an order of magnitude higher at the very least.




I don't want to sound like a neophyte here, but I'm afraid of beams that carry that much power. I have no problem at all letting the milliwatt beam of my cell phone go through my head, but let's just say I don't want my bedside alarm clock powered by this just yet.

Coupled resonance seems safe for stuff that doesn't resonate in between, but when I hear "beam" and "watts" I still freak.


It doesn't take THAT much power. The paper says the effects took an average power density of ~1 mW/cm^2, with a 0.0001 duty factor. That's not much from a heating standpoint, but extremely strong for detectability. That average power density is about what your phone delivers to your head.

The peak power is massive, but it doesn't last long, so the total energy is small. To get the equivalent peak power, you'd have to put 10k phones against your head.
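The average-to-peak arithmetic is just division by the duty factor; here is a minimal sketch using only the numbers quoted above (~1 mW/cm^2 average and a 0.0001 duty factor):

    # peak vs. average power density, using the figures quoted above
    avg_density_mw_cm2 = 1.0      # ~1 mW/cm^2 average, roughly a phone held to your head
    duty_factor = 1e-4            # fraction of time the pulse is actually on

    peak_density_mw_cm2 = avg_density_mw_cm2 / duty_factor
    print(peak_density_mw_cm2)    # 10000 mW/cm^2 = 10 W/cm^2 peak,
                                  # i.e. roughly "10k phones" worth of peak power density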

If you put your head in a 1 kW microwave oven and pulse-modulated the magnetron, you could maybe get the effect. It would make an entertaining YouTube video.


11kW in the 1GHz-1THz range, in a location where a civilian bystander could accidentally move himself into the beam, isn't going to happen. Not even 1kW.

If you look into an operating microwave magnetron (don't), you go blind within seconds from the lenses of your eyes getting cooked and turning opaque. Surgery could theoretically replace them (and leave the retina intact), but I'm not aware of that ever happening.

It's quite feasible for a sizable eNodeB to consume 11kW, but keep in mind that they have significant amounts of compute hardware.

It'd help if you'd make a bit more of an effort to check/validate/cite the concrete numbers you name, if you don't want to be seen as a "confused anti-5G person". I don't think you are malicious, considering your history, but not everyone will.


but ... penetration and frequency are important too. Of course that only means that 1000W of microwave has different effects than 1000W of light, but I'm pretty sure that both 100mW of microwave and 100mW of light are pretty insignificant.

Your home router is emitting 0.1 Watt of 2.4 GHz radiation, your microwave uses the same frequency at 1000W.

Overheating your brain with 10000W nanosecond pulses could have very strange health effects...
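For a rough sense of scale between those two numbers, a minimal free-space sketch, assuming both behave as isotropic radiators (which an oven's enclosed magnetron emphatically does not in practice):

    import math

    def power_density_w_m2(tx_power_w, distance_m):
        # far-field isotropic estimate: S = P / (4 * pi * d^2)
        return tx_power_w / (4 * math.pi * distance_m ** 2)

    d = 0.5  # metres, roughly arm's length
    print(power_density_w_m2(0.1, d))    # home router: ~0.03 W/m^2
    print(power_density_w_m2(1000, d))   # unshielded 1000 W magnetron: ~318 W/m^2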


Also, I'm pretty sure you will not want a 10 watt microwave emitter a couple of inches from your brain.

Well, you could definitely burn yourself with a powerful mm wave generator; but I'm not sure if that really says anything about safety.

Overheating your brain with 10000W nanosecond GHz pulses could have very strange health effects.


> I'm guessing this is more of a "we're heating the electronics beyond their operational capacity" than "we're inducing surges in the traces and overloading circuitry" kind of killing.

I suspect it's the opposite: very high-power but short duration microwave pulses that will induce sufficiently high voltages to blow insulation somewhere important in the target device. I don't know enough about antenna/dish design to estimate what frequency band it operates in and the $15M price tag is low enough to rule out some of the really advanced microwave sources, but we could still potentially be looking at multi-MW pulses.
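As a very rough feel for what "multi-MW pulses" means in field-strength terms, a sketch using the standard far-field relation E = sqrt(30 * EIRP) / d; the 10 MW EIRP and the distances are made-up illustrative values, not anything from the article:

    import math

    def peak_e_field_v_m(eirp_w, distance_m):
        # far-field estimate of electric field strength from EIRP: E = sqrt(30 * EIRP) / d
        return math.sqrt(30 * eirp_w) / distance_m

    # hypothetical numbers, purely to show the order of magnitude
    print(peak_e_field_v_m(10e6, 100))   # ~173 V/m at 100 m
    print(peak_e_field_v_m(10e6, 10))    # ~1730 V/m at 10 m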


My point was that it is ridiculously easy to get it all wrong and suddenly achieve unintentionally strong radiated power. Again, it was a crude thought experiment with a wild number of variables to be considered.

Also you'd get energy beams being fired around your living space. Seems a bit iffy to me that you could do this without frying random things, biological and otherwise.

> is it possible for someone to burn you from inside by emitting the right microwave frequency with enough power?

I guess this is always the case. No need for chips.


It already astounds me greatly how little the average person knows about the basics of EM fields, when we've essentially been immersing ourselves in them for over 100 years.

The main danger is power level causing heating/burns - and a microwave operating at 1000W+ is going to do that far more easily than a router with 1W at most.


It is not ionizing radiation; the primary hazard of microwave radiation is its heating effect on sensitive tissue with poor blood circulation -- in particular, it has been implicated in the development of cataracts in the eye. So to a first approximation, a few milliseconds of 1000W output is really only as hazardous as a few minutes of the typical leakage you get with the door closed (somewhere on the order of 5 mW).
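A quick sanity check on that "few milliseconds vs. a few minutes" equivalence, using the 5 mW leakage figure above; the 2 ms exposure time is just an example value:

    full_power_w = 1000.0    # magnetron output with the door open
    exposure_s = 0.002       # a couple of milliseconds (example value)
    leakage_w = 0.005        # ~5 mW typical door-closed leakage

    energy_j = full_power_w * exposure_s        # 2 J
    equivalent_s = energy_j / leakage_w         # 400 s
    print(equivalent_s / 60)                    # ~6.7 minutes of door-closed leakage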

Curiosity. Why not? It’s not inherently dangerous. I’ve had real RF burns from tuning high power microwave amps.

The RADAR tick is EE folklore: it comes from your brain expanding a few microns at the pulse rate. As I said, it takes a huge power density to do this.

Cataracts are an issue with long term exposure. I guess we’ll know in the near future if this group of people has a higher incidence of cataracts.


Yes, it is obviously anecdata, N=1 and all that. But trust me, 1 kW is nothing to sneeze at: the fluorescent tubes in the building I was in would light up spontaneously, as would those as far as I could see wherever the antenna pointed, a couple of blocks at least. So if there were a direct effect, and not just some very weak statistical one, you would expect a case like that to register. This is not unlike smoking, where chugging 4 packs per day tends to have consequences. And then of course, everybody has that great-uncle who did smoke 4 packs per day and still runs the half marathon at 85. Maybe I'm that guy, but I think the whole EM->cell or DNA interaction is a load of bunk until you get into extreme power at very short range.

A kW of HF when you touch it, or a lot less than that at GHz+ frequencies, will do real damage to tissue; I have the scar to prove that one.

The induced power falls off very rapidly with distance, especially for omni-directional radiators. The 120' antennae that are common on cell phone installations have relatively low efficiency and radiate 100 W or so per antenna, maybe 105 W at the foot of the mast, unless there is some kind of giant impedance mismatch, and if there is, you'll just fry an end stage.
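To illustrate the falloff, a minimal sketch treating such an antenna as a 100 W isotropic radiator (real panel antennas are directional, so this understates the on-axis figure and overstates everything else):

    import math

    def density_w_m2(power_w, d_m):
        # free-space isotropic approximation: S = P / (4 * pi * d^2)
        return power_w / (4 * math.pi * d_m ** 2)

    for d in (1, 3, 10, 30):
        print(d, "m:", round(density_w_m2(100, d), 3), "W/m^2")
    # 1 m: ~7.96, 3 m: ~0.88, 10 m: ~0.08, 30 m: ~0.009 W/m^2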


> On the other hand, having a 400 kW EIRP source anywhere nearby should be immediately obvious to a NSA or probably anyone with almost any radio receiver, the interference would be terrible.

This was exactly my thinking, somebody in the local area surely would have noticed this thing was turned on. You're not going to dump 400kW of energy without screwing with something.
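A rough Friis-style sketch of why a 400 kW EIRP source would be hard to miss on pretty much any receiver; the 3 GHz carrier, 1 km distance and 0 dBi receive antenna are illustrative assumptions, not figures from the thread:

    import math

    def received_dbm(eirp_w, freq_hz, distance_m, rx_gain_lin=1.0):
        # Friis free-space estimate of received power from an EIRP figure
        wavelength_m = 3e8 / freq_hz
        p_rx_w = eirp_w * rx_gain_lin * (wavelength_m / (4 * math.pi * distance_m)) ** 2
        return 10 * math.log10(p_rx_w * 1000)

    print(received_dbm(400e3, 3e9, 1000))   # ~ -16 dBm at 1 km with a 0 dBi antenna,
                                            # vs. noise floors typically below -90 dBm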

> So yes, this indeed exceeds health limits (in .cz: 50 W/m^2 for workers, 10 W/m^2 for civilians) and is kind a lot.

Is that an enormous difference? It's three times the health limit, but how long would you need to hold it there to get health problems? My guess is it would take quite a while to fry your noodle.
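To put those limits in context, a rough free-space estimate of where a 400 kW EIRP beam would sit relative to them; the distances are illustrative, and this ignores duty cycle and the averaging times the real limits use:

    import math

    eirp_w = 400e3
    # .cz limits quoted above: 50 W/m^2 for workers, 10 W/m^2 for civilians

    for d in (10, 30, 100, 300):
        s = eirp_w / (4 * math.pi * d ** 2)   # on-axis free-space power density
        print(d, "m:", round(s, 2), "W/m^2")
    # 10 m: ~318 (well above both limits), 30 m: ~35 (between them),
    # 100 m: ~3.2 (below the civilian limit), 300 m: ~0.35 W/m^2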


I guess my hangup here is that a <10W (with a Raspberry Pi's GPIO and no amp, we are probably talking <1W in reality, but throw on a factor of 10 for safety) RF circuit with an antenna a few cm long seems really easy to make by accident, let alone on purpose.

It is illegal, sure, and these people shouldn't be doing it intentionally, but I have a hard time seeing it causing damage measurable in actual dollars.
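For reference on the "antenna a few cm long" part, quarter-wave whip lengths at a few common unlicensed frequencies; plain lambda/4 arithmetic, nothing specific to the Pi:

    C = 3e8  # speed of light, m/s

    def quarter_wave_cm(freq_hz):
        # quarter-wavelength antenna length in centimetres
        return 100 * (C / freq_hz) / 4

    for f_mhz in (433, 915, 2400):
        print(f_mhz, "MHz:", round(quarter_wave_cm(f_mhz * 1e6), 1), "cm")
    # 433 MHz: ~17.3 cm, 915 MHz: ~8.2 cm, 2400 MHz: ~3.1 cm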


It's possible that better shielding on the device would be more of a curse than a blessing. It would increase device size, and it could absorb the waves and heat up as a result, leading to serious brain damage.