Hacker News

Apparently the sun outputs radio waves in the THz range (sub-millimeter waves), so I don't think this is anything to worry about. Also, the strength of the radio waves coming from our devices is really weak, nowhere near the sun's level.



Exactly, the sun's output is mostly higher-energy radiation, from hundreds of terahertz (visible light) up toward ionizing ultraviolet.

The concern with 5G seems to revolve around its use of higher-frequency millimeter-wave radiation, compared to 3G/4G, which has shown no repeatable damaging results at normal power levels.

If higher frequency = worse (which, all else being equal, is true), then comparing 5G to the sun is not dishonest IMO. I am trying to give some perspective, to show how odd it seems to worry about extremely low-power cellular radiation while giving little thought to the extremely powerful nuclear radiator in the sky.

Dosage matters: cellular frequencies will cook you given enough power, and so will visible light. The power levels we are talking about do not generate enough heat to damage our tissue, so if they do harm, it would have to be through some other, unknown process. Should we keep looking for possible other processes? Yes, of course. However, I would be much more concerned about the much more powerful visible-range artificial radiators around us every day, like the monitor I am staring at right now, emitting 100 times the radiation of my cellphone right at my face all day long.


Probably nothing.

I know I sound a little cavalier here, but c'mon, if you're going to repeat the "oh noes, radio waves are giving me $TERRIBLE_DISEASE!" conspiracy theory stuff, you gotta provide some peer-reviewed, replicable (and replicated) research to back it up.

We've had a huge variety of radio frequencies shooting through various layers of the atmosphere for generations at this point. Sure, some frequencies have only been around and pervasive relatively recently, and we should be watching for longer-term effects. But there's nothing reputable to suggest that the stuff floating around now is unsafe.

For reference, the stuff coming down from satellites is much lower frequency (usually IEEE L-band; 1-2GHz) than even WiFi (IEEE S & C bands). Sure, power levels are a bit higher -- though significantly attenuated by the time it gets to ground level -- but if WiFi isn't causing problems, it's vanishingly unlikely that satellite transmissions would be dangerous to humans.

To summarize: I'd be more worried about sunburn and skin cancer when I forget to put sunblock on in the summer.


Could be that the problem is not the sun, but the lack of walls. Inside, the radio waves can bounce off the walls, around your body. Outside, they have to go through your body.

The strength of those signals is very, very low: WiFi is around -60dBm or lower most of the time, and the cellular signal where I go is usually well below -80dBm. For comparison, the power received from Alpha Centauri A, part of the nearest star system beyond the Sun, is on the order of -50dBm. If I'm concerned about WiFi and cellular signals, I should be positively freaked out about natural sources of EM radiation.
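To see what those dBm figures mean in absolute terms, here is a quick sketch of the unit conversion (the signal levels are the ones quoted above; the conversion itself is standard):

```python
def dbm_to_watts(dbm: float) -> float:
    """dBm is decibels relative to 1 mW: P(W) = 1e-3 * 10^(dBm/10)."""
    return 1e-3 * 10 ** (dbm / 10)

# The levels mentioned above: all are nanowatts or less at the receiver.
for label, dbm in [("WiFi", -60), ("cellular", -80), ("Alpha Centauri A", -50)]:
    print(f"{label}: {dbm} dBm = {dbm_to_watts(dbm):.1e} W")
```

So -60dBm works out to a nanowatt, and -80dBm to ten picowatts, which is why these numbers feel so small once converted.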

But having a couple-watt transmitter pressed against the side of your head might be something to think about. Still, it is non-ionizing radiation, and aside from the possibility of localized heating, which you would likely notice (I think?), what's going to cause cancer? Some previously unknown (to me) phenomenon?


Does the sun bathe us in RF power across a huge spectrum, far beyond what is measured for cellular/WiFi/broadcast, etc.?

50 GHz is still significantly less energetic than visible light, which is in the hundreds of terahertz. The sun bombards us with a lot of visible light, but you have to get into the blue/violet/ultraviolet region before it's energetic enough (high enough frequency) to cause damage. Barring some strange and previously unknown special biological interaction with a particular radio frequency, radio signals which have lower frequencies and lower intensities than visible light from the sun are not going to cause cancer any more than exposure to red or green light would cause cancer.

And the sun emits ionizing radiation, which radio devices do not. And ionizing radiation is known to damage DNA and cause cancer.
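The energy argument above can be made concrete with per-photon energies, E = h·f. The ~12.4 eV ionization threshold used here is a rough illustrative figure (exact thresholds vary by molecule; UV photochemistry can damage DNA at lower energies too):

```python
H = 6.626e-34   # Planck constant, J*s
EV = 1.602e-19  # joules per electron-volt

def photon_ev(freq_hz: float) -> float:
    """Energy of a single photon at the given frequency, in eV."""
    return H * freq_hz / EV

# Representative frequencies: 5G mmWave vs. green light vs. far UV.
bands = {
    "5G mmWave (28 GHz)": 28e9,
    "green light (560 THz)": 560e12,
    "UV-C (1.2 PHz)": 1.2e15,
}
for name, f in bands.items():
    print(f"{name}: {photon_ev(f):.2e} eV (vs ~12.4 eV to ionize)")
```

A 28 GHz photon carries around a ten-thousandth of an electron-volt, several orders of magnitude short of even visible light, let alone ionization.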

Molecules don't resonate at 5G frequencies. Even with 5G mmWave, the wavelengths are about 10,000x too long (millimeters, versus the micron-scale wavelengths of molecular vibrational resonances). It's like riding on a cruise ship in the ocean and worrying that the wine glass in your hand is going to resonate and shatter from the ocean waves.
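The scale mismatch can be sketched with λ = c/f. Taking ~1 micron as a rough wavelength scale for molecular vibrational resonances (an illustrative assumption; actual absorption bands span the near/mid-IR):

```python
C = 3e8  # speed of light, m/s

f = 28e9                 # a representative 5G mmWave frequency, Hz
wavelength = C / f       # free-space wavelength, ~10.7 mm
resonance_scale = 1e-6   # ~1 micron, assumed molecular resonance scale

print(f"28 GHz wavelength: {wavelength * 1e3:.1f} mm")
print(f"ratio to molecular resonance scale: {wavelength / resonance_scale:.0f}x")
```

That ratio lands right around the "10,000x too long" figure.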

The impact on tissue is still thermal, and the power flux densities of 5G are small compared to the sun's, especially because RF bandwidths are tiny compared to the sun's blackbody radiation spectrum.

You would have to show why these RF transmissions are more pathological than the random-process EM waves generated by the solar blackbody. If you look at a modern cellular OFDM waveform, it is almost statistically indistinguishable from broadband noise: it has the same peak-to-average power ratio, the same flat spectral occupancy, and the same affinity for causing resonances. It's just much, much lower in amplitude than the solar background.
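The "indistinguishable from noise" point can be checked numerically. This sketch (my own illustration; subcarrier count and seed are arbitrary) builds an OFDM symbol from random QPSK subcarriers and compares its peak-to-average power ratio (PAPR) to complex Gaussian noise:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1024  # number of subcarriers (arbitrary choice)

def papr_db(x: np.ndarray) -> float:
    """Peak-to-average power ratio of a complex signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# OFDM time-domain symbol: IFFT of random QPSK constellation points.
qpsk = rng.choice(np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]), size=N)
ofdm = np.fft.ifft(qpsk)

# Complex Gaussian noise of the same length, unit average power.
noise = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)

print(f"OFDM PAPR:  {papr_db(ofdm):.1f} dB")
print(f"Noise PAPR: {papr_db(noise):.1f} dB")
```

Both come out in the same ~10 dB ballpark, which is exactly why OFDM needs so much amplifier headroom, and why it looks like thermal noise on a spectrum analyzer.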


Any source for your claim?

BTW, it is completely ridiculous to think the Sun is more dangerous than manmade wireless devices.

Plus, how come there weren't reports of people (self-)diagnosed with "hypersensitivity" to EM waves before the "wireless age"? By your claim, the billions of people who lived before us must have felt similar effects on their bodies, but as far as I know, they were fine.


We're talking about GHz frequencies, whereas sunlight is hundreds of terahertz. The latter interacts with human cells, and a portion of it causes skin cancer. The former (non-ionizing radiation) causes a rise in temperature or passes right through.

People tried so hard to find the smallest negative effects of non-ionizing radiation (without p-hacking or silly surveys) and failed miserably. This is one of those rare cases where absence of evidence is evidence of absence.


> is now concentrated in that beam, with huge parts of it absorbed by the skin? How is this not a health concern?

Have you ever been outside? Felt the warmth of the sun on your skin? That warmth is (up to) 1000W/m^2 of high-frequency EM waves being absorbed by your skin (mostly visible and infrared, with a small ionizing UV component).

So in light of that, if you want to be consistent, you should start worrying about non-penetrating EM waves from 5G when you feel them heat your skin up.

Of course, skin-penetrating waves are another story entirely, but the point of the article is that they seemingly don't.


Although we should always be studying potential risks, I have little concern.

Just keep it in perspective: you walk around outside under a nuclear-fusion fireball, receiving around 1000 watts per square meter of radiation at hundreds of terahertz and up, some of it ionizing and known to cause cancer.

WiFi and cell (including 5G) are in the gigahertz range and non-ionizing, which means they can basically heat your flesh like a microwave and nothing more, and they run at milliwatt power levels from the phone's transmitter and maybe a hundred watts from the tower's.
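A back-of-the-envelope sketch of these power fluxes (the tower power, distances, and body area are my illustrative assumptions, and antenna directionality is ignored for simplicity):

```python
import math

SOLAR_IRRADIANCE = 1000.0  # W/m^2, the clear-sky figure used above
BODY_AREA = 0.7            # m^2 of skin facing the source (assumption)

# Cell tower: ~100 W radiated, spread over a sphere at 100 m.
tower_flux = 100 / (4 * math.pi * 100**2)  # W/m^2

# Phone held ~2 cm away at ~0.2 W: a crude hemispherical upper bound.
# Near-field flux is locally non-trivial, but over a tiny patch of skin.
phone_flux = 0.2 / (2 * math.pi * 0.02**2)  # W/m^2

print(f"sun on body:  {SOLAR_IRRADIANCE * BODY_AREA:.0f} W")
print(f"tower flux:   {tower_flux:.2e} W/m^2")
print(f"phone flux:   {phone_flux:.1f} W/m^2 (over a few cm^2 only)")
```

The tower's flux at ground level is roughly a million times below solar irradiance; only the phone pressed against your head gets anywhere near the same order, and only over a few square centimeters.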

This would be like worrying about what an LED flashlight (the phone) or a street light (the cell station) does to you.


Sure, ionizing vs. non-ionizing, but millimeter waves do cause localized heating, and the intensity of millimeter-wave radiation from 5G at close range is orders of magnitude stronger than the Sun's output in that band.

Is it consequential? Probably not, but the OP's original comparison of 1000 W of Sun vs. 100 W from a tower and milliwatts from a cell phone is disingenuous.


Yeah, non-ionizing radiation from the Sun is harmful too, e.g. when looking at the sun with naked eyes, or forgetting to use sunscreen. But those are not normally considered RF.

Sunspots don't cause an appreciable increase in RF background at any frequency at sea level. They cause a lot more excitation of the ionosphere, so shortwave radio signals bounce better and further, but nothing up in UHF.

But other phenomena correlated with sunspots do cause UHF interference, so while wrong on the details, I generally believe the story.


The issue is that the energy absorbed is literally too low to do anything. We're talking about heating your body up by 0.1 °C, so little that it is only measurable with extremely sensitive equipment under lab-controlled conditions.

I'd worry more about the ordinary heat given off by devices like laptops. The radio emissions aren't gonna do much, but the 80C hot air blowing out of the exhaust into your lap is definitely going to do something. There are plenty of studies on the effects of overheating the genitals in males.

But back to the topic: the energy is too small. If these radio waves had an effect, being out during a cloudy day would too; the energy you get from solar radiation over a day is probably about the same.

Things like DNA breaking apart generally don't happen at these low frequencies, largely because they require a lot of energy per photon. Even 1000W of 100MHz radio waves isn't going to make it happen: there is no amount of energy you can dump into a 100MHz transceiver that would ionize atoms. It simply won't happen. Well, they will ionize, but not because of the radio waves; rather because they've heated up enough to start forming a plasma. That semantic difference is still important.

The transmitter in your cellphone, or the energy you get from your WiFi router, is too small for its effects to be measured; by all accounts of physics, the effects are too small to show up even in long-term studies. We're talking literally sub-milliwatt over your entire body, while the body itself dissipates at least 100 watts just to stay alive: a <0.1% change in thermal energy.
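The fraction claimed above checks out with simple arithmetic (the sub-milliwatt absorbed figure is the comment's own generous upper bound):

```python
absorbed_rf_w = 1e-3  # generous upper bound on absorbed RF power, W
metabolic_w = 100.0   # rough basal metabolic heat output of a human, W

fraction = absorbed_rf_w / metabolic_w
print(f"RF adds {fraction:.0e} of the body's own heat ({fraction * 100:.3f}%)")
```

That is a thousandth of a percent, well under the 0.1% bound stated, and far below the day-to-day swings in metabolic heat from simply walking around.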


Is it generally accepted that these wavelengths, and the devices that operate at them, are safe around humans? Should I feel concerned at all having so many large antennas and amplifiers and whatnot near me?

I never really stopped to ask this of my phone either, I guess.


It’s non-ionizing radiation, so worst case it would “only” cause radiation burns. I don’t know what frequency the transmitter uses or how deeply the waves can penetrate, so you could also end up with “internal sunburn” as well as external burns, depending on the power and penetration.

Sunlight includes ionizing radiation -- the ultraviolet component is capable of knocking electrons off their atoms, damaging nearby biological systems. 2.4 GHz radio waves cannot do this; absorption just causes local heating (in this case, truly infinitesimal amounts).

I don't know whether that meets your standard of convincing, but it is fantastically well-established science.

