2450MHz radiation induces stress and anxiety like behavior in rats (www.sciencedirect.com)
2 points by pitched | 2021-10-05 | 96 comments




I have a small apartment and realized my WiFi router's power level was set to 100% by default. I dropped it to 25% with no meaningful reduction in bandwidth. It's a simple precaution, and while I don't believe I'm sensitive to RF fields, maybe my neighbor's children are.

The issue is that the ISP doesn't want complaints, so those routers are always set to auto-hop at maximum speed.

Since most apartment complexes use the same company, the hops always happen around the same time, so you basically have these routers screaming at maximum speed and moving between channels.


What are you talking about? Your wifi router isn't jumping channels unless you have something fairly sophisticated (read: auto channel assignment that recalibrates periodically when nobody is connected).

Don't most consumer wifi routers do that by default by now?

I don't know if it hops when nobody is connected, but I certainly found mine hopping while I was connected. It'd hop around until it found a channel where wifi didn't work, and then stay there.

I had to force a channel.


Scanning for the least congested frequency is very common, but it's only performed right before bringing the AP interface up, and at no other time. So sweeping the band with a live AP is very impractical.

Some of the higher-end routers have an extra radio specifically for scanning. E.g., see Cisco:

https://meraki.cisco.com/product/wi-fi/indoor-access-points/...

This thing will scan channels, as well as perform security operations.


Enthusiast (e.g. vanilla OpenWrt) and enterprise gear won't, but SOHO and consumer routers totally do, because in high-density, non-cooperative environments (apartments) it has some potential to make things suck less, and people like and pay for those "smarts".

Right, but you can't frequency-hop without disconnecting most clients. Generally speaking, most modern clients connect on 5GHz anyway; it's just some generic things like wifi scales or legacy devices that might only be in the 2.4 spectrum.

Also, I thank you on behalf of your neighbors for reducing Wi-Fi interference. If only everyone would reduce power to the minimum needed to cover their living space, interference would be less and performance would be better.

100mW (19.8dBm, as per TFM) up close, delivered direct by a waveguide (as per TFM) is a whole lot of energy for something the size of a rat. Hardly a surprise that it would have an effect. Add more power and you can cook them.

I have 30 wifi networks around me, to which I'm exposed 24/7. There is something to be said about long-term effects, even if they aren't as high-power.

but .... inverse square law ..... before you worry about wifi, worry about the one known source of EMR that makes bigger, fatter photons that provably do cause cancer .... you know the one, in the sky; go out when it's visible and you can actually feel the photons

I think there is a significant fraction of HN for whom worrying about wifi instead of the sun would actually be worrying about their primary source of EM exposure (including myself until I got a dog).

I also read that brain cancer on the right side has increased, supposedly due to cell phone use since most people are right-handed, and that rats got brain cancer. And didn't California release health guidelines?

https://www.cdph.ca.gov/Programs/CCDPHP/DEODC/EHIB/CDPH%20Do...

While radio frequencies aren't ionizing radiation, it seems they are still capable of causing biological changes. I read they cause an ion imbalance or something in cells.

I'd really like to see a study done: have 50 kids take the SAT in a Faraday-caged room and another 50 in a normal open space with routers, cell phone towers, etc. I really want to know if it affects your thinking.


All the potential health hazards, excluding brain cancer, may be related to a high-stress life that makes you use your phone a lot.

About the brain cancer thing, I highly doubt cell phone radiation could do anything significant as it is very low power.

Also, the study you linked only talks in vague terms like "may", "could", "still evolving", etc.

To me, it looks like a document put out to assuage some section of the electorate.


but ... penetration and frequency. They are important too. Of course that only means that 1000W of microwave has different effects than 1000W of light, but I'm pretty sure that both 100mW of microwave and 100mW of light are pretty insignificant.

Depends on the power density. A 100mW laser is seriously dangerous, it can cause permanent eye injury in an instant.

What is the power output range of those 30?

How far are you from them?

Is there anything between you and them?


what exactly is a waveguide? How much stronger is that compared to something like a directional antenna at 1W rubbing your face?

It is a pipe that moves radio/light/sound waves from one place to another, usually between where they are created and where they are being used such as between transmitter and antenna or, in this case, between transmitter and rat in a cage. They look like normal pipes but inside have internal structure (ridges) designed to keep the waves propagating with minimal loss. In theory, a fiberoptic cable is a waveguide for light.

https://en.wikipedia.org/wiki/Waveguide


> In theory, a fiberoptic cable is a waveguide for light

In practice too. Your statement makes it sound like it's an exotic property.


The difference with optics is that they are solid, unlike sound/RF waveguides which are hollow. A fiberoptic cable just doesn't seem very pipe-like and isn't normally called a waveguide.

seems like a flaky distinction

A waveguide as it pertains to RF is just an enclosed hollow metal pipe that can carry RF energy. The size of the tube determines the ideal frequency it can carry (without excessive losses). I believe at 2.4GHz, a common waveguide is roughly the diameter of a tennis ball canister, so it's probably easy to put a rat directly inside the waveguide (I can't read the article so I'm not sure if they do this). In theory though, there is no reason a directional antenna couldn't be used to deliver the same amount of power; the waveguide might simply be more convenient.
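
To put rough numbers on the size-vs-frequency relationship, here's a sketch using the WR-340 standard guide dimensions (my addition; the article may have used something else):

    # Cutoff frequency of the dominant TE10 mode in a rectangular
    # waveguide: f_c = c / (2a), where a is the broad-wall width.
    # WR-340 (86.36 x 43.18 mm) is the standard guide for 2.45 GHz.
    C = 299_792_458.0  # speed of light, m/s

    def te10_cutoff_hz(width_m):
        return C / (2.0 * width_m)

    print(f"TE10 cutoff: {te10_cutoff_hz(0.08636) / 1e9:.2f} GHz")  # ~1.74 GHz
    # 2.45 GHz sits comfortably above cutoff, inside WR-340's recommended
    # 2.2-3.3 GHz band, and the ~86 x 43 mm bore really is about the
    # cross-section of a tennis ball canister.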

In the early days of widespread WiFi there were quite a few blog posts documenting the construction of directional antennas using Pringles cans.

Directional antennas come in different forms, each with slightly different characteristics, so it's very hard to say. You're also talking about a lot less mass in a mouse than in a human head. With the waveguide at close enough range, you can assume a much higher percentage of the energy is hitting the mouse than with, say, a Yagi, but the specifics depend a lot on the design of the system.

To answer the other part of the question, you can think of a waveguide like a tuned pipe for RF to travel down. In a lot of cases they are literally just empty tubes that are correctly tuned for the frequency fed into them.


Is heating the mechanism of action here?


Microwave ovens are the other source of 2.4GHz. There's probably a reason they used 2450MHz as the test frequency: a magnetron from an oven is cheaper and easier to obtain.

https://hypertextbook.com/facts/1998/HowardCheung.shtml


They also used 900 and 1800MHz.

Some say Havana syndrome is caused by microwave radiation.

And some say that it might be caused by very loud crickets. Because none of the electronics in the area died.



The sun also emits microwaves, so stay out of the sun.

As someone with sun allergy, yes, I do stay out of the sun. Being exposed without ample sunscreen for even three minutes can have a pretty immediate effect. Of course, that’s due to UV, not microwave.

This is a 2019 study. Would be more interesting if anyone else had managed to replicate it.

Reading the study doesn’t inspire a lot of confidence in their methods. They make a lot of basic errors like listing dB figures (a relative term) when I think they mean dBmw (an absolute measurement).

They tested multiple frequencies but for some reason only 2450MHz showed significant effects. 900MHz and 1800MHz showed no effects, despite 1800MHz being relatively close to 2450MHz.

They also didn’t use very many rats for each group. Only n=6 per frequency, which were further divided into n=3 for some of their measurements because they couldn’t use the same rat for both tests. They also chose a lot of stress-related measures that are known to be heavily influenced by mishandling the rats (e.g. if a researcher accidentally dropped one during handling).

Unless someone can replicate these results and do it with more rats, I suspect the results are likely from one or maybe two rats having abnormal baseline results and that being amplified by the ultra-small sample size.
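
To illustrate how fragile n=6 is, here's a toy simulation (all numbers invented, not from the paper):

    # With n=6 per group and no true effect, a single animal with an
    # abnormal baseline drags the whole group mean.
    import random
    import statistics

    random.seed(42)

    control = [random.gauss(100, 10) for _ in range(6)]
    normal5 = [random.gauss(100, 10) for _ in range(5)]

    print(round(statistics.mean(control), 1))          # close to 100
    print(round(statistics.mean(normal5 + [100]), 1))  # close to 100
    print(round(statistics.mean(normal5 + [150]), 1))  # pulled up ~8 points
    # One outlier out of six shifts the group mean by nearly a full
    # standard deviation -- enough to manufacture a "significant" effect.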

As for relevance: Unless you’re standing in close proximity to a high-gain, directional antenna emitting continuous wave tones (not intermittent modulated WiFi packets) then your WiFi exposure won’t come anywhere near what they used for these experiments. Inverse square law and omnidirectional antennas mean that very little WiFi energy even makes it to people near a WiFi router.


Water has a strong absorption peak at 2450 MHz (it’s what kitchen microwaves use).

A microwave oven is 10000 times more powerful than a typical WiFi signal and is focused on an object less than a foot away.

From what I can find, it looks like the nearest absorption peak is around 20GHz, and while it is temperature-dependent, that peak doesn't shift all the way down to 2.45GHz.

This is not really true. While the microwave absorption of water is significant at 2.4 GHz the peak is (a) quite broad and (b) centered at 10GHz or higher (higher if the temperature is raised); see plot in [0]. So ascribing some "specialness" to this frequency due to the abundance of water in living systems isn't an explanation.

[0] https://en.wikipedia.org/wiki/Electromagnetic_absorption_by_...


To expand, apparently microwaving food right at the absorption peak sucks because of low penetration depth, heating only the top layer of the food. The reason 2.4 GHz was chosen was purely that that's where the ISM band already was at the time.

I'm trying to google a better answer; it seems the ISM bands were chosen when microwaves were a nascent technology, so it's a bit chicken-and-egg.

Yeah, I do not think I have a good source at hand, this is a fun fact from an E&M lecture we had.

I also wonder if anyone actually experimented with microwaving their dinner at 10GHz or if that was just an educated guess. X-band magnetrons for radars are widely available though, so I would not be surprised if that was coming from practical experience...


A review of the history of microwave cooking implies that the interaction of 10 GHz microwave radiation with water was understood by WWII.

"effective penetration and heating favors wavelengths in the cm range (1–30 GHz). Higher frequencies cannot penetrate sufficiently, and much lower frequencies require very high fields, and can have poor coupling unless making use of direct contact or inductive heating. In the years following WWII, emphasis was thus placed on frequencies in the low GHz region, where both the historic development of microwave power sources and serendipity converged."

https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9393469


Not really an answer, but here's some context...

In 1952, a paper called "Tubes for dielectric heating at 915 megacycles" said this: "Dielectric heating can be much faster at microwave frequencies. Because leakage radiation is large enough to interfere with communications, heating is best done at assigned frequencies. The bands at 915 and 2,450 megacycles are most useful."

https://ieeexplore.ieee.org/abstract/document/6371890


Ah, OK, 1947.

"The plan which is, recommended by the United States provides for certain discrete frequencies, with appropriate guard bands, for those industrial scientific and medical devices which cannot be shielded effectively. This assures a minimum disruption of vital communication services. The allocation of frequencies for this purpose is justified because of the tremendous investments in equipment and the public demand for the various services performed, together with the harmful interference to radio communications which would result if such devices were scattered indiscriminately throughout the spectrum. The frequencies proposed are:

13660 kHz

27320 kHz

40980 kHz

2450 MHz"

http://handle.itu.int/11.1004/020.1000/4.62.51.en.108, page 628.


I did much of this archival research (and wrote the Wikipedia section); let me know if you have questions.

2.6 was chosen based on empirical heat-penetration testing for typical target food sizes, e.g. potatoes.

GE lobbied for 2.4 instead so they could emit a noisy harmonic from a crystal-based 1.2 system, and the FCC agreed.

Source: national archives documents I researched


Cool! I came across some Quora answers that suggested the electronics were easy for 2.4, but they didn't give further detail. Noisy harmonics do sound useful for cooking!

microwaves use it because water absorbs it + multiple wavelengths can fit inside the oven for even heating + easy to produce without fancy electronics

water can get shaken up by a wide band of frequencies


Microwaves could have used almost any frequency that was cost-effective; 2.4 was a compromise between competing companies and empirical testing.

This is a peer reviewed article right? So presumably at least a couple other people in the field were satisfied with their methods. Are you suggesting they were wrong to accept this article?

> This is a peer reviewed article right?

So is this [0].

[0] https://juniperpublishers.com/ofoaj/OFOAJ.MS.ID.555850.php

"Peer reviewed" does not necessarily mean high-quality or true.


OMG - thank you! That is superb!

"Finally, we would like to acknowledge that this study would not have been possible had it not been for the predatory journal industry. Without it, academia and society would be a better place. "

Do you remember hydroxychloroquine? Just count the referenced peer-reviewed studies: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7534595/

The quality of the review process correlates with the prominence of the journal. Neurochemistry International is deep in the second quartile of Cell Biology journals. Those kinds of papers need to be read with extra care.

Not who you're replying to, nor an expert in the field, but yeah. This is at most an exploratory experiment that requires replication.

The fact that no rat behavior change was reported until the 3-week mark heavily suggests external factors; that's hardly the cumulative pattern a long-exposure effect would display.

Moreover an interesting paragraph:

> The average rectal temperatures of rat were observed to be 33.71±0.56 (control), 35.3±0.63 (EMR-900 MHz), 35.48±0.67 (EMR-1800 MHz) and 36.96±0.55 (EMR-2450 MHz) before and 33.91±0.82 (control), 35.5±0.51 (EMR-900 MHz), 35.66±0.60 (EMR-1800 MHz) and 37±0.78 (EMR-2450 MHz).

Seems to indicate the 2450 MHz rats were ill or otherwise stressed (rat body temperature hovers around 35.5, though it can increase at night), but no verification or mention of this is made whatsoever. I honestly believe many of the effects described in the paper can be explained by the body-temperature differential.

Also worth mentioning:

> The animals after being exposed to 900, 1800 and 2450 MHz EMR, the average power density was found to be 0.1227 W/m2. While the overall value of whole body average SAR was found to be approximately, 0.0227, 0.030 and 0.0616 W/kg.

Again, not my field, but I doubt 120mW/m2 can result in 50% absorption (61.6mW) per kg of rat. That means 2 kg of rats would absorb 100% of the radiation while obviously not covering a full square meter. Please correct me if I misunderstood something.
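
Spelling that arithmetic out (a sanity-check sketch; the rat mass and silhouette are guesses, only the power density and SAR come from the quoted text):

    # Does 0.1227 W/m^2 plausibly deliver 0.0616 W/kg to a rat?
    power_density = 0.1227   # W/m^2, quoted average
    sar_2450 = 0.0616        # W/kg, quoted whole-body SAR at 2450 MHz

    rat_mass = 0.25          # kg, typical adult lab rat (assumption)
    rat_silhouette = 0.01    # m^2, rough body cross-section (assumption)

    absorbed = sar_2450 * rat_mass            # ~0.0154 W actually absorbed
    needed_area = absorbed / power_density    # ~0.126 m^2 of perfect absorber

    print(f"{absorbed * 1e3:.1f} mW over {needed_area:.3f} m^2")
    # ~0.13 m^2 is more than ten times a rat's silhouette, so either the
    # field was far from a uniform plane wave (plausible inside a
    # waveguide/chamber) or the reported figures don't fit together.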

I'm not saying that 2.4GHz radiation has no ill effects; there's been tons of research on rats since the inception of the microwave oven, but it generally shows changes at higher (and certainly more clear-cut) SAR levels.


There's more peer reviewing happening on HN than in the actual peer review.

Peer review in and of itself is of limited value, and usually reveals cosmetic, as opposed to structural, errors. How many code reviews can be summarized by the letters "L", "G", "T", and "M"?

Replication matters. The echo chamber of peer review much less so.


> usually reveals cosmetic, as opposed to structural, errors

I agree; science has become a form of writing, and peer review will mostly just figure out whether you are familiar with that form.


Most actual peer reviewing consists of bogus, facile comments by people who haven't even bothered to read the paper (when they're not in cahoots with the author and journal).

> when I think they mean dBmw (an absolute measurement).

I think you may mean dBm or dBmW here. 0dBm == 1mW.

The SI symbol for Watt is ‘W’ not ‘w’.

At 2.4GHz, free space path loss is -41dB at 1m, so even with a 20dBm radio and a 20dBi directional antenna (and assuming 0dB of insertion loss), the signal strength at 1m is -1dBm, or about 0.79 mW.

At 3m, the free space path loss is nearly -50dB. Now the signal from the setup above is at -10dBm, or 0.1mW.

https://ecfsapi.fcc.gov/file/10417871820142/2018%2BRF%2BRadi...
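
Here's that arithmetic as a sketch (the 20dBm/20dBi rig is the generous example above, not a typical router):

    # Free-space path loss:
    # FSPL(dB) = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)
    import math

    C = 299_792_458.0  # m/s

    def fspl_db(freq_hz, dist_m):
        return (20 * math.log10(dist_m) + 20 * math.log10(freq_hz)
                + 20 * math.log10(4 * math.pi / C))

    tx_dbm, tx_gain_dbi = 20.0, 20.0

    for d in (1.0, 3.0):
        rx_dbm = tx_dbm + tx_gain_dbi - fspl_db(2.45e9, d)
        print(f"{d:.0f} m: {rx_dbm:.1f} dBm = {10 ** (rx_dbm / 10):.2f} mW")
    # 1 m: ~-0.2 dBm (~0.95 mW); 3 m: ~-9.8 dBm (~0.11 mW).
    # Close to the -1 dBm / 0.1 mW figures above, and nowhere near the
    # continuous ~100 mW delivered by waveguide in the study.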


1800MHz is not that close to 2450MHz. Behavior in air is different, for one, and the properties that dictate this, the vibrational modes of atoms and so on, are extremely selective.

Not just atoms; you could be hitting something specific in terms of resonance at 2.45GHz.

Maybe a rat's skull size is perfectly resonant at that frequency, who knows.


2.45GHz ≈ 12.2 cm wavelength

resonance seems plausible
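
For reference, the math behind that (the skull figure is a rough guess on my part):

    # Wavelength at 2.45 GHz and the usual resonant fractions.
    C = 299_792_458.0  # m/s

    wl_cm = C / 2.45e9 * 100
    print(f"full wave:    {wl_cm:.1f} cm")      # ~12.2 cm
    print(f"half wave:    {wl_cm / 2:.1f} cm")  # ~6.1 cm
    print(f"quarter wave: {wl_cm / 4:.1f} cm")  # ~3.1 cm
    # A rat skull is on the order of 3-5 cm, so any resonance would be
    # nearer a quarter wave than a clean half-wave cavity.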


2.4GHz resonates with water. That's why it is also used in microwaves to heat our food!

An urban legend. Microwaves cause heating across a wide range of frequencies.

Not really:

Consumer ovens work around a nominal 2.45 gigahertz (GHz)—a wavelength of 12.2 centimetres (4.80 in) in the 2.4 GHz to 2.5 GHz ISM band


"Microwaves" as in "the part of the electromagnetic spectrum we call microwaves".

And as they correctly point out, the fact that consumer ovens use a specific frequency has nothing to do with it being a resonance frequency of water; it's just a reasonable and legal frequency to use. They would work on 1 GHz or 5 GHz too.


>"Microwaves" as in "the part of the electromagnetic spectrum we call microwaves".

The parent wasn't speaking about "the part of the electromagnetic spectrum we call microwaves" in general, but about what's "used in microwaves to heat our food", i.e. microwave ovens.


I'm fairly sure they were speaking about the meaning of the word where their assertion is correct and makes sense. The urban legend part is "2.4 GHz was chosen because it's a resonance frequency" after all, not "2.4 GHz is what's used in microwave ovens".

Yes I meant EM waves here, not microwaves which can be taken to mean the devices.

Check my submission history for my creds in this regard.


This is an Australian affectation. We call microwave ovens microwaves.

Have they controlled for coil whine and similar (inaudible-to-humans) artifacts? It might explain a difference in frequency "response".

There are also the reports about Wifi affecting plants.

https://www.eurekaselect.com/141391/article

But "Wifi allergies" seem like a myth.

https://onlinelibrary.wiley.com/doi/abs/10.1002/bem.20536


Just switched my wifi channel! 6 and below are rat safe.

https://en.wikipedia.org/wiki/List_of_WLAN_channels
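
For anyone checking the joke's math, here's the channel plan from that page (22 MHz is the classic 802.11b channel width):

    # 2.4 GHz Wi-Fi channel centers: 2412 + 5*(ch - 1) MHz, ~22 MHz wide.
    for ch in range(1, 14):
        center = 2412 + 5 * (ch - 1)
        print(f"ch {ch:2d}: {center} MHz, top edge {center + 11} MHz")
    # Channel 6 tops out at 2448 MHz; channel 7 already reaches 2453 MHz.
    # Hence "6 and below" keeps the whole channel under 2450 MHz.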


Rat Lives Matter (more)?

obviously Rat Lives Matter Most

Bluetooth operates at ~2400MHz. I wonder if any animal studies were done before wireless earbuds were made commercially available.

Sure. They may not have been published, though.

Just a reminder that if 20 interesting things are studied, on average you should expect 1 of them to be a false positive at p<0.05. It's a bit like throwing darts: it helps if there exists a relationship, but it also helps to be lucky.

relevant xkcd: https://xkcd.com/882/
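
A quick simulation of that point (t > 2.0 stands in for p < 0.05 at this sample size; all numbers illustrative):

    # Run 20 comparisons per "study" with no true effect anywhere and
    # count how often at least one comes back "significant".
    import random
    import statistics

    random.seed(0)

    def welch_t(a, b):
        va, vb = statistics.variance(a), statistics.variance(b)
        diff = statistics.mean(a) - statistics.mean(b)
        return abs(diff) / (va / len(a) + vb / len(b)) ** 0.5

    studies, hits = 500, 0
    for _ in range(studies):
        hits += any(
            welch_t([random.gauss(0, 1) for _ in range(30)],
                    [random.gauss(0, 1) for _ in range(30)]) > 2.0
            for _ in range(20))
    print(hits / studies)  # ~0.64, matching 1 - 0.95**20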


In theory, it's 1 in 20. In practice, it's much, much higher, closer to 2 out of 3: https://en.m.wikipedia.org/wiki/Replication_crisis

A fancy way of saying that microwaving rats makes them warmer and more comfortable. Now where's my $100K research grant?

You’re an idiot.

Oddly enough, microwaving frozen mice was one of the very first non-radar applications of the microwave magnetron.

https://www.youtube.com/watch?v=2tdiKTSdE9Y


My father used to serve in the Russian strategic rocket forces in the 70s (the nukes basically, which, ironically, would then be pointed straight where I sit today in the US, since we have some decommissioned ICBM shafts straight in the middle of the neighborhood), and they did a much cruder version of these experiments for "fun". They'd kill rats by throwing them into the path of a particularly powerful and massive radar beam. I doubt rats felt comfortable or particularly non-anxious, however, seeing how they'd be dead before hitting the ground after that.

Even if it were proved beyond doubt that it adversely affected people, nobody would want to hear it. It took half a century to make the case against smoking, and here the cause/effect is even less immediate, with far larger industries and conveniences associated.

As a network engineer, I can reliably state that rats show no discrimination when chewing twisted-pair cable segments, whether the cable connects 2.4 or 5 GHz APs, whether it runs Fast or Gigabit signalling, and whether it's inline-powered or not.

I wish 802.11b/g had any vermin-repellent capability, but it doesn't.


Maybe they’re attacking it to make it stop.
