Everything gives you cancer ...

FluffyMcDeath said:
Glaucus said:
I think I've read something on that before. On the other hand, it seems that radiation of this kind is very bad for animals. Perhaps having fewer animals to worry about is one of the things that's good for the plants.
You may be right about the radiation not being good for mammals, but they probably can't read the Russian warning signs either. Unless the levels are so bad that they drop dead instantly, they'd still have time to get in and have a bite to eat.
Ah, but what radiation are we talking about? Alpha, beta, and gamma radiation have been a part of the environment since there was an environment. Microwaves, not so much.
Good point.

[Attached image: Solar_Spectrum.png (solar spectral irradiance vs wavelength)]


I have no idea how much radiation a wifi router emits, but I'd be curious to compare it to the chart above. It looks like we're getting around 0.1 W/m^2/nm in the 2.4 GHz range at sea level, and maybe 0.2 if you live higher up. No idea how a wifi router would compare, but according to the following article, it's probably no more than 1 W, total.

Built On Facts - WiFi and Radiation

And the total power output by a WiFi transmitter is many orders of magnitude less than a microwave oven - 1 watt tends to be an upper limit for home and business transmitters, while any person standing around would only absorb a tiny fraction of that tiny fraction. The temperature increase from absorbing WiFi signals is not measurable, and mathematically speaking is itself dwarfed by other radio/microwave sources such as cell phones and (depending on your location) broadcast radio and TV.
 
Glaucus said:
I have no idea how much radiation a wifi router emits, but I'd be curious to compare it to the chart above. It looks like we're getting around 0.1 W/m^2/nm in the 2.4 GHz range at sea level, and maybe 0.2 if you live higher up.
You're off by several orders of magnitude, I think. The IR part of the chart tops out at about 100 THz. The wavelength at 2.4 GHz is around 12 cm.
No idea how a wifi router would compare, but according to the following article, it's probably no more than 1 W, total.
If the background radiation were high enough to be comparable then it would be a pretty useless wavelength to do much with; the noise would swamp the signal.
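To put a rough number on that, here is a minimal back-of-the-envelope sketch of the thermal noise floor in a WiFi-sized channel (the 290 K temperature and 20 MHz bandwidth are assumed typical values, not figures from the thread):

```python
import math

# Thermal noise floor for a WiFi-width channel (kTB).
# Assumed values: room temperature (290 K) and a 20 MHz channel;
# a real receiver adds a few dB of noise figure on top of this.
k = 1.380649e-23   # Boltzmann constant, J/K
T = 290.0          # temperature, K
B = 20e6           # channel bandwidth, Hz

noise_watts = k * T * B
noise_dbm = 10 * math.log10(noise_watts / 1e-3)

print(f"Thermal noise floor: {noise_watts:.1e} W (about {noise_dbm:.0f} dBm)")
# Roughly 8e-14 W, about -101 dBm: the band is very quiet, which is why
# receivers can pick out signals that are tiny fractions of a milliwatt.
```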

And the total power output by a WiFi transmitter is many orders of magnitude less than a microwave oven - 1 watt tends to be an upper limit for home and business transmitters, while any person standing around would only absorb a tiny fraction of that tiny fraction. The temperature increase from absorbing WiFi signals is not measurable, and mathematically speaking is itself dwarfed by other radio/microwave sources such as cell phones and (depending on your location) broadcast radio and TV.

That site is still talking about ionizing radiation damage to DNA. The kind of mechanism that has been of concern is subtler than that. Breakage of the DNA is not an issue but resonant effects that cause sections of DNA to unzip can subtly alter transcription (and thus the chemistry of the cell) and leave unzipped DNA vulnerable to chemical attack (from free radicals). Pointing out that the mechanism of damage in the case of ionizing radiation is not applicable is valid, but is not the same as saying there is no effect. It would be like claiming that a knife can't be used to kill someone because it cannot fire bullets.
 
FluffyMcDeath said:
That site is still talking about ionizing radiation damage to DNA. The kind of mechanism that has been of concern is subtler than that. Breakage of the DNA is not an issue but resonant effects that cause sections of DNA to unzip can subtly alter transcription (and thus the chemistry of the cell) and leave unzipped DNA vulnerable to chemical attack (from free radicals). Pointing out that the mechanism of damage in the case of ionizing radiation is not applicable is valid, but is not the same as saying there is no effect. It would be like claiming that a knife can't be used to kill someone because it cannot fire bullets.
Isn't WiFi non-ionizing radiation?
 
faethor said:
Isn't WiFi non-ionizing radiation?
Yes, which is why it is redundant to say that it doesn't pack enough energy per photon to break chemical bonds (as that would practically be the definition of ionizing radiation) but that is not the mechanism of damage that is being claimed.

Now, I could try paraphrasing what I said before but right now I'm not sure whether you typed the above as an objection to what I said or an ironic agreement. So I'll just hold off until I get clarification.
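Just to put a number on the energy-per-photon point, a quick sketch (the few-eV bond energy is my own ballpark figure for a typical covalent bond, not something from the thread):

```python
# Per-photon energy of 2.4 GHz radiation vs a typical chemical bond.
h = 6.62607015e-34    # Planck constant, J*s
eV = 1.602176634e-19  # joules per electron-volt

f_wifi = 2.4e9        # Hz
photon_eV = h * f_wifi / eV

print(f"2.4 GHz photon: {photon_eV:.1e} eV")  # ~1e-5 eV
# A typical covalent bond is a few eV, so a WiFi photon falls short of
# bond-breaking energy by roughly five orders of magnitude, which is
# what "non-ionizing" means in practice.
```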
 
FluffyMcDeath said:
faethor said:
Isn't WiFi non-ionizing radiation?
Yes, which is why it is redundant to say that it doesn't pack enough energy per photon to break chemical bonds (as that would practically be the definition of ionizing radiation) but that is not the mechanism of damage that is being claimed.

Now, I could try paraphrasing what I said before but right now I'm not sure whether you typed the above as an objection to what I said or an ironic agreement. So I'll just hold off until I get clarification.
It was what it was, a question. I was confused by the wordiness and what you were trying to say.

The other thing to remember is the inverse square law. Since there isn't much power here to begin with (a couple of watts at most), exposure falls off quickly over even a small distance. It'd be interesting to see the study and how well other environmental factors were controlled: natural radiation emissions from the soil, changing climate, and the various other growth factors such as soil quality, cleanliness of the water, and exposure to the sun.
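To put a rough number on that, here is a minimal inverse-square sketch (it assumes a 1 W transmitter radiating equally in all directions, which a real router with antenna gain doesn't quite do):

```python
import math

# Inverse-square estimate of power density around a small transmitter.
# Assumes 1 W radiated isotropically; real routers are below this and
# have some directionality, so treat it as an upper-bound ballpark.
P = 1.0  # transmit power, W

for r in (0.1, 1.0, 2.0, 10.0):       # distance in metres
    s = P / (4 * math.pi * r ** 2)    # power density, W/m^2
    print(f"{r:5.1f} m : {s:.4f} W/m^2")

# At a couple of metres this is on the order of 0.02 W/m^2, versus
# roughly 1000 W/m^2 for full sunlight at the surface.
```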

EDIT: The Telegraph has a quick breakdown of this item along with links. 30 studies found no link to WiFi. About 6 studies did find a link, but their results could not be repeated; read: not verifiable. And the news event appears to be about a paper that hasn't yet been published for review.

Guys, it's okay, use your WiFi, the trees are fine.
 
FluffyMcDeath said:
You're off by several orders of magnitude, I think. The IR part of the chart tops out at about 100 THz. The wavelength at 2.4 GHz is around 12 cm.
You are correct there, I totally misread that chart. I was thinking MHz, not wavelength, despite the BIG BOLD LETTERS telling me otherwise! Doh!

If the background radiation were high enough to be comparable then it would be a pretty useless wavelength to do much with; the noise would swamp the signal.
You're right about that too. I guess that's why only such low wattage is needed for it to work.

That site is still talking about ionizing radiation damage to DNA. The kind of mechanism that has been of concern is subtler than that. Breakage of the DNA is not an issue but resonant effects that cause sections of DNA to unzip can subtly alter transcription (and thus the chemistry of the cell) and leave unzipped DNA vulnerable to chemical attack (from free radicals). Pointing out that the mechanism of damage in the case of ionizing radiation is not applicable is valid, but is not the same as saying there is no effect. It would be like claiming that a knife can't be used to kill someone because it cannot fire bullets.
You're correct about the ionizing radiation damage not being applicable, but I think the theoretical unzipping of DNA is associated more with terahertz (submillimeter) radiation, which begins at around 300 GHz, far higher than the 2.4 GHz/5.8 GHz of WiFi, which falls in the microwave range.
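A quick sketch of the wavelengths involved, just to show how far apart those bands sit (the 300 GHz figure is the conventional lower edge of the terahertz range mentioned above):

```python
# Wavelengths for the frequencies mentioned above: the two WiFi bands
# versus the low end of the terahertz (submillimeter) range.
c = 299_792_458.0  # speed of light, m/s

for label, f in [("WiFi 2.4 GHz", 2.4e9),
                 ("WiFi 5.8 GHz", 5.8e9),
                 ("THz band start (300 GHz)", 300e9)]:
    wavelength_mm = c / f * 1000
    print(f"{label:>25}: {wavelength_mm:8.2f} mm")

# About 125 mm at 2.4 GHz and 52 mm at 5.8 GHz, versus 1 mm at 300 GHz:
# WiFi sits two orders of magnitude below the terahertz band in frequency.
```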
 