Wi-Fi has become such an ingrained part of our everyday lives that we tend not to give it much thought unless it has stopped working. But what if your family has a newborn baby in the house? Are there any dangers that new parents should be aware of?
Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.
SuperUser reader avy wants to know if Wi-Fi could actually be harmful to his family’s newborn baby:
I am most likely being an overprotective parent, but since the birth of our newborn, my wife and I have been wondering about credible studies dealing with Wi-Fi and health concerns. I love my Wi-Fi, it’s the cornerstone to all my gadgets and computer setup throughout our house, and it makes my world easier plain and simple, but having a newborn enter that world changes the way I think about everything.
Now before people start writing that Wi-Fi is safe because they use it in hospitals and schools, let me be clear, I’m aware of all that, but the idea of having it 24/7 for years to come around this little person that is our responsibility to look out for makes me want to have a definitive answer to the subject.
I will put on my tin foil hat and await some well thought out/educated answers.
Could Wi-Fi pose a danger to a newborn baby or is it just a bit of unnecessary paranoia?
SuperUser contributors NothingsImpossible and Bob have the answer for us. First up, NothingsImpossible:
Disclaimer: this is a very simplified explanation, and some of the inaccuracies are intentional.
Radiation can be separated into two categories: ionizing radiation and non-ionizing radiation.
In layman’s terms, ionizing radiation is radiation that can “break” the molecules that make up things.
Non-ionizing radiation, on the other hand, just passes through objects or is converted to heat when it hits them.
Wi-Fi networks operate on roughly the same frequency as a microwave oven. Wi-Fi uses non-ionizing radiation: when it hits an object, it is simply absorbed as heat and does not change the composition of the object itself. It is harmless; at most it will warm your body by a very, very tiny amount that is not even measurable.
Ionizing radiation is dangerous. Examples include ultraviolet rays and nuclear radiation. It does not just heat you; it can also change the composition of the molecules that make up your body, modifying the DNA of your cells and potentially causing cancer.
Example: sunburn. Your skin burns after long, unprotected exposure to the sun not because it got hot, but because the sun's UV rays damaged the DNA of your skin cells, and the body reacts with the burning sensation.
Conclusion: Wi-Fi is harmless.
Followed by the answer from Bob:
The term “radiation” is often used to scare people. Let’s get it straight. There are two factors – frequency and intensity. Frequency has a far larger effect on how damaging radiation is. Wi-Fi and other radio communications use a very low frequency – far below visible light.
Radiation that actually causes issues, could potentially cause cancer, etc., is usually ionizing radiation. It has a very high frequency and can cause mutations in DNA, possibly leading to cancer. The frequency required to be ionizing? At least around 1,000,000 GHz. That is literally hundreds of thousands of times higher than the frequencies Wi-Fi transmits on, 2.4 GHz or 5 GHz. Non-ionizing radiation, which Wi-Fi falls under, does little more than transfer heat.
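The frequency comparison above can be made concrete with photon energy, E = h·f. As a rough sketch (the 1,000,000 GHz figure and the "a few eV to ionize" threshold are the simplified values from the answer, not precise physical cutoffs):

```python
# Compare photon energies (E = h * f) for Wi-Fi vs. the ~1,000,000 GHz
# ionizing threshold quoted above. Breaking chemical bonds takes photons
# of roughly a few electron-volts (eV); Wi-Fi photons fall far short.
PLANCK_EV = 4.1357e-15  # Planck's constant in eV·s

def photon_energy_ev(freq_hz):
    """Energy of a single photon at the given frequency, in eV."""
    return PLANCK_EV * freq_hz

wifi = photon_energy_ev(2.4e9)  # 2.4 GHz Wi-Fi band
uv = photon_energy_ev(1e15)     # ~1,000,000 GHz, near-ultraviolet

print(f"Wi-Fi photon: {wifi:.2e} eV")  # ~1e-5 eV, far too weak to ionize
print(f"UV photon:    {uv:.2f} eV")    # ~4 eV, enough to damage molecules
```

No matter how many Wi-Fi photons hit you, each one individually carries about a hundred-thousandth of the energy needed to break a bond, which is why intensity alone cannot make Wi-Fi ionizing.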
Did you know light is also EM radiation? Yup. In fact, visible light (~500,000 GHz on the near-infrared side, ~750,000 GHz near the ultraviolet) is much closer to ionizing radiation than Wi-Fi is. Sunlight actually contains some ionizing radiation (UVB and UVC; UVA can also cause DNA damage, though not in the same way). But you’re not going to hide in your house for the rest of your life, are you?
Apart from frequency, there is intensity. Non-ionizing radiation can also be damaging, but only at much higher intensities. And ionizing radiation is not always dangerous; our bodies can cope with lower intensities, which is why we don’t all die in the sun (vampires are another matter). Wi-Fi has a transmitting power usually far under 1 watt (I’ve seen figures around 200 mW). And most of that energy never reaches you: by the inverse-square law, the power you receive falls off as 1/distance². In layman’s terms, the energy spreads equally in all directions. Ten meters away? (1/10²) × 200 mW = 2 mW. That’s nothing.
Microwave ovens (which operate on a similar frequency to Wi-Fi) generate ~1000 watts, highly focused inside that metal box. Only maybe 1 watt can escape through the shielding, and even that is considered perfectly safe. To put all this in perspective, sunlight (which is a higher frequency, and therefore more energetic) delivers about 1000 watts per square meter when it hits the ground, half of which is visible light or higher.
You might also find some interesting sources and studies cited on a similar question on Skeptics.SE.
Make sure to look through the rest of the lively discussion on the topic at SuperUser via the link below!
Have something to add to the explanation? Sound off in the comments. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.