Paul Ben Ishai
Just muddling through.......

The sorry story of cell phone radiation exposure — how did we get here? Part I

This is going to be a long post, so I will split it into two parts:  Part 1 and then its thrilling sequel!

Let me describe my day.  I wake at 6:15 to a nagging tune, which I duly ignore until I can no longer do so.  Dragging myself through a shower, I dress and am downstairs in the kitchen.  Making myself coffee and feeding the animals (I do mean animals, a dog and a cat, not my children), I read the news and my emails, WhatsApps, etc.  Then I drive to work, answering calls (never making them) on the speakerphone.  Arriving at work, I check my schedule and make my day: emails, documents, Skype, Zoom, Microsoft Teams, teaching and meetings.  Much of this activity is ‘on the go’.  At the end of the day I am at home, still checking social media and replying to friends or family.  I try to go to bed the same day that I got up.

My lifestyle is typical, and possible only because I, like you, live by the cellphone.  And this is the paradox of our modern life.  We cannot go back to the time before: to the calming routine of a newspaper printed on paper, to a leisurely cup of coffee accompanied perhaps by the radio, to a quiet commute and the certainty of a day ruled only by a schedule and meetings.  I doubt very much we would want to.  We are married to our phones.

It is a hectic life and it takes its toll on us.  We have “morphed” (I wouldn’t say evolved) into “Tech Men”, a form of humanity that remembers simpler times but cannot exist unless surrounded by tech. Our natural environment now includes an unnaturally high dose of electromagnetic radiation. So, does this sea change in society mean that we must bow down to technology and do its bidding?  In particular, who should decide whether, and when, this technology is healthy?

Figure 1: The growth of ambient exposure as a function of year and frequency. The red area illustrates the last decade, and the little “hillocks” are the future 5G bands. We are already dangerously close to accepted limits. (Reproduced from [1])
We are standing before the next great leap of faith as technology moves us towards 5G and the Internet of Things (IoT). This is sold to us as an Asimov utopia where sensors and devices work for our safety and comfort. Not that we have ever asked for it, nor do we have a say in what it will be.  So, let’s look at one aspect of it: the exposure to microwave radiation emanating from the cellphone and its infrastructure.

Clearly, in a future where we are shrouded by sensors connected to a 5G wireless network and where most of our connections will be via our cellphones, our exposure to ambient microwave radiation is set to rise. A stinging letter by Bandara and Carpenter, published in 2018 in the prestigious journal The Lancet Planetary Health [1], and again in 2020 [2], highlights the unprecedented growth in exposure that we have been subjected to in the last decade.  Figure 1 shows it all.

The current allowable exposure standard for the general public, as recommended by the WHO [3], [4] and ICNIRP, is 10 W/m2 [5].  In Israel we pride ourselves on insisting on a limit one-tenth of this. Many experts in the field of non-thermal biological effects of cellphone radiation exposure recommend a level far lower still.  In fact, the head of the Russian National Committee on Non-Ionizing Radiation Protection, Yuri Grigoriev, recommends levels as low as 0.01 W/m2 [6], if not lower.  Why?  Because the safety standards of today were effectively fixed back in the 1950s, when Western science held that the only effect of exposure to microwave radiation was heating.  Nowadays we know that there are long-term cumulative effects that are non-thermal (not involving heating) in nature [7]–[12], and that these have definite consequences for public health.  So why are we stuck with an outdated safety standard?  To answer that you need to plunge down the rabbit hole and trace the history.

The Second World War was a great time for microwaves.  The British had invented the cavity magnetron in 1940, and that heralded a veritable new age: radar and the microwave oven (invented in 1946)!  And also injury.  As early as 1948, researchers at the Mayo Clinic noted that microwave radiation could cause cataracts in dogs [13].  By the 1950s, negative effects were being reported by personnel working with radar.  In 1953 a medical consultant, John T. McLaughlin, working for Hughes Aircraft, filed a report listing internal bleeding, leukemia, cataracts and headaches among the complaints that radar workers suffered from [13]. This piqued the interest of the military, and so in the West the question arose: what is the safe level?

Diathermy, the use of localized high-frequency electric currents to heat tissue, had been known from before the war. So, I suppose, it was only natural to think that the origin of all these effects was microwave heating.  In a very telling article dated 1961, W. Mumford outlined the early history [14].  To quote: “Most of the experimental work to date supports the belief that the chief effect of microwave energy on living tissue is to produce heat”. And that seems to have been the guiding principle. The most pressing concern in the 1950s was the eyes and testes of radar operators, and it is an important point to note that the concern was for professionals, not for the general public.

From the late 1940s until the early 1950s a number of industry and academic researchers tried to address the question.  One report that generated wide attention was by F. Hirsch, who described the formation of cataracts in a radar technician’s eyes as a result of direct exposure to microwaves [14].  He estimated the damage threshold to be 100 mW/cm2.  This report was noted by the US Air Force, and eventually, in 1957, a Tri-Services conference was set up and formally charged with investigating what was safe, still under the belief that the only mechanism was heating.  This endeavor was led by a certain Col. Knauf.  They reviewed much of the research that had been carried out at Bell Telephone Laboratories and other industrial laboratories, and commissioned much more.  Some of these studies would not be carried out today, as they would be considered unnecessarily cruel: “cooking” the testes of dogs, blinding rabbits, and even exposing animals to lethal power levels.  Eventually, in 1957, they came to a conclusion: the safe level would be 10 mW/cm2.  A provision was added by another great name in the field, Herman Schwan, that this should be for no more than an hour!  Continued research questioned even this value and recommended an additional safety margin.
The General Electric Company recommended only 1 mW/cm2. By 1959 Bell Laboratories recommended “between 1 and 10 mW/cm2 for incidental or occasional exposure” and “below 1 mW/cm2 for indefinite exposure”.  1 mW/cm2 is the ICNIRP limit today.  But that isn’t the whole story…
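The limits in this story are quoted in two different units: the historical US convention of mW/cm2 and the modern W/m2 used by the WHO and ICNIRP. Since 1 mW/cm2 equals 10 W/m2, the 1959 Bell recommendation for indefinite exposure is the same figure as today's 10 W/m2 public limit. A minimal sketch of the conversion (the function name and the label strings are my own, for illustration):

```python
def mw_per_cm2_to_w_per_m2(value_mw_cm2: float) -> float:
    """Convert a power density from mW/cm^2 to W/m^2."""
    # 1 mW = 1e-3 W and 1 cm^2 = 1e-4 m^2, so the factor is 1e-3 / 1e-4 = 10.
    return value_mw_cm2 * 10.0

# Limits quoted in the text, in mW/cm^2:
limits = {
    "Tri-Services 1957 (US)": 10.0,
    "Bell Labs 'indefinite exposure' 1959": 1.0,
    "Soviet standard": 0.01,
}

for name, mw in limits.items():
    print(f"{name}: {mw} mW/cm^2 = {mw_per_cm2_to_w_per_m2(mw)} W/m^2")
```

The same arithmetic shows the factor of 1000 between the 1957 US limit (10 mW/cm2) and the Soviet one (0.01 mW/cm2) discussed below.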

In 1978 a conference was held in Pennsylvania.  There, Dr. Morris Shore, at the time Director of the Division of Biological Effects, Bureau of Radiological Health of the FDA, admitted that the choice of 10 mW/cm2 was entirely arbitrary! [15]  Even more startlingly, he reported: “Thus it would appear that 10 mW/cm2 was established as a “safe exposure level” by the Air Force in 1957— “arbitrarily” in the words of Dr. Knauf.”  In fact, it was Bell Telephone Laboratories that fixed this limit, for its own convenience [14], [15].  And so began a tradition by which the “cat is set to guard the cream”.

All this was happening during the Cold War.  The Soviets were also trying to work out what to do with microwaves, and they too were asking the same question: just what is the safe limit?  However, the thinking over there was based on a radically different premise.  The Soviets accepted that there were serious biological effects of exposure that had nothing to do with heating.  They recognized that the immune response of their operators was suppressed by excessive exposure, and so they conducted a series of animal-model experiments to confirm this [16], [17]. Observing non-thermal effects and changes in animal behavior, the Soviets set their exposure limit 1000 times lower than the US, at 0.01 mW/cm2.   This, of course, led to intense interest in the intelligence community, and the CIA did its utmost to obtain the Soviet research. It was the release of some 1950s CIA translations onto the Internet that probably led to many a spurious conspiracy theory about the military nature of 5G.  After all, if the CIA are interested…?

However, I digress.  Sometime around the beginning of the 1960s the American standard, based on no more than a hunch of someone at Bell Labs [15], became the standard that rules us today.   The differing approaches to RF safety in the East and West were nicely summed up in a technical report by L. David, commissioned by the US Department of Energy in 1980 [17].  I quote:

“To a large degree, the differences in standards are based on contrasting philosophies. Koslov indicates several factors that contribute to the differing U.S. and Soviet definitions of permissible microwave exposure, and asserts that the U.S. and the Soviets have fundamental differences in their philosophies of environmental control. In the U.S., the concept of risk/benefit criterion has been accepted, involving the use of an adequate safety margin below a known threshold of hazard. On the other hand, the Soviets consider a pollutant as any perceptible change in the environment.”

I wish I could report that this attitude has carried over to modern-day Russia, but I have to admit that, while on paper they still maintain the same standard [4], money talks and the exposure there is as high as it is everywhere else.

So how does our story continue?

To be continued…




[1]          P. Bandara and D. O. Carpenter, “Planetary electromagnetic pollution: it is time to assess its impact,” The Lancet Planetary Health, vol. 2, no. 12, pp. e512–e514, Dec. 2018, doi: 10.1016/S2542-5196(18)30221-3.

[2]          P. Bandara et al., “Serious Safety Concerns about 5G Wireless Deployment in Australia and New Zealand,” Radiation Protection in Australasia, vol. 37, pp. 47–54, May 2020.

[3]          “The International EMF Project,” WHO (accessed Feb. 13, 2021).

[4]          T. Wu, T. S. Rappaport, and C. M. Collins, “Safe for Generations to Come,” IEEE Microw Mag, vol. 16, no. 2, pp. 65–84, Mar. 2015, doi: 10.1109/MMM.2014.2377587.

[5]          International Commission on Non-Ionizing Radiation Protection (ICNIRP), “Principles for Non-Ionizing Radiation Protection,” Health Physics, vol. 118, no. 5, pp. 477–482, May 2020, doi: 10.1097/HP.0000000000001252.

[6]          Y. Grigoriev, “Electromagnetic fields and the public: EMF standards and estimation of risk,” IOP Conf. Ser.: Earth Environ. Sci., vol. 10, p. 012003, Apr. 2010, doi: 10.1088/1755-1315/10/1/012003.

[7]          C. L. Russell, “5 G wireless telecommunications expansion: Public health and environmental implications,” Environmental Research, vol. 165, pp. 484–495, Aug. 2018, doi: 10.1016/j.envres.2018.01.016.

[8]          D. Belpomme, L. Hardell, I. Belyaev, E. Burgio, and D. O. Carpenter, “Thermal and non-thermal health effects of low intensity non-ionizing radiation: An international perspective,” Environmental Pollution, vol. 242, pp. 643–658, Nov. 2018, doi: 10.1016/j.envpol.2018.07.019.

[9]          M. Durdik et al., “Microwaves from mobile phone induce reactive oxygen species but not DNA damage, preleukemic fusion genes and apoptosis in hematopoietic stem/progenitor cells,” Sci Rep, vol. 9, no. 1, p. 16182, Nov. 2019, doi: 10.1038/s41598-019-52389-x.

[10]        V. Leach, S. Weller, and M. Redmayne, “A novel database of bio-effects from non-ionizing radiation,” Reviews on Environmental Health, vol. 33, no. 3, pp. 273–280, Sep. 2018, doi: 10.1515/reveh-2018-0017.

[11]        R. N. Kostoff, P. Heroux, M. Aschner, and A. Tsatsakis, “Adverse health effects of 5G mobile networking technology under real-life conditions,” Toxicology Letters, vol. 323, pp. 35–40, May 2020, doi: 10.1016/j.toxlet.2020.01.020.

[12]        A. Chowdhury, Y. Singh, U. Das, D. Waghmare, R. Dasgupta, and S. K. Majumder, “Effects of mobile phone emissions on human red blood cells,” Journal of Biophotonics, p. e202100047.

[13]        N. H. Steneck, H. J. Cook, A. J. Vander, and G. L. Kane, “The origins of U.S. safety standards for microwave radiation,” Science, vol. 208, no. 4449, pp. 1230–1237, Jun. 1980, doi: 10.1126/science.6990492.

[14]        W. W. Mumford, “Some Technical Aspects of Microwave Radiation Hazards,” Proceedings of the IRE, vol. 49, no. 2, Art. no. 2, 1961, doi: 10.1109/JRPROC.1961.287804.

[15]        10th Annual National Conference on Radiation Control: A Decade of Progress : April 30 – May 4, 1978 Harrisburg, Pennsylvania. U.S. Department of Health, Education, and Welfare, Public Health Service, Food and Drug Administration, Bureau of Radiological Health, 1979.

[16]        M. Repacholi, Y. Grigoriev, J. Buschmann, and C. Pioli, “Scientific basis for the Soviet and Russian radiofrequency standards for the general public,” Bioelectromagnetics, vol. 33, no. 8, Art. no. 8, Dec. 2012, doi: 10.1002/bem.21742.

[17]        L. David, “Study of federal microwave standards,” PRC Energy Analysis Co., McLean, VA (USA), DOE/ER/10041-02, Aug. 1980. doi: 10.2172/5021571.

About the Author
Originally from the UK, I made Aliyah 36 years ago. I am an academic staff member of the Physics Department of Ariel University, married with 3 children. I have authored 80 publications in various fields of physics and chemistry. One of the subjects I specialize in is the interaction of human skin with high-frequency radio waves. I am also a scientific advisor for the Environmental Health Trust.