By Wade Allison, professor of physics at Oxford University. Written 20 September 2022
Though an ideal energy source, nuclear made an unfortunate entry into world affairs. Accompanied by frightening tales of destruction, it failed early on to gain the confidence required of a leading contributor to future human prosperity. Are radioactivity and nuclear radiation particularly dangerous? They have been wielded as a political weapon for 70 years. But does the myth of a possible radiation holocaust have objective substance? The inhibition that surrounds nuclear radiation obstructs the optimum solution to the real dangers of today – climate change, the supply of water, food and energy, and socio-economic stability.
Primary Energy Sources
By studying the natural world, humans have succeeded where other creatures failed. Satisfying our needs depends on understanding the benefits that nature offers. In particular, the study of energy and the acceptance by society of improved sources have been critical to the prospects of the human race several times in the past. The first occasion was prehistoric, perhaps 600,000 years ago, when fire was domesticated. Confidence and good practice spread through the use of speech and education. Then came the harnessing of sunshine and the weather, delivered by windmills, watermills and the growth of food and vegetation. Nevertheless, these energy supplies were weak and notoriously unreliable. Additional energy was routinely provided by slave labour and teams of animals. Generally though, life was short and miserable.
The use of fossil fuels and their reliable engines began in the 18th Century and displaced the use of intermittent sources. Life was transformed for those who had the fuels. Health, sport, holidays, leisure and human rights flourished, all previously unavailable. Political affairs were largely concerned with which people had access to fossil fuels. Though fossil fuels were never safe or environmentally benign, their combustion probably triggered, if not caused, changes to the climate. Consequently, the decision was taken in Paris in 2015 to discontinue their use. What should replace them? And how may we live in a climate that is unlikely ever to revert to the way it was?
Fortunately, natural science today has a firm and complete account of energy – that is apart from one or two intriguing cosmological goings-on such as “dark matter”. Secondary sources, such as hydrogen, ammonia, batteries, electricity and biofuels, are beside the point, because they need to be generated from some primary source, and it’s the latter we need to secure. The weak, unreliable and weather-dependent primary sources that failed previously continue to be inadequate. Without fossil fuels, that leaves only one widely available source, sufficient to support the continuation of society as we know it, namely nuclear energy. It ticks every box, except that many know little about it and are wary of it.
One who learnt early was Winston Churchill. In 1931 he wrote prophetically in the Strand Magazine that nuclear energy is a million times that of the fuel that powered the Industrial Revolution.
Both chemical and nuclear energy can be released explosively. Unfortunately, it was as a weapon that many in society first heard about nuclear energy. Released in anger at Hiroshima and Nagasaki in 1945, the combination of blast and fire produced was fatal to the majority of inhabitants within a mile or two. Those much further away were not affected, nor were those who came to the site weeks afterwards. The result of the nuclear bombs was similar to the destruction by conventional explosives and fire storm in WWII of Tokyo, Hamburg and Dresden – or by explosives in recent years of Chechnya, Aleppo and Mariupol – except that it came from a single device.
It comes as a surprise to many people that nuclear radiation makes no major contribution to the mortality of a nuclear explosion, even in later years. That is not what they have been told. What is the truth and why has it remained hidden?
Is Radiation a Danger to Life?
A great deal has been learnt about the effect of radiation on life in the past 120 years. When nuclear radiation was discovered by Marie Curie and others in the last years of the 19th Century, they took great care to study its effect on life. Shortly thereafter, high doses were used successfully to cure patients of cancer, as they still are today. Millions of people have reason to be thankful as a result.
As with any new technology, much was learnt from accidents and mistakes in the early days. But by 1934 international agreement had been reached on the scale of a safe radiation dose, 0.2 roentgen per day – in modern units, about 2 milli-Gray (or milli-Sievert) per day. In 1980 Lauriston Taylor (1902-2004), the doyen of radiation health physicists, affirmed that “nobody has been identifiably injured by a lesser dose” – a statement that remains true today.
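The 1934 figure can be checked with a rough unit conversion. The sketch below assumes a conversion factor of roughly 9–10 milli-Gray per roentgen in soft tissue – an approximation not stated in the article, and dependent on the medium and photon energy:

```python
# Rough conversion of the 1934 tolerance dose (0.2 roentgen per day)
# into modern absorbed-dose units. The factor ~9.3 mGy per roentgen
# (soft tissue) is an assumed approximation.
MGY_PER_ROENTGEN = 9.3
tolerance_r_per_day = 0.2

dose_mgy_per_day = tolerance_r_per_day * MGY_PER_ROENTGEN
print(round(dose_mgy_per_day, 1))  # 1.9, i.e. ~2 milli-Gray per day
```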
At first sight it is strange that ionising radiation, with its energy easily sufficient to break the critical molecules of life, should be harmless in low and moderate doses. And it does indeed break such molecules indiscriminately, but living tissue fights back because it has evolved the ability to do so. In early epochs the natural radiation environment on Earth was more intense than today. Life would have died out long ago if it had not developed multiple layers of defence. These act within hours or days by repairing and replacing molecules, and whole cells too. Control of these mechanisms was devolved to the cellular level long ago, and it is a mistake for human regulations to try to micromanage the protection already provided by nature. So, although the details of natural protection and its workings are still being discovered today, the effectiveness of the safety it provides was known and agreed already in 1934.
But then in the mid-1950s, in spite of initiatives like “Atoms for Peace” by President Eisenhower, human society lost its nerve about nuclear energy and its radiation. What went wrong?
When fear hid the benefits of nuclear and its radiation
Few today are old enough to remember those days, as I do. The 1950s was an unpleasant time, with military threats abroad and spying, secrecy and mistrust at home. In the USA it was the era of Senator Joseph McCarthy, when all manner of innocent people were accused of being communist sympathisers or Soviet agents. Suspicion was everywhere. Already, following the nuclear bombing of Hiroshima and Nagasaki, knowledge of nuclear radiation was seen as a “no-go” area, supposedly too difficult to understand and beyond the educational paygrade of ordinary people.

After the War a vast employment structure, the military-industrial complex, continued to develop, test and stockpile nuclear weapons, to the horror of large sections of the populace worldwide. They were supported in their concern by many scientists, including Albert Einstein, Robert Oppenheimer, Andrei Sakharov and many Nobel Laureates. Whether they were knowledgeable in radiobiology or not – and few were – they did not trust the judgement of the military and political authorities with this new energy and its million-fold increase in power. Everybody was frightened that the power might fall into foreign hands or be used irresponsibly by allies. This fear increased after 1949, when the Soviet Union detonated its first nuclear device. As the years went by, ever larger popular marches and political demonstrations attempted to halt the nuclear Arms Race with the USSR, frequently alarming civil authorities with their threats to law and order.
This civil disturbance had more success in stopping the Arms Race when it focused on the biological effects of nuclear radiation. Few in the military-industrial complex knew much about these – they were mostly engineers and physical and mathematical scientists. In truth, few other scientists did either, and in the absence of data they were easily alarmed. The concern was that irreparable radiation damage incurred by the human genome might be transmitted to subsequent generations. Such a prediction was made by Hermann Muller, a Nobel Prize-winning geneticist – without any evidence. A ghoulish spectre of deformed descendants was eagerly adopted by the media as real. The popular magazine Life, dated May 1955, page 37, explicitly quoted Muller, saying “atomic war may cause” such hereditary damage (emphasis added). The qualification “may” was lost on the media and general public – the horror was seen as just too awful. It was widely taken as likely to be true by academic opinion, too, as there was no evidence to deny it.
Significantly, it is not difficult to detect levels of radiation exposure many thousands of times lower than the level accepted as safe in 1934. Anxious to quell popular pressure, regulatory authorities acceded to a regime in which life should be spared any radiation exposure above a level As Low As Reasonably Achievable (ALARA). For the public, the advised limit was set at 1 milli-Sievert per year, a modest fraction of the typical natural background received from rocks and space. National regulatory authorities, concerned to protect themselves from liability, readily adopted the advice of the International Commission on Radiological Protection (ICRP).
These regulations are based, not on evidence, but on a philosophy of caution, namely that any exposure to radiation is harmful and that all such damage accumulates throughout life – in denial of the natural protection provided by evolution. A discredited ad hoc theory of risk, the Linear No Threshold model (LNT)[9,10], supplanted the Threshold Model of 1934 at the behest of the BEAR Committee of the US National Academy of Sciences in 1956.
Such excessive caution incurs huge extra costs. Worse, adherence to ALARA/LNT regulations has caused serious social and environmental damage – for instance, in the response to the accidents at Chernobyl and Fukushima Daiichi. International bodies and committees, unlike individuals, stick rigidly to their terms of reference. So, the ICRP still supports ALARA/LNT today and advocates protection which is not necessary – except in extreme cases.
What about these extreme cases? Muller supposed that an exposure to radiation can alter a person’s genetic code and that this error can then be passed on to offspring. But the medical records of the survivors of Hiroshima and Nagasaki, their children and grandchildren never supported this. As a result, nobody today maintains that there is any evidence for such inheritable genetic changes. This is confirmed in animal experiments, and was accepted even by the ICRP in 2007 – to be precise, they lowered their estimated genetic risk factor by an order of magnitude. So Muller was wrong. Incidentally, he was also wrong about the evidence for which he received the Nobel Prize in 1946.
Dedicated to protecting people against radiological damage, the ICRP focused on the induction of cancer by radiation instead of inheritable genetic defects. The medical history of 87,000 survivors of Hiroshima and Nagasaki, along with their children, has been followed since 1950. Data on solid cancers and leukaemia over 50 years, and their correlation with individually estimated exposures, have been published by DL Preston et al (Tables 3 and 7). Inevitably, some survivors died from these diseases anyway, but their numbers are allowed for by comparing with distant residents who were too far away to receive any dose. Some 68,000 survivors received a dose of less than 100 milli-Sievert, and these showed no evidence of extra cancers. Altogether, between 1950 and 2000 there were 10,127 deaths from solid cancers and 296 from leukaemia – 480 and 93, respectively, more than expected on the basis of data for those not irradiated. This number of extra deaths, 573, is significant, but it is less than half a percent of those who died from the blast and fire. Furthermore, it is only a third of the number of deaths reported as caused by the unnecessary and ill-judged evacuation at Fukushima Daiichi, an accident in which nobody died from radiation, or is likely to. Evidently, the fear of radiation can be far more life-threatening than its actual effect, even as recorded in the bombing of two large cities. This conclusion in no way belittles the enormous loss of life from the blast and fire of a nuclear explosion, with its localised range and limited duration.
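The arithmetic behind these comparisons can be laid out explicitly. A minimal sketch, using only the figures quoted in this article (the three-to-one ratio with the Fukushima evacuation deaths is taken from the text; no external figures are assumed):

```python
# Excess cancer deaths among Hiroshima/Nagasaki survivors, 1950-2000,
# using the figures quoted above (Preston et al, Tables 3 and 7).
excess_solid = 480        # excess solid-cancer deaths
excess_leukaemia = 93     # excess leukaemia deaths
excess_total = excess_solid + excess_leukaemia
print(excess_total)       # 573

# Excess deaths as a share of all solid-cancer and leukaemia deaths
all_cancer_deaths = 10127 + 296
print(round(excess_total / all_cancer_deaths * 100, 1))  # 5.5 (percent)

# If 573 is one third of the deaths attributed to the Fukushima
# evacuation, the implied evacuation toll is:
print(3 * excess_total)   # 1719
```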
But it is important to check that all available evidence corroborates this conclusion. How are other biological risks checked? A new vaccine is checked with blind tests in which patients are unaware of whether they have been treated or given a placebo. In similar studies with radiation on groups of animals, one group is irradiated every day throughout life and the other is not. Those irradiated daily show a threshold of about 2 milli-Sievert per day for additional cancer death or other life-shortening disease, similar to the threshold set in 1934. In fact, doses below the threshold increase life expectancy, and the same is found for humans.
At Chernobyl 28 fire fighters died of acute radiation syndrome in a short time, 27 from doses above 4000 milli-Sievert and 1 from a dose between 2000 and 4000 milli-Sievert. There were 15 deaths from thyroid cancer (but opinion is divided on these). Other cases of ill health were related to severe social and mental disturbance. Being told “you have been irradiated and are being evacuated immediately” is disorientating. Like Voodoo or a mediaeval curse, it can be life-threatening. Notably, the wild animals in the Chernobyl Exclusion Zone are thriving, as seen on wildlife programmes[19, 20] – but then they have not been shown videos on the horrors of radiation!
An important question is how human society has persisted with such a gross misperception for seventy years. Entertainment, courage and excitement are important emotional exercises that prepare us to face real dangers, although there is a need to distinguish fact from fiction. The Placebo Effect describes the genuine health benefits found by patients who think they have been treated when they have not. The Nocebo Effect is its inverse, that is, where people who have not been harmed suffer real symptoms as if they had. In the aftermath of the Fukushima accident, families endured terrible suffering, including break-up and alcoholism – a direct consequence of regulations based on ALARA and LNT. If the regulations had been based on the 1934 threshold, no evacuation longer than a week would have been justified.
The nuclear option for generations to come
Evidently, committees that advocate regulation based on ALARA/LNT are harmful and should be disbanded. Future generations should be free to make informed decisions involving nuclear energy, in peace or war, unencumbered by the erroneous legacy of the 1950s.
In years to come, when reference is made to the “nuclear option” in other contexts, we may hope that it will be shorthand for “the best solution”. In medicine this is nearly true now. During a course of radiotherapy the healthy tissue close to a tumour receives a high dose – about 1000 milli-Gray, every weekday for several weeks. By spreading the treatment over many days, this healthy tissue just recovers, and radiologists ensure that this huge dose seldom causes a secondary cancer. This would be a disastrous strategy according to LNT – in six weeks or so, the equivalent of about 30,000 years at the precautionary dose limit of 1 milli-Sievert per year!
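That closing comparison is simple arithmetic, sketched below with the figures given in the text (treating 1 milli-Gray as roughly 1 milli-Sievert for this kind of radiation, a simplifying assumption):

```python
# Cumulative dose to healthy tissue over a typical radiotherapy course,
# using the figures above: ~1000 milli-Gray per weekday for ~6 weeks.
dose_per_session = 1000        # milli-Gray, taken here as ~milli-Sievert
sessions_per_week = 5
weeks = 6

total_dose = dose_per_session * sessions_per_week * weeks
print(total_dose)              # 30000 mSv over ~30 sessions

# Equivalent time at the public precautionary limit of 1 mSv per year
public_limit_per_year = 1      # milli-Sievert per year
print(total_dose // public_limit_per_year)  # 30000 years
```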
In future we should not allow ourselves to be blackmailed by fear of the radiation from a nuclear weapon. That may have terrified our parents, but we should ensure that our children understand that radiation is dangerous only in the immediate vicinity of a nuclear detonation where death is caused by the blast and fire. At school all teenagers should study natural science and understand how nuclear energy compares with other sources, for safety, availability, reliability, security and preservation of the environment. Then they should go home and reassure their parents.
Professor Wade Allison, Oxford, United Kingdom, 20 September 2022
Finding sufficient energy is essential to all life. Humans have excelled at this, notably when they studied and overcame their innate fear of fire some 600,000 years ago. Until the Industrial Revolution they made do with energy derived, directly or indirectly, from the daily sunshine that drives waterpower, the wind and other manifestations, including the production of vegetation and food. But, although better than for other creatures, human life was short and miserable for the population at large. The causes were the anaemic strength of the Sun’s rays, averaging 340 watts per square meter, and their random interruption by unpredictable weather.
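The 340 watts per square meter figure follows from simple geometry. A minimal check, assuming the standard solar-constant value of about 1361 W/m² (a figure not stated in the article):

```python
# Average sunlight at the top of the atmosphere: the Earth intercepts
# the solar constant over its cross-section (pi * r^2) but spreads it
# over its full rotating surface (4 * pi * r^2), hence a factor of 4.
SOLAR_CONSTANT = 1361.0   # W/m^2, approximate modern value (assumed)

average_insolation = SOLAR_CONSTANT / 4
print(round(average_insolation))  # 340 W/m^2
```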
With fossil fuels, available energy increased, anywhere at any time. Life expectancy doubled and the world population quadrupled. For 200 years whoever had access to fossil fuels had world power. However, at the 2015 Paris Conference nations agreed that the emission of carbon posed an existential threat and that, sooner rather than later, this should cease.
Technology may be challenging and exciting, but it cannot deliver energy where none exists, today as in pre-industrial times. Writing in 1867, Karl Marx dismissed wind power as “too inconstant and uncontrollable”. He saw waterpower as better, but “as the predominant power [it] was beset with difficulties”. Today, the vast size of hydro, wind and solar plants relative to their power output reflects their weakness and their destructive impact on flora and fauna – a point often curiously ignored by environmentalists.
If renewables are simply inadequate and fossil fuel emissions only accelerate climate change further, what abundant primary energy source might permit political and economic stability for the next 200 years? Natural science can say without doubt that the only answer is nuclear.
In 1931, Winston Churchill wrote: “The coal a man can get in a day can easily do 500 times as much work as the man himself. Nuclear energy is at least one million times more powerful still… There is no question among scientists that this gigantic source of energy exists. What is lacking is the match to set the bonfire alight… The discovery and control of such sources of power would cause changes in human affairs incomparably greater than those produced by the steam-engine four generations ago.”
He was right, but this transition requires adequate public education. In recovering from World War Two and its aftermath, the world lost confidence and demonised nuclear energy. This denial of an exceptional benefit to society has persisted for 70 years, sustained by bogus scientific claims about radiation and by oil interests. But, aside from the blast of a nuclear explosion, nuclear energy and its radiation are safer than the combustion of fossil fuels, as confirmed by evidence from Hiroshima and Nagasaki, Chernobyl, and Fukushima. Furthermore, nuclear applications in medicine pioneered by Marie Curie (such as the use of radiation to treat cancerous tumours) have been widely appreciated for 120 years.
Regulation around nuclear needs to be commensurate with actual risk, and it should be financed appropriately, with richer nations covering the costs for developing countries.
Fully informed, everybody should welcome the security of small, mass-produced, cheap, local nuclear energy plants dedicated to serving modest-sized communities for 80 years with on-demand electricity, off-peak hydrogen, fertiliser, industrial heat, and seasonless farming.
The only real challenges are in building a new generation with the relevant scientific knowledge and skills, and instilling public confidence.