July 2022 Newsletter

From the Editor

Oriol T. Valls, the current Physics and Society Newsletter Editor, is a Condensed Matter theorist at the University of Minnesota.

Continuing the policy of publishing articles by Forum award winners and invited speakers, I am pleased to announce that this issue includes an article by our latest Szilard winner, Michael Mann, and one by Sakharov Prize winner Frank von Hippel. The topic of the latter article is, fittingly, Sakharov himself. I hope to have an article by the latest Burton award winner (Jaffe) in the next issue.

I am very happy to note that the contents of the April issue elicited a large amount of feedback! I had expected to get some from my comments about arXiv, and I did get a few, although mostly unpublishable, but there were many more on Ma’s article about fusion. You can see that reflected in several Letters to the Editor, and also in some of the articles.

This is a very good thing: in every issue I emphasize that I welcome controversy. Controversy is good; light and truth emerge from argument. My only restriction is that ad hominem arguments are not acceptable. May the inflow continue.

Once more: the contents of this newsletter are largely reader driven. Please send your contributions and your suggestions. All topics related to Physics and Society, very broadly understood, are welcome. No pertinent controversial subject needs to be avoided. Content is not peer reviewed, and opinions given are the author’s only: not necessarily mine, nor the Forum’s, nor, a fortiori, the APS’s. Letters to the Editor for publication are also welcome. The APS production people prefer MSWord formats. Book reviews should be sent to the reviews editor directly (ahobson@uark.edu). Everything else goes to me.

Oriol T. Valls                                                                                                                                          
University of Minnesota                                                                                                                             


Forum News

Reviews Editor Wanted

Art Hobson is retiring as our Reviews Editor, after many years of outstanding service. A volunteer is needed to replace him.

The Reviews Editor (RE) is in charge of the Book Reviews section of this Newsletter. The Editor has traditionally given the RE complete autonomy as to which books to review and which reviews to publish. This policy will continue. In general, reviews are written by individual volunteer reviewers, selected by the RE, who also chooses the books from among suitable new publications. The list of active reviewers (there are about 25) will be made available to the new RE. Free review copies are obtained from the publishers, who are glad to see their books reviewed in a publication sent to 5,000-6,000 physicists.

The main qualities needed are love of books, reasonable ability to write and edit, and the consistency to maintain a regular flow of about two books per issue of P&S. Please look at past issues for style.

If you have these qualities, you will find the work very rewarding, and not too onerous on your time. Please email the Editor at otvalls@umn.edu to volunteer and to ask for any additional information.

The new RE would start with the January 2023 issue. Art will cooperate in every respect to help the new RE get started.


Letters to the Editor

Dear Editor,

Re: ‘Fostering a New Era in Inertial Confinement Fusion Research’ by Tammy Ma, April 2022 APS Forum on Physics and Society Newsletter.

The one-time 1.35 MJ yield on NIF is indeed a commendable technical achievement, albeit irreproducible so far. However, the perils of nuclear weapons have come into sharper focus recently. The Ukraine situation shows the utter folly of devoting huge resources to providing politicians with better H-bombs so they can more easily turn the planet into a radioactive wasteland!

Re: the energy application, the gap between NIF and practical Inertial Fusion Energy seems as daunting as ever. Omar Hurricane, the leader of the LLNL team, has stated: “… there is zero net energy gain as compared to the electricity we pulled to do the experiments. This is one of the reasons why I view our scheme as not being practical for energy production.” [1] This impracticality is actually a good thing, as a cursory reading of the world political situation in any daily newspaper shows that a potential global proliferation of inertial fusion power stations and associated inertial fusion technology would imperil the future of mankind! There is a good reason that the NIF program remains classified.

In the face of rising sea levels, uncontrolled wildfires, rising temperatures and the devastation of the biosphere, our urgent priority must be to find ways to avert a climate catastrophe in the next decades. For example, generating green hydrogen from abundant wind and solar installations seems a far more promising way to provide 24/7 power than fusion ‘fairy tales’ [2]. I urge our funding agencies and scientific talent to vigorously foster and research climate solutions and not be taken in by a ‘greenwashing’ program to further develop nuclear weapons.

Charles Skinner
Senior Scientist, Princeton Plasma Physics Laboratory
(Opinions expressed are solely the author's)


[1] https://www.cnbc.com/2021/08/17/lawrence-livermore-lab-makes-significant-achievement-in-fusion.html

[2] The Fairy Tale of Nuclear Fusion L. J. Reinders, Springer Nature Switzerland AG, 2021, 621 pages, ISBN 978-3-030-64344-7


Dear Editor,

Tammy Ma had an article in the April 2022 edition of FPS advocating that, with the success of the recent LLNL/NIF experiments [1], a program should be set up in the Department of Energy for inertial fusion energy production, instead of only stockpile stewardship. This author heartily endorses such a step.

As some readers of FPS may know, this author has advocated fusion breeding as an alternative to what he has called ‘pure fusion’ [2,3,4]. Fusion breeding is the use of 14 MeV fusion neutrons to breed fuel for thermal nuclear reactors. While there are fission options for breeding, and they have a much shorter development path, fusion breeding has important advantages. It takes two fast neutron reactors like the IFR (integral fast reactor) to fuel a single LWR (light water reactor) of equal power. But a single fusion breeder could fuel 5 or 10.

Fusion breeding can occur using either magnetic or inertial fusion [2]. However, after 2014, this author’s series of articles (cited in [3] and [4]) has, until recently, concentrated only on magnetic fusion, as only it had produced significant 14 MeV neutrons in a DT plasma. But given the recent LLNL/NIF results, that seems to be changing.

There are two ways of looking at these results. The negative view is to emphasize that the experiment achieved only ~7% of the gain of 10 that the lab promised in 2004 and 2011 [5,6], that it came more than a decade late and more than $10B over budget, and that the next few shots have not been able to repeat it. However, the positive view, which this author certainly shares, is that it is an extraordinary scientific achievement. Among other things, it showed that as the imploded plasma expanded, it heated, even though the laser was already off, thereby proving an alpha-heated plasma. In other words, NIF produced a burning plasma nearly two decades before ITER hopes to do so. In a short time, Ref. [1] has been downloaded ~80 thousand times!
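The “~7% of the promised gain” figure is simple arithmetic; the sketch below uses the ~1.35 MJ yield reported in [1] together with an assumed ~1.9 MJ of delivered laser energy (the laser energy is an assumption of this illustration, not a figure from the letter):

```python
# Back-of-envelope check of the "~7% of the promised gain" figure.
yield_mj = 1.35          # fusion energy out, from Ref. [1]
laser_mj = 1.9           # delivered laser energy (assumed)
promised_gain = 10       # gain projected in the 2004 and 2011 point designs

gain = yield_mj / laser_mj        # target gain Q, about 0.71
fraction = gain / promised_gain   # about 0.07, i.e. ~7% of the promise

print(f"Q = {gain:.2f}, about {100 * fraction:.0f}% of the promised gain of 10")
```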

It is worth providing quotes from Refs. [2] and [3], with a few hindsight comments interpolated in brackets or parentheses. The first and second quotes are from [2]; these describe the situation as I saw it in 2014:

Despite the natural advantages of inertial over magnetic fusion pointed out earlier, magnetic fusion is still way ahead of inertial fusion in neutrons produced and gain achieved. MFE certainly has not had a disaster of this magnitude (namely missing its calculated gain by 3 orders of magnitude) in an expensive machine. It serves nobody’s interest to deny this or to attempt to sweep it under the rug. Inertial fusion’s credibility today is not exactly riding high. Surely congress will never approve another multi billion (or even multi million) dollar inertial fusion machine until NIF achieves some measure of success. To be blind to this is simply to live in a dream world. As regards inertial fusion, NIF is what we’ve got, it is all we’ve got, and it is all we will have for quite some time. The only reasonable goal now is to get it to work, assuming congress does not get annoyed and pull the plug. Hopefully this will not happen.

The second quote, also from [2], is a brief summary (with much more detail in [2]) of one possible approach to getting a significant achievement from NIF:

Also, direct drive gain calculations show impressive gains (of ~ 100) at half a megajoule laser energy. NIF has nearly 4 times this. Hence there is a very large margin for error both regarding the laser energy and the gain calculations. Let’s say NIF does a symmetric direct drive experiment and gets a gain of ‘only’ 10. (If I had it to do over again, I would have said ‘only’ one or two.) Wouldn’t this be a tremendous accomplishment? It might be just 2 or 3 years away. Might it not encourage Congress to build the ideal laser or lasers Bodner suggests?

Here is a more contemporary comment, from [3], this year:

Except for [Manheimer 2014] the publications just cited did not consider inertial fusion. It had simply never achieved the neutron production magnetic fusion has. However, the recent Lawrence Livermore National Lab (LLNL) results achieved about 1.3 MJ of fusion products, produced by about 1.7 MJ of laser energy, for a Q of ~0.7, about the same as the TFTR and JET DT plasma experiments [Callahan, 2015 (whoops, obviously I meant 2021)]. However, the laser driver has a much lower efficiency than the beam and microwave drivers for the tokamak. Reference [10, i.e., Ref. 2 here] did discuss a possible path for laser fusion, and now that the LLNL results are in, it could be a time to reconsider inertial fusion. After all, there are very many mega-amp-class tokamaks, but very few megajoule-class lasers. Perhaps it is time for a few more.

Yes, it is time to get serious about IFE for energy, i.e., fusion breeding and pure fusion, not just stockpile stewardship. It is time to build other hundred-kilojoule, and even megajoule, lasers: lasers with the bandwidth, efficiency, and rep-rate capability more relevant to IFE energy production.

Wallace Manheimer
Retired from NRL


[1] Zylstra, A.B., Hurricane, O.A., Callahan, D.A. et al. Burning plasma achieved in inertial fusion. Nature 601, 542–548 (2022). https://doi.org/10.1038/s41586-021-04281-w

[2] Manheimer, W. (2014). Fusion breeding for mid-century sustainable power. J. Fusion Energy, 33, 199–234. https://doi.org/10.1007/s10894-014-9690-9

[3] Manheimer, W. (2022). Civilization needs sustainable energy – fusion breeding might be best. Journal of Sustainable Development, 15, 98. https://ccsenet.org/journal/index.php/jsd/article/view/0/46729

[4] Manheimer, W. Magnetic fusion is tough – if not impossible – fusion breeding is much easier. Forum on Physics and Society, July 2021, https://higherlogicdownload.s3.amazonaws.com/APS/04c6f478-b2af-44c6-97f0-6857df5439a6/UploadedImages/P_S_JLY21.pdf

[5] J. Lindl et al., The physics basis for ignition using indirect-drive targets on the National Ignition Facility. Phys. Plasmas 11, 329 (2004)

[6] S.W. Haan et al., Point design targets, specifications, and requirements for the 2010 ignition campaign on the National Ignition Facility. Phys. Plasmas 18, 051001 (2011)


Dear Editor,

Your description of the controversy regarding redacted papers in the arXiv system (P&S, April 2022) reminded me of an experience of my own with that system, albeit not one as dramatic as the story you relate. Early last year I submitted a paper to the History and Philosophy of Physics area of arXiv. This was a very conventional pedagogical piece on how one can use basic concepts of cross-sections and scattering to reverse-understand why the dimensions of the fuel elements, the lattice spacing, and the overall size of Enrico Fermi’s CP-1 nuclear pile were chosen to be what they were. Upon submission the paper was put on hold, and a few days later I received a message that it was deemed to be of insufficient interest (or something like that) to warrant posting. The paper subsequently underwent conventional review and was published in the European Journal of Physics; I was recently pleased to see that it has garnered some 60 downloads [1].

As Dr. Valls points out, the whole point of arXiv is to be an open, non-peer-reviewed environment. Overall, it is a fantastic resource which has helped to democratize the dissemination of scientific information, and hardly a day goes by that I do not check to see what has been posted. But the arXiv system of moderators and endorsers seems to go against the open model and can be frustratingly opaque when one ends up on the wrong end of it.

B. Cameron Reed
Department of Physics (Emeritus)
Alma College
Alma, MI 48801


[1] B. Cameron Reed, “Estimating the size of Fermi’s CP-1 nuclear pile: a classroom approach,” Eur. J. Phys. 42 (2021) 055801 (12pp)



Reflections on Leo Szilard, The Fragility of Truth and the Role of the Scientist in the Public Sphere

Michael E. Mann

Leo Szilard was both a great physicist and a remarkable human being. He used his brilliant mind not only to solve fundamental problems in physics but to fight back against a rising tide of fascism in the 1930s. Nearly a century later, that battle is unfortunately not over.

Szilard founded the Council for a Livable World in 1962, just two years before he passed away. That battle too is unfortunately not over, as we deal simultaneously with the threats of an unprecedented pandemic, the renewed spectre of nuclear conflict, and the existential climate crisis.

I'm proud to have started out in the same sub-field as Szilard, statistical physics. Depending on your perspective I either saw the light, or lost my way. Mid-PhD I set out on a self-avoiding random walk, and ended up applying my math and physics training to the problem of climate modeling. Ultimately my forays would lead me to publish the now iconic "hockey stick" curve, and I would find myself at the very center of the fractious debate over human-caused climate change.

Despite the bruising battles as I've sought to defend the science of climate change, including my own work, from attacks by vested interests aiming to discredit it, I consider myself privileged to be in a position to influence the public discourse over the greatest challenge we face as a civilization.

Having received the APS Szilard Award this year, I'm truly humbled by the list of past recipients, including heroes of mine such as Sherry Rowland, Andrei Sakharov, fellow climate scientist James Hansen, and last, but surely not least, the great Carl Sagan.

More than two decades ago Sagan, perhaps the leading science communicator of his generation, warned in his masterful tome “The Demon-Haunted World” of a time in his children’s or grandchildren’s future when “awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority”; when, “our critical faculties in decline…we slide, almost without noticing, back into superstition and darkness”.

He feared a “dumbing down…most evident in the slow decay of substantive content in the enormously influential media, the 30 second sound bites (now down to 10 seconds or less), lowest common denominator programming, credulous presentations on pseudoscience and superstition, but especially a kind of celebration of ignorance”.

Sagan’s daughter Sasha—a gifted thinker and author in her own right—gave birth to his granddaughter just a few years ago. And so his first grandchild was indeed born into the very era of “alternative facts” and “fake news” he presaged.

We live at a time when social media has increasingly been weaponized by bad actors to promote conspiracy theories and lies, and to undermine faith in governmental and civic institutions, our traditional media, expert opinion and, yes, scientists: indeed, anyone whose views or findings might prove inconvenient to their political, financial, or ideological agenda. That includes public health experts offering their advice on how to deal with an unprecedented pandemic and, of course, climate scientists offering our informed advice when it comes to the greatest challenge and threat of all: the climate crisis.

Our mission as scientists has never been more important than it is right now. We must step up and do battle in what is a genuinely Tolkienesque assault on science, reason, and fact-based discourse.

Our ongoing reliance on fossil fuels is in fact at the root of the twin battles we are fighting right now: a battle, on the one hand, to defend western democracy itself from a brutal, barbarous assault by an authoritarian petrostate; and, on the other hand, a battle to avert catastrophic climate change while there is still time. These battles have been made all the more challenging by rampant disinformation that has flooded our online information ecosystem with falsehoods and outright lies.

Efforts to attack and deny the scientific evidence for human-caused climate change have long constituted a major impediment to action. But, as I argue in my book “The New Climate War”, we appear now to be moving past outright denial of the basic science as the evidence becomes plain to the person on the street in the form of unprecedented heat waves, droughts, wildfires, floods and superstorms.

We still however face a multi-pronged strategy by polluters and their enablers in the media and pundit class to distract, deflect, attack, divide, and delay. Among their preferred tactics today is the promotion of risky, unproven strategies, such as geoengineering or massive carbon capture and sequestration, and the promise of future action as an excuse for business-as-usual fossil fuel burning today.

There is still time for us to avert the worst impacts of climate change if we act now, and act boldly, but there is no time left for dead ends, wrong turns and false solutions. We have the technology, in the form of renewable energy, storage technology, and efficiency and conservation measures. The only obstacles at this point aren’t the laws of physics, but the flaws in our politics.

Department of Meteorology & Atmospheric Science
Penn State University

Andrei Sakharov’s Contributions to Nuclear Arms Control

Frank N. von Hippel

A complete destruction of cities, industry, transport, and systems of education, a poisoning of fields, water, and air by radioactivity, a physical destruction of the larger part of mankind, poverty, barbarism, a return to savagery, and a genetic degeneracy of the survivors under the impact of radiation, a destruction of the material and information basis of civilization – this is a measure of the peril that threatens the world as a result of the estrangement of the world’s two superpowers.[1]

Andrei Sakharov originally became prominent within the Soviet nuclear-weapons program in the 1950s for his contribution, in a desperate race with the United States, to two of the three ideas key to the development of Soviet thermonuclear weapons.[2] The US program during World War II had been driven by the fear that Germany would develop nuclear weapons first; it turned out that there was no serious German nuclear weapon program. The Soviet-US nuclear arms race, by contrast, was real.

After the fully developed Soviet thermonuclear weapon was tested in 1955, Sakharov began to think about the implications of what his team and their American counterparts had achieved. 

Atmospheric test ban. Initially, Sakharov was concerned about the health impacts of the radioactivity created by the ongoing testing in the atmosphere of the bombs that both the Soviet Union and United States had developed with yields equivalent to millions of tons (megatons) of chemical explosive equivalent.  

Sakharov did not focus, however, on the fission products, such as strontium-90 and cesium-137, in local and global fallout that were the dominant concern in the public debate at the time. Instead, he focused on the huge amounts of carbon-14 being generated in the atmosphere by neutrons from the nuclear explosions replacing protons in nitrogen-14 nuclei, converting them to carbon-14. Carbon-14 decays back into nitrogen-14 with a 5730-year half-life.
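That 5730-year half-life implies that essentially all of the test-generated carbon-14 persists on human timescales; a minimal sketch of the decay law:

```python
import math

HALF_LIFE_YEARS = 5730.0  # carbon-14 half-life quoted above

def c14_fraction_remaining(t_years: float) -> float:
    """Fraction of an initial carbon-14 inventory left after t_years."""
    return math.exp(-math.log(2.0) * t_years / HALF_LIFE_YEARS)

# Even a full millennium after the tests, ~89% of the carbon-14 remains.
print(f"{c14_fraction_remaining(1000):.3f}")  # ~0.886
```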

Sakharov was concerned that the carbon-14 from atmospheric testing would pollute the biosphere for thousands of years and estimated that, for a steady-state global population, each megaton exploded in the atmosphere would yield a dose per person, summed over all generations, of 0.75 millirads.[3] Two decades later, the UN’s Scientific Committee on the Effects of Atomic Radiation came up with an estimate of 0.5 millirads/Mt.[4] It also estimated that, integrated over time, the dose from carbon-14 would dominate that from the fission products.

Sakharov’s concern may have stemmed in part from his knowledge of the Soviet Union’s plans for a series of huge atmospheric tests during 1961-62.  During those two years, the Soviet Union set off 338 megatons of tests in the atmosphere, 55% of the cumulative 545 megatons of atmospheric tests by the first five nuclear-weapon states from the first US test in 1945 till the last Chinese test in 1980.[5]

Assuming an equilibrium global population of ten billion, the latest National Academies cancer risk estimates[6] would lead to an estimate of 1.3 to 5.2 million cancers from the carbon-14 resulting from that atmospheric nuclear testing. With today’s treatments, about half of those cancers would be fatal. These consequences will be reduced somewhat by the dilution effect of the fossil carbon that humanity is dumping into the atmosphere.
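As an illustration of how an estimate of this magnitude arises (a back-of-envelope sketch, not the National Academies’ calculation: the two risk coefficients below are assumptions chosen to bracket, very roughly, the BEIR VII lifetime cancer risk per unit dose):

```python
# Collective carbon-14 dose from atmospheric testing, using figures
# quoted in the text plus assumed lifetime-risk coefficients.
MRAD_PER_MT = 0.5        # UNSCEAR dose per person per megaton, all generations
MEGATONS = 545           # cumulative atmospheric testing, 1945-1980
POPULATION = 10e9        # assumed equilibrium global population
SV_PER_MRAD = 1e-5       # 1 millirad = 1e-5 sievert (quality factor ~1)

collective_person_sv = MRAD_PER_MT * MEGATONS * SV_PER_MRAD * POPULATION
for risk_per_sv in (0.05, 0.19):   # assumed cancers per person-sievert
    cancers = collective_person_sv * risk_per_sv
    print(f"risk {risk_per_sv}/Sv -> {cancers / 1e6:.1f} million cancers")
```

With these assumptions the collective dose comes to about 2.7×10^7 person-sieverts, and the two bracketing risk coefficients give roughly 1.4 and 5.2 million cancers, close to the range quoted above.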

Because of his status as the superstar of the Soviet nuclear-weapon designers, Sakharov was able to take his concerns directly to the top. He did so in 1961 at a meeting on nuclear weapons with then Soviet leader Nikita Khrushchev. Sakharov argued that the Soviet Union had little to gain technically from the planned tests. As Sakharov recalled, Khrushchev responded with a tirade at the dinner after the meeting:

“Leave politics to us – we’re the specialists…we have to conduct our policies from a position of strength…Our opponents don’t understand any other language…Sakharov, don’t try to tell us what to do or how to behave.  We understand politics. I’d be a jellyfish and not Chairman of the Council of Ministers if I listened to people like Sakharov!”[7]

After the October 1962 Cuban Missile Crisis, however, the Soviet and US governments, having frightened themselves and the world by their near miss with Armageddon, became more serious about negotiating a Comprehensive Nuclear Test Ban.  When these negotiations bogged down over the question of whether sub-megaton underground explosions could be concealed, Sakharov suggested that the Soviet side accept a proposal that had been made by the US side during the Eisenhower administration: a ban on atmospheric explosions alone. In 1963, the negotiators agreed to ban nuclear explosions everywhere except underground.[8]

The Intermediate-range Nuclear Forces and Strategic Arms Reduction Treaties

In 1968, Sakharov ended his career as an insider with his remarkable book, Reflections on Progress, Peaceful Coexistence, and Intellectual Freedom, which was “published” by people typing copies as they read it. One of these copies was taken to the Netherlands. It was published by a Dutch newspaper and then in three full pages of the New York Times on 22 July 1968. More than 18 million copies of the book were published around the world.[9]

This marked the end of Sakharov’s life as a nuclear-weapon designer and the beginning of his life as a political activist and defender of other political activists in Moscow. Given his exalted status, this was tolerated by the Soviet leadership until they reached the breaking point in early 1980, when Sakharov criticized the Soviet 1979 invasion of Afghanistan publicly in an interview with a New York Times correspondent. Sakharov and his wife were shipped to the closed city of Gorky beyond the reach of foreign journalists and were only allowed back to Moscow seven years later at the end of 1986 in Mikhail Gorbachev’s second year as General Secretary of the Soviet Communist Party.

Sakharov’s return occurred just before the February 1987 Forum for a Nuclear-Free World and the Survival of Mankind that was organized by the physicist, Evgenyi Velikhov, one of Gorbachev’s arms control advisors.  The Forum included parallel conferences of scientists, writers, medical doctors, etc. Sakharov attended the scientists’ conference, which I organized with Velikhov, and his first public pronouncements after his seven years of imprisonment in Gorky were awaited with great public interest.

Sakharov’s highest-impact statements at the Forum related to an issue that was blocking the first Soviet-US agreements on deep cuts in their nuclear arsenals. Gorbachev and Reagan had met in Geneva in 1985 and had agreed in principle on cutting in half the number of warheads carried by their long-range “strategic” ballistic missiles and bombers.[10] But, despite the limitations on ballistic missile defense that the two countries had agreed to in their 1972 treaty limiting anti-ballistic-missile defenses (the ABM Treaty), Reagan refused to agree to limits on his Strategic Defense Initiative (SDI), and Gorbachev refused to agree to cuts in offensive weaponry in the absence of continuing limits on defense. The same impasse occurred again at the Reykjavik Summit in 1986. There, Gorbachev agreed to eliminate intermediate-range ballistic missiles entirely and reiterated his willingness to reduce deployed strategic warheads by half, but again on the condition that Reagan limit SDI.

Sakharov urged that Gorbachev drop the linkage.  He and his colleagues at the Soviet Union’s first nuclear-weapons laboratory had studied the feasibility of ballistic missile defense two decades earlier during discussions of what became the 1972 ABM Treaty and had concluded, in Sakharov’s words:

“A way can always be found to neutralize an [Anti-Ballistic Missile] defense system – and at considerably less expense than the cost of deploying it” and

“deployment of an ABM system is dangerous since it can upset the strategic balance [i.e. advantages the side that strikes first]” [11]

At the scientists’ forum, Sakharov described Reagan’s proposed strategic defense initiative as

 “a ‘Maginot line in space’ – expensive and ineffective”.[12]

He therefore argued that

“A significant cut in ICBMs [intercontinental ballistic missiles] and medium-range and battlefield missiles and other agreements on disarmament should be negotiated as soon as possible, independently of SDI, in accordance with the lines of understanding laid out in Reykjavik.  I believe that a compromise on SDI can be reached later. In this way the dangerous deadlock in the negotiations could be overcome.”[13]

Quite a few advisors to Gorbachev had been making the same argument to him in private, but Sakharov making the argument publicly may have helped. Two weeks after the Forum, the Soviet Union dropped the linkage between SDI and intermediate-range missiles, and Gorbachev and Reagan signed the INF Treaty in Washington at the end of the year. It took two more years, until late September 1989, however, before the linkage was dropped for ICBMs.[14] The Strategic Arms Reduction Treaty (START), complete with complex protocols for onsite inspections, was signed by Gorbachev and President George H. W. Bush in Moscow in July 1991, a month before the August hardline coup against Gorbachev. The coup was foiled by Boris Yeltsin, but Gorbachev was sidelined and the Soviet Union disintegrated by the end of the year.

Sakharov on Silo-based ICBMs

In one of his contributions at the 1987 Scientists Forum, Sakharov commented that

“A large part of the USSR’s thermonuclear capability is in powerful, silo-based missiles with multiple warheads.  Such missiles are vulnerable to a preemptive strike by the modern highly accurate missiles of the potential enemy…A country relying mainly on silo-based weapons may be forced in a critical situation to launch a first strike…today [silo-based ICBMs] constitute the principal source of military strategic instability.”[15]

In fact, Russia was already diversifying away from silo-based ICBMs. In 1985, it began deploying both the single-warhead, truck-mounted SS-25 and the 10-warhead, railway-mobile SS-24.[16] In 2022, Russia has about half of its ICBM warheads on truck-mobile missiles.[17] Russia’s other ICBM warheads are in 135 silos, with an average of about four warheads per silo – still a considerable source of instability.

The US also planned to shift to mobile missiles, but with the large 10-warhead MX missile, which was so heavy that it would have required its own heavy-duty roads, and would have used so much water for the concrete, that it was vetoed by its proposed host states, Nevada and Utah. Fifty MX missiles were eventually deployed in silos beginning in 1986; the last was retired by 2005. Today, the US has 400 Minuteman III missiles deployed in silos with single warheads.

Because of the vulnerability of these silo-based missiles, in 1978 the US “temporarily” created a launch-on-warning option. Russia may have done the same for its silo-based ICBMs. Because of false warnings, this posture has been very controversial in the US and has been a subject of great debate.[18] Strategic Command has nevertheless insisted on keeping the posture, and has launched a program to replace the silo-based Minuteman III missile with a new “Sentinel” missile in the same posture beginning around 2030.

Summary and assessment

Sakharov contributed importantly to the development of the US-Soviet nuclear-deterrence relationship in the 1950s and 1960s. But he was not at all comfortable with mutually assured destruction (MAD).

As an insider, he helped achieve the atmospheric test ban of 1963 and the ABM Treaty of 1972.

As an outsider, he helped Gorbachev and Reagan achieve the INF and START treaties, the first treaties that actually reduced nuclear weapons.

He did his weapons work as a brilliant cog in a huge bureaucratic machine.

His arms-control work he did as an autonomous human being.

Note: I have depended heavily on Sakharov’s extraordinary two-volume memoir. He began to write the first, large volume in 1978; it covers his story through his return from Gorky to Moscow at the end of 1986. Parts of the manuscript and source materials were stolen four times by the KGB, but Sakharov stubbornly reconstructed the manuscript. The second volume covers the period up to December 1989. Sakharov died of a heart attack on 14 December 1989, aged 68, but, as with most things he undertook, his memoir is excellent and provides insight into the experiences and development of this remarkable man.

Frank N. von Hippel, a physicist, is a Professor of Public and
International Affairs emeritus affiliated with Princeton University’s
Program on Science and Global Security.


[1] Andrei Sakharov, Progress, Coexistence, and Intellectual Freedom (New York Times Company, 1968) p. 37.

[2] Andrei Sakharov, Memoirs (Alfred A. Knopf, 1990) pp. 102, 182.

[3] Andrei Sakharov, “Radioactive Carbon from Nuclear Explosions and Non-Threshold Biological Effects,” Atomic Energy, June 1958; reprinted in Science and Global Security, vol. 1 (1990) pp. 175-187, https://scienceandglobalsecurity.org/archive/sgs01sakharov.pdf.        

[4] UNSCEAR, Ionizing Radiation: Sources and Biological Effects (UN, 1982), p. 243, https://www.unscear.org/docs/publications/1982/UNSCEAR_1982_Annex-E.pdf.

[5] Ionizing Radiation: Sources and Biological Effects 1982, p. 227.

[6] Health Risks from Exposure to Low Levels of Ionizing Radiation (National Academy Press, 2006) p. 281, https://nap.nationalacademies.org/catalog/11340/health-risks-from-exposure-to-low-levels-of-ionizing-radiation.

[7] Memoirs, pp. 215-217.

[8] Memoirs, pp. 231-2.

[9] Memoirs, p. 288.

[10] “Joint Soviet-United States Statement on the Summit Meeting in Geneva,” 21 November 1985, https://www.presidency.ucsb.edu/documents/joint-soviet-united-states-statement-the-summit-meeting-geneva.

[11] Memoirs, pp. 267-8.

[12] Andrei Sakharov, Moscow and Beyond: 1986 to 1989 (Alfred A. Knopf, 1991) p. 22.

[13] Ibid., p. 21.

[14] Ibid., p. 23.

[15] Ibid., pp. 19-20.

[16] Russian Strategic Nuclear Forces, Pavel Podvig, ed. (MIT Press, 2001).

[17] Hans Kristensen and Matt Korda, “Nuclear Notebook: How many nuclear weapons does Russia have in 2022,” Bulletin of the Atomic Scientists, 23 February 2022, https://thebulletin.org/premium/2022-02/nuclear-notebook-how-many-nuclear-weapons-does-russia-have-in-2022/.

[18] National Security Archive, “The [US] ‘Launch on Warning’ Nuclear Strategy and Its Insider Critics,” 11 June 2019, https://nsarchive.gwu.edu/briefing-book/nuclear-vault/2019-06-11/launch-warning-nuclear-strategy-its-insider-critics.


Magnetic Fusion’s Finest is a Glorified Beam-Target Neutron Generator

Daniel L. Jassby

JET (the Joint European Torus) is the world’s largest operating tokamak and the only magnetic-confinement fusion device presently equipped to use radioactive tritium [1].  The JET project made headlines in early 2022 by announcing a record-high fusion energy output per pulse of 59 MJ [2], but a similar yield was actually attained by Phoenix Nuclear in 2019 using fundamentally the same technique of beam-target fusion reactions [3].

The most accessible fusion reaction is the D-T (deuterium-tritium) reaction, which releases 80% of its energy in the form of high-energy neutrons. For 60 years, oil-exploration firms and others have used neutron-activation procedures based on small D-T neutron generators consisting of 100- to 200-keV D beams striking a tritiated metal target [4].  This technique gives a fusion energy gain Q of 0.001, where Q is the ratio of fusion energy output to deposited ion-beam energy.  The solid-target approach culminated with the Lawrence Livermore Lab’s RTNS [5], commissioned in 1978 and used for 9 years for radiation-damage testing. 

In the last decade Phoenix Nuclear (now absorbed by Shine Medical) increased Q to 0.004 by using a pure-tritium gaseous target with a 200- to 300-keV deuteron beam [3]. The efficiency of a gas target can be up to 4 times that of a metal-tritide target because there are no collisions of beam ions with non-reacting nuclides [6].  In 2019 Phoenix operated its system at 4.6×10¹³ n/s (125 fusion watts) continuously for 132 hours, producing 61 MJ of fusion energy [3], remarkably similar to the recent JET result.  Table 1 summarizes the properties of the largest D-T neutron generators.
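As a rough cross-check of the Phoenix figures quoted above (a sketch, not from the article; it assumes only the standard D-T reaction energy of 17.6 MeV, with one neutron per reaction):

```python
# Each D-T reaction releases ~17.6 MeV and emits one neutron, so the
# neutron rate fixes the fusion power, and power times run time gives
# the fusion yield.  Figures for rate and duration are from the text.

MEV_TO_J = 1.602e-13      # joules per MeV
E_DT_MEV = 17.6           # energy released per D-T reaction (MeV)

def fusion_power_watts(neutron_rate_per_s: float) -> float:
    """Fusion power implied by a D-T neutron emission rate."""
    return neutron_rate_per_s * E_DT_MEV * MEV_TO_J

p = fusion_power_watts(4.6e13)       # Phoenix: 4.6e13 n/s
e_mj = p * 132 * 3600 / 1e6          # 132-hour continuous run, in MJ

# ~130 W and ~62 MJ -- consistent with the quoted "125 fusion watts"
# and 61 MJ (the small gap reflects rounding in the quoted figures).
print(f"{p:.0f} W, {e_mj:.0f} MJ")
```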

JET’s 2021 D-T fusion campaign was its first since 1997.  In numerous “shots” (i.e., pulses) the JET facility injected some 30 MW of 120-keV D and T beams, along with 4 MW of radiofrequency power, into a tokamak plasma mixture of D and T.  The injected power was 40% higher than in 1997.  Figure 1 shows the waveforms for the 1997 (DTE1) and 2021 (DTE2) record-yield shots, first presented at a press conference in Feb. 2022 [2]. The 1997 shot utilized 50:50 D-T beams and target plasma [7].  To obtain the highest neutron yield in 2021, experimenters injected solely D beams into a predominantly tritium plasma target with an equilibrium D:T composition of about 20:80 [8].  Integrated over the entire 6-s pulse, reactions among the thermalized D and T ions (thermonuclear reactions) accounted for less than 1/4 of the total reaction rate, with more than 3/4 provided by deuteron beam ions reacting with tritons in the thermalized plasma (i.e., beam-thermal reactions).   The average fusion power during the pulse was 11 MW, so that Q = 0.33 [2].

Figure 1. Fusion power calculated from the measured neutron emission rate of two highly publicized D-T plasma pulses of the JET device in 1997 (DTE1) and 2021 (DTE2).  Taken from Ref. 8.

As shown in Fig. 1, the fusion power Pf in the 1997 shot rose steadily through the 5.1 s of 20-MW beam injection.  Analysis showed that this rise was due to a gradual increase in the thermonuclear reaction rate, which never attained more than 40% of the total [9].  In the 2021 shot Pf dropped monotonically during the 6.0 s of 30-MW beam injection, with a pogo-stick-like overlay, both characteristics indicating that the beams and/or plasma were under serious stress.  Impurity-ion influx, plasma instabilities, and glitches in the neutral-beam power were mainly responsible for the falling Pf and its major oscillations. In a subsequent high-yield shot [10], Pf stayed constant for 2 s, then collapsed even more rapidly than in the shot of Fig. 1. The descending Pf belies the claim that JET’s fusion output could be maintained indefinitely if JET’s resistive electromagnets were replaced by superconducting magnets.

Comparison of Beam-Target Neutron Sources

Table 1 summarizes the power relations of the 3 types of beam-target sources.  JET’s fusion Q is some 80 times larger than Phoenix’s because the injected neutral beams are ionized and magnetically confined in the plasma, reacting with the target tritons during the entire 150-ms beam slowing-down time, while the Phoenix beams just pass once through the target tritium gas.  JET’s fusion power was 85,000 times larger than Phoenix’s because JET’s beam power was 900 times greater and fusion gain Q about 85 times larger, enabling JET to achieve in 5 seconds what Phoenix produced in 5 days, albeit at more than 100 times the capital cost!  In terms of total electric power consumption per neutron produced, JET is only 7 times more efficient than Phoenix, mainly because of the power drain of its copper electromagnets.

Table 1.  Characteristics of Beam-Target D-T Neutron Generators

                                           RTNS         Phoenix      JET
Target medium                              solid        gas          plasma
Beam energy (keV)                          360          300          120 & 60
Beam power (kW)                            60           35           30,000
Fusion power (kW)                          0.10         0.13         11,000
Neutron flux (n/cm²/s)                     10¹⁰-10¹²    10¹⁰-10¹¹    10¹³
Fusion Q                                   0.0015       0.004        0.33
Power consumption (kW)                     120          60           800,000
Electric power per watt of fusion (W)      1,200        470          73
Record fusion yield per pulse (MJ)         32           61           59
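The headline ratios in the comparison above can be recomputed directly from the Table 1 entries (a quick sketch; values are as tabulated, and rounding accounts for the small differences from the quoted round numbers):

```python
# Columns per row: beam power (kW), fusion power (kW), Q, total
# electric power consumption (kW), as listed in Table 1.
table = {
    "Phoenix": (35,      0.13,   0.004, 60),
    "JET":     (30_000,  11_000, 0.33,  800_000),
}

beam_p, fus_p, q_p, con_p = table["Phoenix"]
beam_j, fus_j, q_j, con_j = table["JET"]

print(f"Q ratio:            {q_j / q_p:.0f}x")       # ~82x ("some 80 times")
print(f"fusion power ratio: {fus_j / fus_p:,.0f}x")  # ~85,000x
print(f"beam power ratio:   {beam_j / beam_p:.0f}x") # ~860x ("900 times")
# Electric power consumed per watt of fusion output (last table row):
print(f"JET efficiency edge: {(con_p / fus_p) / (con_j / fus_j):.1f}x")  # ~6x ("7 times")
```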

Thus JET’s most promoted 2021 shot can be viewed as the Phoenix neutron generator on steroids.  While Phoenix took 5 days instead of 5 s to produce the same fusion yield, Phoenix can repeat this output every 5 days.  In contrast, tritium operation in tokamaks is a rare and precious undertaking, and many shots are expended on “conditioning” plasma-facing surfaces, so that JET produced a handful of high-yield shots only once every few days at most.  Thus JET’s average output in megajoules per day was not so different from the Phoenix performance. 

The tritium burnup fraction in JET is typically only 0.01%, so that 99.99% of the injected tritium must be pumped out as well as scavenged from surfaces throughout the entire reaction vessel and every appendage.  In the Phoenix generator, any tritium that strays from the gaseous target is immediately trapped and recirculated.

The beam-target technique was the original basis of Princeton’s TFTR facility, and the fusion output of the beam-driven TFTR plasma was generally dominated by beam-thermal reactions [11].  Thus JET (and the late TFTR) are the plasma-target versions of solid and gas-target neutron generators, as are numerous smaller beam-heated tokamaks that have used only D-D reactions. 

Why does all this matter?

JET is commonly presented as a proto-thermonuclear power reactor, but it generally performs as a glorified beam-target neutron generator.  The predominantly beam-target operating mode is limited theoretically [12] to Q < 2, and is useful only for developing a fusion neutron source.  This mode has little to do with advancing the prospects for electricity from magnetic confinement fusion, which requires a Q of at least 5 for net electricity production and at least 10 to be practical, and such energy gains can be realized only with thermonuclear reactions.

While JET has produced plasmas in which more than 50% of the reactions were thermonuclear, the total fusion output and Q were much less than those of shots dominated by beam-thermal reactions.  The maximum thermonuclear Q in JET’s 2021 D-T campaign was around 0.2.  It’s not credible that the 59-MJ pulse and similar shots with a thermonuclear Q near 0.1 “demonstrate powerplant potential,” as the press release claimed.  Despite rosy computer predictions, it’s not yet clear whether any tokamak can advance beyond the level of a low-Q beam-driven neutron source [13].

Is there a practical application for a low-Q tokamak neutron source? 

The world has an unmet demand for neutrons for research and applications such as isotope production [14].  If the tokamak enterprise could meet the challenge of producing continuous, non-degraded neutron output, perhaps R&D enterprises could eventually support a half dozen such “volume neutron” generators worldwide.  The aggregate total fusion power would amount to no more than 100 MW, consuming 5 kg of tritium annually in continuous operation.  That amount could be furnished by the naturally occurring tritium production in the heavy water of all the world’s CANDU reactors [15], provided that all of it is extracted.  Another crucial challenge is to vastly increase the egregiously low tritium utilization efficiency of 0.01% that characterized JET and TFTR, as the unburned 99.99% of injected tritium cannot be completely recovered from the plasma-facing components.
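The 5 kg/yr figure above follows from the fusion power alone, since each D-T reaction consumes one triton. A rough check (a sketch, not from the article; only standard constants are assumed):

```python
# Each D-T reaction burns one triton (mass ~3.016 u) and releases
# ~17.6 MeV, so a given fusion power fixes the tritium burn rate.
# Note this counts tritium actually burned; at a 0.01% burnup
# fraction the tritium *throughput* would be vastly larger.

MEV_TO_J = 1.602e-13
E_DT_J   = 17.6 * MEV_TO_J        # joules released per reaction
M_T_KG   = 3.016 * 1.6605e-27     # mass of one triton, kg
YEAR_S   = 3.156e7                # seconds per year

def tritium_kg_per_year(fusion_power_w: float) -> float:
    reactions_per_s = fusion_power_w / E_DT_J
    return reactions_per_s * M_T_KG * YEAR_S

# ~5.6 kg/yr for 100 MW continuous, matching the ~5 kg quoted above.
print(f"{tritium_kg_per_year(100e6):.1f} kg/yr")
```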

The operating Q is not critical if one is willing to trade electricity for neutrons, but even with Q close to 1.0 each facility would consume nearly 100 MW(e) when using superconducting magnets. 

Another potential application is the fusion-fission hybrid reactor, where the fusion neutrons enter a surrounding blanket containing natural or depleted U, and instigate a chain of fission reactions.  Energy multiplication can be a factor of at least 10 while the assembly is still very subcritical.  Here too Phoenix-Shine is actually pioneering the hybrid reactor technique by deploying a uranium solution surrounding its gas-target fusion neutron source in a molybdenum-99 production facility currently nearing completion [16].  The purpose is not energy multiplication but simply to maximize fission reactions per fusion neutron, because molybdenum-99 is one of the products of fission reactions.

Tokamak promoters should heed Yogi Berra’s philosophy:  “If you don’t know where you are going, you will end up someplace else.”

Daniel L. Jassby
Retired from Princeton Plasma Physics Lab

[1] P. H. Rebut, et al., “The Joint European Torus,” Nucl. Fusion 25 (1985) 1011-1022.

[2] Daniel Clery, “European fusion reactor sets record for sustained energy,” Science, Feb. 9, 2022 (online). doi:10.1126/science.ada1098.

[3] “Phoenix and SHINE achieve new world record,” Globe Newswire, Oct. 2, 2019 (online); see Wikipedia entry for “Phoenix (Nuclear Technology Company).”

[4] C. E.  Moss, et al., “Survey of Neutron Generators for Active Interrogation,”  LANL report LA-UR-17-23592 (2017).

[5] D. W. Heikkinen and C. M. Logan, “RTNS-II: Present Status,” IEEE Trans. Nucl. Science Vol. NS-28, April 1981.

[6] H. H. Barschall, “Intense Sources of Fast Neutrons,” Ann. Review of Nuclear Science 28 (1978) 207-237.

[7] M. Keilhacker, et al, “High fusion performance from D-T plasmas in JET,” Nucl. Fusion 39 (1999) 209.

[8] “JET makes history, again,” ITER Newsline 14 Feb. 2022, iter.org (online).

[9] R. V. Budny, et al, “Local transport in JET high-confinement mode plasmas,”  Phys. Plasmas 7 (2000) 5038.

[10] M. Nocente, et al., “Fusion product measurements in the JET Deuterium-Tritium-2 campaign,” High Temp. Plasma Diag. Conf., May 2022, paper IA-01.

[11] R. V. Budny, et al., “Simulations of deuterium-tritium experiments in TFTR,”  Nucl. Fusion 32 (1992) 429.

[12] D. L. Jassby, “Neutral-Beam-Driven Tokamak Fusion Reactors,” Nucl. Fusion 17 (1977), 309.  doi:10.1088/0029-5515/17/2/015

[13] Daniel Jassby, “The Quest for Fusion Energy,” Inference, Vol. 7, No. 1 (May 2022). doi: 10.37282/991819.22.30

[14] David Kramer, “Interruption at U.S.’s most productive neutron source,”  Physics Today 74 (Oct. 2021), 26;  “Neutrons for the Nation,” APS Panel on Public Affairs Report, American Physical Soc., July 2018.

[15] M. Coleman and M. Kovari, “Global Supply of Tritium for Fusion R&D,”  IAEA Report IAEA-CN-258/FIP/P3-25 (2018).

[16] David Kramer, “DOE medical isotope campaign nears completion,” Physics Today 75 (Feb. 2022) 24;  Tami Freeman, “Record breaking fusion reaction could transform medical isotope production,” Physics World, Oct. 4, 2019 [online].


Science, Technology, and Innovation for Sustainable Development

E. William Colglazier

Sustainable development grew in popular usage with the environmental movement in the 1970s.  It was often associated with concern about environmental degradation, through concepts like “limits to growth” and the “tragedy of the commons.”[1]  The most widely used definition came from the 1987 United Nations report Our Common Future, colloquially known as the Brundtland report: “Sustainable Development is development that meets the needs of the present without compromising the ability of future generations to meet their own needs.”[2] The report emphasized three components of sustainable development: environmental protection, economic growth, and social equity.

In 1992 the United Nations Conference on Environment and Development in Rio de Janeiro, the Earth Summit, promoted sustainable development with its Rio Declaration of principles as a guide for countries.[3]  The Earth Summit also led to the Climate Change Convention and opened for signature the Convention on Biological Diversity. Eight years later, in 2000, all member states of the UN agreed on the Millennium Development Goals for making progress over fifteen years to help developing countries.[4] The eight goals with their targets focused on eradicating extreme poverty and hunger; achieving universal primary education; promoting gender equality and empowering women; reducing child mortality; improving maternal health; combating HIV/AIDS and other diseases; ensuring environmental sustainability; and building new partnerships for development.  Progress on many of these goals was indeed made over the following decade, but much remained to be done when the member states began negotiations on what new goals might follow.

In September 2015, 193 member states adopted seventeen Sustainable Development Goals (SDGs) as the key elements of what would become the UN 2030 Agenda.[5] The goals encompassed a broad social, economic, and environmental agenda serving the five P’s: PEOPLE (no poverty; zero hunger; good health and well-being; quality education; gender equality); PROSPERITY (clean water and sanitation; affordable and clean energy; decent work and economic growth; industry, innovation, and infrastructure; reduced inequalities); PLANET (sustainable cities and communities; responsible consumption and production; climate action; life below water; life on land); PEACE (peace, justice, and accountable institutions); and PARTNERSHIPS.  A number of advanced countries, including the United States, advocated for a small number of goals, arguing that too many would lead to no priorities.  Their concern may have been motivated in part by a desire to avoid the additional financial assistance demanded by developing countries.  The large number of goals, however, prevailed. As one observer from the business sector said, the seventeen SDGs approved by the UN in 2015 should be seen as a significant gift to humanity, similar to what can be said about the Universal Declaration of Human Rights proclaimed by the UN in 1948.[6]

The SDGs can be viewed as an aspirational, practical, and political definition of sustainable development applying to all countries.  The goals have longevity to 2030, and they reflect value judgments made specific with 169 targets and 248 indicators for gauging progress.  They aim to advance all the SDGs together by maximizing synergies and minimizing tradeoffs.  They call for “no one to be left behind” and for mobilizing stakeholder engagement. The sixteenth goal focuses on peace, justice, and accountable institutions and the seventeenth on partnerships, but there is no goal for democratic governance. 

While I was serving as the Science and Technology Adviser to the Secretary of State (2011-14) during much of the period when the SDGs were being negotiated, I was not directly involved in the negotiations.  However, I was contacted by UN professional staff about participating in expert meetings on the role of science, technology, and innovation (STI) for sustainable development.  They had been directed to investigate this topic because it had become likely that the UN 2030 Agenda would contain a component focused on STI for the SDGs.   The real reason I was asked was that they had read a report from the U.S. National Academies of Sciences, Engineering, and Medicine entitled Our Common Journey: A Transition Toward Sustainability.[7]  It was produced in 1999, when I was executive officer of the National Academy of Sciences.   The title was a play on Our Common Future, to indicate that achieving a more sustainable world would be a journey.  Recognizing that science, technology, and innovation must have an important role, the report emphasized creating a partnership between scientific communities and societies that would engender a journey of learning and doing, adaptive management and social learning, in addressing global goals. As the report stated, “Any successful quest for sustainability will be a collective, uncertain, and adaptive endeavor in which society’s discovering of where it wants to go is intertwined with how it might get there.” It reinforced using knowledge “intelligently in setting goals, providing needed indicators and incentives, capturing and diffusing innovation, carefully examining alternatives, establishing effective institutions, and, more generally, encouraging good decisions and taking appropriate actions.”[8]

The part of the UN 2030 Agenda focused on science, technology, and innovation was named the Technology Facilitation Mechanism (TFM). The diplomats who created it did not know how best to harness STI, but they knew STI would be important if sustainable development was to be achieved.  The TFM was charged to make progress, and it had three components: an annual multi-stakeholder STI Forum held at the UN, a Ten Member Group (10MG) of advisors appointed by the UN Secretary General for two-year terms, and an Inter-Agency Task Team (IATT) with one representative from each of more than forty international agencies.  I was co-chair of the 10MG from 2016-18.  In the 2017 STI Forum, we emphasized five actions: (i) using systems analysis to help maximize synergies and minimize tradeoffs among the interdependent SDGs; (ii) emphasizing STI capacity building in each country, including building human capacity and strengthening the science advisory ecosystem; (iii) creating action plans and roadmaps that incorporate STI into national planning efforts; (iv) expanding involvement through public-private partnerships and efforts that help create business opportunities in pursuing the SDGs; and (v) conducting “deep dives” on every SDG, because only a comprehensive approach with stakeholder participation can yield realistic roadmaps at the global, regional, national, local, and institutional levels.

One initiative that the 10MG championed was the creation of STI for SDGs roadmaps at the national level.  The idea was to focus on the intersection of three plans often produced independently within countries: the national plan emphasized by political leaders, the STI capacity-building plan often championed by science and technology agencies, and the fledgling national plan for making progress on selected SDGs.  To advance this initiative, an IATT workgroup on roadmaps was created in 2017, and preparation of a guidebook began in 2018.[9] Five pilot countries were announced in 2018 to prepare their STI for SDGs roadmaps. These countries had approval at the highest level of their governments, with assistance and funding promised by several developed countries and international agencies.  In 2021 the pilot countries expanded to six: Ethiopia, Ghana, India, Kenya, Serbia, and Ukraine. 

A number of global reports were produced by groups of scientists focusing on how to make progress on STI for the SDGs.  They included the World in 2050 reports, which emphasized six big areas where transformations and focus were needed: human capacity and demography; consumption and production; de-carbonization and energy; food, biosphere, and water; smart cities; and the digital revolution.[10] Another was the Global Sustainable Development Report (GSDR), to be produced every four years by fifteen independent scientists appointed by the UN Secretary General. The first GSDR appeared in 2019, entitled “The Future is Now: Science for Achieving Sustainable Development.”[11]

Even during those early years of the UN 2030 Agenda, it was recognized that there are many challenges to making progress on the SDGs.  Aspirational rhetoric is inspiring, but effective policies, real action, and adequate funding are hard to implement and sustain.  The targets do not cover all the essential elements.  Many key indicators are either missing or lack adequate data.  The voluntary national reviews submitted by member states to the UN High-Level Political Forum are useful, but they are not real action plans.  And these challenges are minor compared to the enormous political challenges at the national and global levels that create roadblocks to progress on each and every one of the SDGs. Even the incredibly rapid progress of the science and technology revolution, yielding a growing number of emerging technologies, is seen not only as creating potential new opportunities but also as possibly creating powerful disruptions and security threats to societies.

Should we be optimistic or pessimistic given the last 2.5 years? The UN Secretary General’s speech to world leaders at COP26 in November 2021 laid out the stark choices regarding climate change.[12]  The challenges of the covid pandemic, the Russian invasion of Ukraine, the worsening relations between the U.S. and China, the retreat from globalization, and the impending economic recession worldwide have created heavy headwinds for making progress on the SDGs.  Two of the pilot countries for STI for SDGs roadmaps, Ethiopia and Ukraine, are involved in conflicts that have disrupted their plans for making progress on sustainable development, illustrating once again that wars may be the greatest threat to the UN 2030 agenda.

Yet efforts are continuing on the SDGs with some of the most promising initiatives at the subnational and local level.  Although the United States has not produced a Voluntary National Review (VNR), Brookings and the UN Foundation have produced in 2022 The State of Sustainable Development Goals in the United States as a shadow VNR.[13]  The National Academies of Sciences, Engineering, and Medicine has initiated a study now underway on “Operationalizing Sustainable Development” highlighting some of the constructive initiatives at the local and global level.[14]  The U.S. rejoined the Paris Climate Agreement, and many countries have emphasized their commitment to reducing greenhouse gas emissions.  The STI Forums continue.  Influential public communicators such as the young activist Greta Thunberg and the science fiction writer Kim Stanley Robinson are reaching out to young people around the world. Yet the impact of the global conflicts now underway cannot be minimized.

My own view is our greatest legacy to future generations, besides avoiding wars, terrorism, and conflicts, will be building knowledge-based societies and accelerating expansion of scientific knowledge and useful technologies.  The 1987 Brundtland Report could have highlighted the important role of science, technology, and innovation by defining sustainable development as meeting the needs of the present while expanding the ability of future generations to meet their own needs.  We must all remain optimistic that humanity has the power, and hopefully the wisdom, to use science, technology, and innovation to create a more sustainable world for the benefit of all people and the planet.

E. William Colglazier
Senior Scholar, Center for Science Diplomacy,
American Association for the Advancement of Science


[1] https://www.clubofrome.org/publication/the-limits-to-growth/ and https://www.science.org/doi/10.1126/science.162.3859.1243

[2] https://sustainabledevelopment.un.org/content/documents/5987our-common-future.pdf

[3] https://www.un.org/en/development/desa/population/migration/generalassembly/docs/globalcompact/A_CONF.151_26_Vol.I_Declaration.pdf

[4] https://www.un.org/millenniumgoals/

[5] https://sdgs.un.org

[6] Peter Bakker, President and CEO of the World Business Council for Sustainable Development, 2016.

[7] https://nap.nationalacademies.org/catalog/9690/our-common-journey-a-transition-toward-sustainability

[8] op.cit. page 3

[9] https://sdgs.un.org/sites/default/files/2021-06/GUIDEBOOK_COMPLETE_V03.pdf

[10] https://previous.iiasa.ac.at/web/home/research/twi/Report2018.html

[11] https://sustainabledevelopment.un.org/globalsdreport/2019

[12] https://www.un.org/sg/en/content/sg/speeches/2021-11-01/remarks-the-world-leaders-summit-cop26

[13] https://sdg.iisd.org/news/brookings-brief-calls-for-stronger-us-leadership-on-sdgs/

[14] https://www.nationalacademies.org/our-work/operationalizing-sustainable-development


They Knew: The Federal Government’s Fifty-Year Role in Causing the Climate Crisis.

by James Gustave Speth, The MIT Press, Hardcover $28, 304 pages, ISBN 978-0-262-54298-2. 

Retired law professor James Gustave Speth chaired the US Council on Environmental Quality under President Carter, cofounded the World Resources Institute, cofounded the Natural Resources Defense Council, and for six years administered the UN Development Programme.  To understand his new book, one needs to know its context, which begins in 2010 when, inspired by climate scientist James Hansen’s book Storms of My Grandchildren, Julia Olson founded Our Children’s Trust.  OCT is a public-interest law firm that represents youth in lawsuits against governments to “save our children’s only planet from government-sanctioned climate destruction.”  In lawsuits filed in seven U.S. states and 20 foreign countries, OCT makes the case that governments have sanctioned an energy system dominated by fossil fuels.  One of these lawsuits was Juliana v. United States, filed in 2015.  An appendix written by Olson and Philip Gregory describes how governmental legal maneuvers have prevented this case from coming to trial thus far. 

The basis for Juliana is a state-created danger claim under the Constitution's due process clause, a claim that arises when the state (1) places the plaintiff in danger and (2) acts with indifference despite the known or obvious nature of that danger.  The lawsuit intends to establish that U.S. energy policies have placed children in a danger to which government has been deliberately indifferent, and seeks a court mandate that future energy policies reduce atmospheric carbon dioxide concentration to below 350 parts per million by year 2100. 

In 2018, Speth wrote an expert report for the Juliana lawsuit, and he will present it as an expert witness at the trial, if it occurs.  Meanwhile, OCT made Speth's report available to the general public by publishing it as the text of They Knew, updated to include the Trump presidency. 

Speth's book is a history of what has been learned, at the level of presidential administrations, about the climate effect of carbon dioxide from fossil fuel combustion.  The first chapter relates the era before Jimmy Carter, and the next seven chapters cover the seven subsequent presidential administrations.

Despite what was known at the outset of the Carter Administration about the eventual climate impact of carbon dioxide from fossil fuels combustion, the more immediate energy problem was a shortage of oil for transportation, framed by the OPEC oil embargo of 1973-1974.  America was oil-poor but coal-rich.  Therefore, in addition to expanded drilling for oil on the Outer Continental Shelf, Carter established the Synthetic Fuels Corporation to facilitate production of oil-substitutes from coal and oil shale.  Neither measure made much impact on greenhouse emissions.  U.S. dependency on fossil energy was about 90 percent at the beginning of the Carter years, and declined by only 2-3 percent by the end.  Meanwhile, the fossil fuel mix was shifting toward coal and away from oil imports.  Although Democratic presidents displayed greater concern about the climate consequences of continued reliance on fossil fuels than the Republicans who preceded and succeeded them, Speth faults Carter and all his successors for this continued fossil fuel reliance.

Speth recounts some notable world events related to climate science.  During what he calls the “first Bush administration,” the U.S. signed the UN Framework Convention on Climate Change, after weakening it.  The Convention commits its signatories to “stabilization of greenhouse gas concentrations in the atmosphere at a level that would prevent dangerous anthropogenic interference with the climate system.”  The Kyoto Protocol, the first major agreement aimed at implementing the UNFCCC, was adopted in December 1997 in Kyoto, Japan, and eventually joined by roughly 190 countries.  In it, developed countries agreed to reduce their overall greenhouse gas emissions by at least 5 percent below their 1990 levels by 2008-2012, with the US obligated to a 7 percent reduction.  

While the protocol was being negotiated, Senators Robert Byrd and Chuck Hagel moved a sense-of-the-Senate resolution that “no protocol was acceptable unless it included the developing countries and would not harm the U.S. economy,” which passed 95-0.  This proved to be a problem at the Kyoto conference.  Although Vice President Gore flew to Kyoto to try to salvage the negotiations, the developing countries balked at having targets to meet, arguing that the developed countries had emitted most of the greenhouse gases.  The Kyoto Protocol was never sent to the U.S. Senate for ratification, and G. W. Bush repudiated it in 2001.  Subsequently, President Obama also participated in international climate summits, culminating in the 2015 Paris Agreement, only to have Donald Trump withdraw from it two years later.

In Speth’s chronologies we can see how our understanding of climate science has evolved and expanded.  The 2007 Fourth Report of the Intergovernmental Panel on Climate Change said that “Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic GHG concentrations.”  But the Fifth Report seven years later said that “Human influence on the climate system is clear [no longer “very likely”], and recent anthropogenic emissions of greenhouse gases are the highest in history.”  At the outset, the focus of the reported research was concern about global temperature and sea level, but observant readers can see that, around the time of the Clinton administration, the reports extend their coverage to extreme weather events and the living conditions for wildlife.  And, while generating electricity from renewables was a pipe dream during the Carter administration because of its cost, Speth cites a 2012 report from the National Renewable Energy Laboratory, which states that by 2050, "Electricity supply and demand can be balanced in every hour of the year in each region with nearly 80% electricity from renewable resources, including nearly 50% from variable renewable generation.”

Because Speth wrote the text of this book as an expert paper for the Juliana case, he closes his coverage of the Trump administration with gratitude for Congressional action on behalf of the 21 youthful plaintiffs: a Concurrent Resolution to protect the fundamental rights of children to a climate system capable of sustaining human life, and to support the Juliana plaintiffs by recognizing that the current climate crisis disproportionately affects the health, economic opportunity, and fundamental rights of children.  The resolution also demanded that the United States develop a national, comprehensive, science-based recovery plan to meet necessary emissions reduction targets.

Because Speth wrote the text of They Knew as an expert paper for a court case, it is meticulously documented.  There are 43 pages of footnotes, 28 pages of references, and 25 pages of index.  The reader should also be conversant with such legal terms as “writ of mandamus,” “interlocutory appeal,” “summary judgment,” and “en banc.”

John Roeder


Miseducation: How Climate Change Is Taught in America

by Katie Worth, Columbia Global Reports (paperback and ebook), ISBN 978-1-7359136-4-3

This book contains only seven chapters plus an epilogue summarizing current developments in public school science teaching.  Nevertheless, this slender volume provides a thorough review of how climate science is taught in American public schools; I learned a lot about the topic by reading Worth’s book.  The author, a journalist, interviewed students and teachers from public schools around the country and provides a lengthy reference list.  She accuses wealthy energy companies of acting like tobacco companies: just as big tobacco downplayed the hazards of smoking in free classroom materials, the energy industry downplays the threat of climate change in the educational materials it distributes.  Little of this is new, and this conscientious author traces her subject back to the Scopes trial and the controversies over Darwin’s theory of evolution.  She documents energy industry efforts to plant in children’s minds the idea that human-caused climate change from burning fossil fuels, like evolution, is untrustworthy science, using classroom visits by trained staff and free educational materials distributed to students and their teachers.

Worth points out that what public school students learn about climate change depends on the actions of school districts and state legislators, which of course vary widely; children therefore may or may not learn much about the subject.  Even publishers of science textbooks shy away from climate change and what we should do about it, fearing that controversy may upset students and parents, and even the Next Generation Science Standards have been watered down.  The author presents several examples of textbooks that adhere to Texas standards and fail to address climate-change science adequately, because their timid publishers fear loss of sales more than damage to Earth from climate change.

Thus public school students may easily fail to understand that climate scientists agree, on the basis of hard evidence, that humans have already caused climate change.  Experts agree: it’s real and it’s us!  Climate change is raising sea levels around the world, and there is already evidence that rising carbon dioxide and methane concentrations are driving up global temperatures and ocean levels.  It’s dangerous, but there’s hope!

I’m glad the book ends with hope.  As a physicist, I have read a lot about measurement techniques and hoped this volume would provide an update on the state of the accumulating data, but it relies strictly on anecdotes, which admittedly make amusing reading.  As a whole, this slender volume is worth reading because it points out that climate science depends on public education.  The Covid-19 pandemic has highlighted public distrust of science, and as physicists we cannot afford to neglect public education in science.  Fortunately, we are all familiar with measurements and their value to all scientists, and some of us are accustomed to inserting physics into politics.  Our value is in fact publicly recognized.  This book is of value to those who are getting started in this important endeavor.

Ruth Howes