
From the Editor

Oriol T. Valls

In this July issue of the Forum newsletter we have our first ever (as far as I know) interview article. The interviewer is our Assistant Editor, Laura Berzak Hopkins, and the interviewee is David Saltzberg, well known for his work as the scientific consultant for the CBS comedy show “The Big Bang Theory”. The depiction of physics professionals in this show has done much to change the public perception of physicists.

Several other articles in this issue deal with more traditional Forum topics. We have an article by Joshua Pollack on crisis stability, based on his presentation at the last March Meeting. Keivan Stassun is the author of the article on advancing minorities and women in PhD programs. We also have an article by Brian Carter on STEM education and one by Wallace Manheimer on solar power and climate change. Also, a news item on the March for Science. And our usual quota of book reviews.

Please note that we are still looking for a Media Editor: see the ad in this issue.

We plan to have something special in the October issue in honor of the 150th anniversary of Marie Curie’s birth.

Please continue to send articles and suggestions for articles. This newsletter is to a large extent reader driven. We are very open as to topics and welcome controversy, as I explained in the Editor's note in the October 2016 issue.

Oriol T. Valls
University of Minnesota

Letters to the Editor

Lessons from the March for Science
I am a progressive Democrat. In college, my physics classes only occasionally took priority over my student organizing around marriage equality, and I have the grades to prove it. During the 2006 midterms, I made calls on behalf of Democratic Congressional candidates from my little apartment in the countryside of eastern France, in between assembling the parts of the ATLAS alignment system. When I finished my graduate classes and qualifying exams at the end of 2007, I took a leave of absence from school to direct campaign offices for progressive organizations including the Democratic National Committee. After I finally finished my Ph.D., I received a AAAS Science and Technology Policy Fellowship where I worked as a nuclear policy advisor in the office of Senator Markey (D-MA). With that background, it should be no surprise that I believe scientists need to have an active voice in politics.

However, I am probably the last person you would expect to be encouraging my colleagues to make a conservative case for science. Yet two months ago I found myself organizing 13 scientists and engineers, along with a team of editors, to draft, edit, and place op-eds discussing why we would be joining the March for Science, using conservative themes such as the feelings of patriotism produced by American leadership in science, and avoiding topics like climate change.

The idea struck me while I was sitting in an extremely packed panel discussion at the recent AAAS annual meeting in Boston. The topic was defending science in a post-Trump world and the question of the political polarization of science was hot on people’s minds. Jane Lubchenco, former Administrator of NOAA and a marine ecologist, encouraged the audience not to make science partisan. Some expressed concerns that the March for Science would be interpreted by the media as a partisan event, with liberal elites protesting the Trump administration, and therefore should not be happening.

This was an argument I had been hearing from many in the science community since the day the march was announced. While it was fascinating to see everyone develop a sudden interest in political science and hypothesize about the political impact of the march, the reality was the experiment would be conducted either way.

Instead of joining the debate, I decided to work within the confines of the data. If the concern was that the March for Science would paint science as a liberal issue, then we should make sure the conservative case for the march was made. With that I set aside my weekends leading up to the march and developed a project to place op-eds written by scientists in local papers of the more conservative parts of the country.

Along with a few friends, we quickly put together talking points and a short guide to writing op-eds. We recruited our colleagues, their friends, and some students into the project. We offered to provide assistance with editing and placement; in exchange, they had to stick to an ambitious timeline and to the talking points.

When I started this project I thought, at most, we would get half of the pieces published. The week before the March we began submitting them to our hometown newspapers all over the country. Within a day, we had our first acceptance, from the Shreveport Times in Louisiana. Almost every day thereafter we had one or two more accepted for publication until, the day before the March, we reached 100% published.

The day of the March, I found myself on the rain soaked National Mall with thousands of people carrying all kinds of creative pro-science signs. To see so many science supporters all in one place was inspiring to say the least. Yet part of me was stuck wondering how the media was covering the event.

The crowd was so large that checking the news on my cell phone wasn’t an option. Instead, I found my way over to a CNN reporter who was occasionally taking live shots in front of the main stage. To my pleasant surprise he never described the event as a protest against the Trump administration, but focused on the crowd’s support for scientific research. And, from what I can tell, this was largely the story that came out of the day.

I’m sure our 13 op-eds were just a small drop in the bucket when it came to framing the media narrative about the March, but I think there are still some important lessons here. First, the media is clearly hungry to hear the voice of scientists and we should be taking advantage of that to tell our stories. Second, because science is not a partisan issue, we can make a case for it from any political perspective. Most importantly, the March for Science showed us we can even put some passion behind our political arguments without making science a polarized issue.

Dan Pomeroy
Staff, MIT International Policy Lab

2018 April Meeting Forum

At the 2018 APS April Meeting the Forum is sponsoring Session C06, “Nuclear Weapons and Ballistic Missile Defense”. Joel Primack will be the chair. The session runs from 1:30 to 3:18 on Saturday, April 14, in room B130. The titles and speakers are:

“North Korean Long-Range Ballistic Missiles and US Missile Defenses” by Ted Postol

“Missile Defense and Space Weapons” by Laura Grego

“US Nuclear Weapons Modernization” by Roy Schwitters

Science on Television: Entertaining, Inspiring, Accurate

Written by Laura Berzak Hopkins, Associate Editor of this newsletter and Design Physicist at Lawrence Livermore National Laboratory, in conversation with David Saltzberg, Professor at UCLA and main scientific consultant for “The Big Bang Theory”

Fade In:

Int. Laura’s Kitchen – Day

On a quiet afternoon, Laura and her family are talking about tv shows over lunch.

Laura's Father – “I just don’t like the characters; they’re not realistic.”

Laura – “Dad, seriously?? These characters are spot on! I work with ‘Sheldon’!”

Fade Out:

This would be the screenplay describing my family's conversation about the extremely popular television comedy, “The Big Bang Theory”. Now, if you don't know the characters, I highly recommend catching a clip, and I guarantee that you will either know a ‘Sheldon’ or perhaps be a ‘Sheldon’, not that there's anything wrong with that. The characters are researchers at Caltech, full of the quirks and quips that we as physicists all know. The show itself is a comedy – not intended to accurately represent all science or all scientists (Back to the Future doesn't exactly get the science right, but it's still a classic and beloved movie). But what's particularly great is that Big Bang Theory isn't just a comedy where the backdrop is science; instead, science is woven into the storylines and character development, evolving as the show's characters evolve, in a way that's entertaining, engaging, and pretty accurate.

We (by which I mean all physicists) can thank David Saltzberg for this endearing and engaging portrayal. Saltzberg is a particle physicist at UCLA who collaborates at CERN and with the US Antarctic Program searching for high-energy particles (TeV or EeV levels). But, wearing his other hat, he is the main scientific consultant on the Big Bang Theory, consulting each week on the upcoming episode. “I never expected to be drawn into show business,” Saltzberg comments, but as he notes, one role of a university is to help the local community and local industry. It just so happens that for UCLA, the local community is Hollywood, and the local industry is the entertainment business.

David Saltzberg

Despite the seemingly polar-opposite nature of a lab and a movie set, Saltzberg has noted striking similarities – “a sound stage is basically the same as a high bay, without the cranes”. Essentially, a movie or TV set is an empty box where the work gets done. There's the equivalent of a PI – head writers making the final call on creative decisions; technicians working on the electrical, carpentry, and painting; and producers organizing it all, essentially in the project manager role. The end product may be different, but it's still people who are putting it all together, and so the process has developed along a parallel track of organization. Moreover, each department is filled with people who have decades of experience and have gone through their own trials and tribulations to get to their current position. Saltzberg highlights the dedication of the writers, who aim to nail down each aspect of a scene and character in order to portray an accurate representation, one that can draw the audience in and convince them of the characters and their interactions. Experimental science has much of the same, with successful teams built over time, made up of dedicated and passionate people working toward specific goals.

With Big Bang Theory under his belt, Saltzberg has expanded his consulting to include a new role with another project, Manhattan. This series is a fictionalized account of the scientists who worked on the Manhattan Project, including the roles of laboratories like Los Alamos, Hanford, and Oak Ridge. For this series, Saltzberg's contribution is of a different flavor – as opposed to advising on contemporary science to keep plots interesting, here there are complex story lines which need specific details about the gadget being built. Instead of looking through all of modern physics for inspiration, he needed to know a smaller subset of high energy density physics more deeply. While Saltzberg didn't start out as an expert in 1940s weapon physics, he was able to quickly get up to speed because his science background provided the framework to learn the new physics. In doing so he demonstrated an important skill – a skill that we as researchers often don't recognize that we even have – the ability to be faced with a question to which we do not know the answer and to forge forward with, need I say it, researching until we develop an answer or at least a hypothesis for how to develop an answer.

Saltzberg notes that feedback from colleagues on his roles with the various shows has shifted from skeptical to highly positive. Initially, there was concern over how scientists would be portrayed – even my father (someone who has a physicist as a daughter) doesn’t have much of a view of the personality side of scientists. Each character in any single episode might be one-dimensional, but over time, they develop; a story can’t be told solely with one-dimensional characters. Saltzberg notes, “For Big Bang Theory, it’s a comedy; it isn’t intended to be about perfect people, and the show has great writing and acting.” It all comes together to be about interesting, relatable people doing interesting science, which starts to become more relatable over time as well.

For as many personalities as there are within physics, there are as many ways to be involved with society and with communicating why what we as scientists do is so exciting and important. This can be as simple as having conversations with non-scientist coworkers about the Astronomy Picture of the Day. Or, for Saltzberg, it has become a unique combination of being an active researcher and a scientific consultant for television shows. Our image as scientists is in good hands, as is the search for ultra-high-energy particles.

Is Crisis Stability Still Achievable?

Based on a talk given at the APS March Meeting, New Orleans, March 16, 2017

Joshua H. Pollack, James Martin Center for Nonproliferation Studies (CNS), Middlebury Institute of International Studies at Monterey (MIIS)

The Cold War Origins of Crisis Stability
“Stability” is a central concept, or family of concepts, in the analysis of nuclear posture. The Obama administration’s 2010 Nuclear Posture Review, for example, described “the challenge of ensuring strategic stability” between the United States and Russia, and between the United States and China, as a “familiar” problem, to be addressed alongside more pressing concerns, particularly the dangers of nuclear proliferation and nuclear terrorism.1 In the intervening years, proliferation and terrorism threats have arguably become less acute.2 The problems of stability appear to have grown.

Strategic stability is traditionally understood in terms of two concepts: “crisis stability,” also known as “first-strike stability,” and “arms-race stability.” Broader applications have also been suggested, but these two basic ideas have persisted.3 The metaphor of stability appears to have arisen from economics, in particular from game theory.4

The locus classicus for crisis stability may be a paper written in 1958 by the economist Thomas Schelling, “The Reciprocal Fear of Surprise Attack,” which later appeared as a chapter in his celebrated book The Strategy of Conflict (1960). It hypothesizes a dynamic process of compounding fears of an enemy’s attack, a process that places great pressure on each of two parties in a confrontation to strike the first blow themselves, even if they would prefer no violence at all:

This is the problem of surprise attack. If surprise carries an advantage, it is worth while [sic] to avert it by striking first. Fear that the other may be about to strike in the mistaken belief that we are about to strike gives us a motive for striking, and so justifies the other’s motive. But if the gains from even successful surprise are less desired than no war at all, there is no “fundamental” basis for an attack by each side. Nevertheless, it looks as though a modest temptation on each side to sneak in the first blow — a temptation too small by itself to motivate an attack — might become compounded through a process of interacting expectations, with additional motive for attack being produced by successive cycles of “He thinks we think he thinks we think … he thinks we think he’ll attack; so he thinks we will; so he will; so we must.”5
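Schelling's compounding of expectations can be illustrated with a toy recursion. The sketch below is only an illustration, not Schelling's own formalism; the baseline temptation and the feedback weight placed on the other side's perceived intent are purely hypothetical parameters.

# Toy illustration (not Schelling's formalism) of how a temptation too small
# to motivate an attack by itself can compound through interacting expectations.
# All parameters are hypothetical.

def compounded_incentive(base_temptation, feedback, rounds):
    """Iterate the 'he thinks we think...' cycle and return the resulting
    incentive to strike first, capped at 1.0."""
    incentive = base_temptation
    for _ in range(rounds):
        # Each side's incentive grows with its estimate that the other side,
        # reasoning the same way, is about to strike.
        incentive = min(1.0, base_temptation + feedback * incentive)
    return incentive

if __name__ == "__main__":
    print(compounded_incentive(0.05, 0.0, 10))   # no interaction: stays at 0.05
    print(compounded_incentive(0.05, 0.97, 50))  # strong interaction: compounds toward 1.0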

In the realm of nuclear weapons, the United States and Soviet Union took measures that militated against this danger, both unilaterally and cooperatively. Unilaterally, each reduced the other side’s ability to act in “preclusive self-defense” (Schelling’s phrase) by making their own weapons difficult to destroy (through “hardening” in concrete silos) or difficult to locate (through forms of mobility and stealth). Analogous measures have been taken to protect national decision-makers during a crisis, involving either underground bunkers or aircraft. Cooperatively, the superpowers reached arms control agreements to ensure crisis communications between leaders, limit the numbers and types of nuclear weapons and delivery systems on either side, and provide each side with “transparency,” meaning access to information about the basing and composition of the other side’s forces, and therefore greater confidence in the other side’s compliance with agreements.6

At the same time, because military establishments are responsible for preparing to fight wars, they have tended to take steps that would improve their ability to destroy enemy forces, e.g., improved accuracy, which contributes to instability by improving the chances of a successful first strike, ceteris paribus. This problem reflects the enduring tension between posture choices designed to reduce the level of harm to one’s own country in the event of war (“damage limitation”) and posture choices designed to reduce the chance of an unthinkably destructive war.

Even some measures that are stabilizing by intention may perversely have the opposite effect. Preparing to launch vulnerable missiles from silos upon receiving warning of an inbound attack should discourage an adversary from contemplating such an act, but also creates a risk that a false warning will trigger a nuclear war.

These observations may provide some sense of the complexities of ensuring crisis stability; each new technological development or modernization of the arsenal may bring about a change in the calculations of each side. To compound the difficulty further, the two sides’ calculations are not identical, something that is perhaps only to be expected in a situation fundamentally premised upon mutual mistrust. Crisis stability, as discussed here, is American in its origins and development, although the two sides have had ample chances to exchange views in negotiations. This process began with inconclusive talks in the late 1950s about how to address the problem of surprise attack, and returned with a vengeance after the 1962 Cuban Missile Crisis. Russian-American exchanges continue in one form or another to this day.

Three Post-Cold War Complications
Despite the various complications discussed above, the reciprocal fear of surprise attack is, at its core, a simple and elegant construct, involving two actors and one type of arms. Schelling’s 1958 paper famously opens with an analogy: a homeowner and a burglar have come upon one another, guns drawn. Both might prefer that the burglar simply withdraw, but they are caught in a downward spiral of mutual fears that may well lead to violence. Although this was always a simplification, in the minds of many analysts it distilled the essence of Soviet-American nuclear confrontation and the problem of crisis stability. For at least three reasons, the same simplicity no longer prevails.

First, new non-nuclear military technologies have become “entangled” with nuclear arms and with each other. These include ballistic missile defenses (BMD), counter-space weapons, and strategic conventional weapons.7 For example, American BMD architecture relies heavily on space-based sensors. Any adversary that is concerned that BMD may enable American nuclear threats or nuclear attack by negating its own ability to retaliate may be tempted in a crisis to use counter-space weapons to “blind” BMD. Furthermore, nuclear-armed states are developing dual-use missile systems — both nuclear and conventional — or dedicated intercontinental-range conventional weapons (so-called “conventional prompt global strike,” or CPGS systems) capable of precise strikes. The possession of these arms may create a temptation to try to disable an enemy’s counter-space weapons; indeed, this role has been one of the most plausible and oft-cited justifications for acquiring CPGS in the United States. Other roles may include a “defense-suppression” mission, targeting the enemy’s long-range radars, which enable BMD, air defense, and coastal defense.8 Notably, space-based sensors and long-range radars are also the mainstays of “early warning” against nuclear attack. Attacks on these systems could be interpreted as a prelude to a nuclear first strike. So, too, could attacks on multi-purpose command-and-control systems. Furthermore, attacks on dual-use missile systems could be interpreted as targeting a state’s nuclear-delivery capabilities. Conventional weapons might even be employed purposefully against nuclear targets.

Second, it is no longer useful to model the nuclear-weapons “environment” as a system composed of two nodes. As additional states with mistrustful relations have developed nuclear weapons and long-range delivery systems, several new “dyads” have emerged, beyond (1) America/Russia. These include, at a bare minimum, (2) America/China, (3) America/North Korea, (4) India/Pakistan, and (5) India/China. (I will omit Britain and France, while acknowledging that a Russian analyst probably would not do so.)9 Fortunately, it is still reasonable to model these five dyads as parallel systems, and not as fully enmeshed with each other. Unfortunately, two nodes (America, India) are linked to more than one other node. These corresponding nodes (Russia, China, and North Korea, in the American case; Pakistan and China in the Indian case) are dissimilar from each other, so whatever decisions America or India make on strategic posture may simultaneously have a variety of effects on the calculations of potential adversaries.

Third, even if adversarial dyads may be modeled as discrete links between pairs of nodes, three-sided interactions are already a fact of life. These interactions may involve three nuclear-armed states, or two nuclear-armed states and a third state with its own “entangling” weapons systems.

An example of the first type of triangular interaction is America/North Korea/Russia. The United States has deployed the Ground-based Midcourse Defense (GMD) system against the threat of North Korean intercontinental ballistic missiles (ICBMs). GMD’s kinetic-kill interceptors are based primarily in Alaska, with a handful in California. Any ICBM flying out from North Korea toward the continental United States would pass over the Russian Far East. To attempt an intercept of a single North Korean ICBM, four or five of the Ground-Based Interceptors (GBIs) would have to fly out in the direction of Russian airspace. Depending on the details, the intercept engagement could take place over Russia; regardless, the excess interceptors would either overfly Russia or reenter the atmosphere inside Russian airspace. Russia’s early warning radars presumably would detect the inbound interceptors. While the nature of the event would hopefully be clear to the Russian military, there is already ample precedent for false warnings.

An example of the second type of triangular interaction involves the United States, North Korea, and South Korea. North Korea has threatened to use its nuclear missiles against the ports and airfields in South Korea that would receive American reinforcements in the event of war. American defense officials have spoken of developing “left of launch” capabilities to deal with threats of this type, presumably meaning “conventional counterforce” options, among other things. South Korea does not have its own nuclear weapons, but it has developed an arsenal of conventionally armed ballistic and cruise missiles capable of striking anywhere in North Korea. South Korea aspires to develop and deploy what it calls a “Kill Chain” system: a network of sensors that will permit its missile forces to attack North Korea’s nuclear missiles before they can launch against South Korea. In short, North Korea has a strategy of using nuclear weapons first, and both the United States and South Korea are separately pursuing capabilities to preemptively attack North Korea’s weapons. Arguably, South Korea’s conventional first-strike option will mainly have the role of pushing the United States toward conducting its own first strike against North Korea, in the hopes that it would be more effective than South Korea’s.

To make matters still more complex, calls have been heard in Japan to develop conventional counterforce options there as well. This raises the prospect of four-way interactions involving two nuclear-armed parties and two non-nuclear-armed parties.

In summary, over the span of the last two decades or so, we have moved from a frightening but relatively manageable picture involving two states and their nuclear arsenals to a very hairy situation involving a variety of interacting weapons systems distributed across at least five dyads, with hydra-headed interactions starting to emerge.

Potential Responses
Short of cutting the Gordian Knot and implementing global nuclear disarmament, which has eluded humanity for the last seven decades, what is to be done? Here, we might think back to Cold War efforts at shoring up crisis stability, both unilateral and cooperative.

Unilateral measures might include voluntarily forgoing classes of new weapons, if they are judged too destabilizing to warrant deployment. This suggestion has already been made concerning CPGS, for example. Another possibility is to diversify military sensors and communication networks, to minimize nuclear-conventional overlap, to make these systems highly redundant, or to do both. This is probably an expensive proposition, but it may be worthwhile.

Cooperative measures may be more difficult to achieve in a multilateral setting than in a bilateral one, but they are still possible, as the success of several multilateral arms control and nonproliferation regimes demonstrates. Multilateral treaties to forgo certain classes of weapons, or even multilateral codes of conduct to forgo certain behaviors may have great value. The importance of orbital sensors and communications platforms makes outer space a natural place to focus efforts.

These are initial ideas only, but they point to a research agenda: how do we model crisis stability as it has evolved, and continues to evolve? Are there key nodes in this system? Are they amenable to policy responses? As daunting as these problems may seem, this type of work has been our collective answer to the problems of survival since the start of the nuclear age, and it remains, in all likelihood, our best response.

Joshua H. Pollack

Endnotes

  1. Department of Defense, Nuclear Posture Review Report, April 2010, pp. v, 5-6, 28-29.
  2. Reductions in these two risks might be credited to the negotiation of the Joint Comprehensive Plan of Action (JCPOA) with Iran and the Nuclear Security Summit process, respectively. On the diminishment of the proliferation threat, see: Leonard S. Spector, “A Proliferation Plateau May Offer Unique Opportunities,” Arms Control Today, April 2016.
  3. James M. Acton, “Reclaiming Strategic Stability,” in Elbridge A. Colby and Michael S. Gerson, eds., Strategic Stability: Contending Interpretations (Carlisle, PA: U.S. Army War College Press, 2013), pp. 118-146.
  4. This theme is examined in depth in Robert Ayson, Thomas Schelling and the Nuclear Age: Strategy and Social Science (New York: Frank Cass, 2004).
  5. T.C. Schelling, “The Reciprocal Fear of Surprise Attack,” RAND paper P-1342, April 16, 1958, revised May 28, 1958, p. 1.
  6. On this theme, see, notably: Kerry M. Kartchner, Negotiating START: The Quest for Stability and the Making of the Strategic Arms Reduction Treaty (New Brunswick, NJ: Transaction Publishers, 1991).
  7. Joshua Pollack, “Emerging strategic dilemmas in U.S.-Chinese relations,” Bulletin of the Atomic Scientists, vol. 65, no. 4, July-August 2009, pp. 53-63.
  8. James M. Acton, Silver Bullet? Asking the Right Questions about Conventional Prompt Global Strike (Washington, DC: Carnegie Endowment for International Peace, 2013), pp. 17-21.
  9. China tested its first nuclear device in 1964, but did not deploy ICBMs optimized for targeting the United States until the mid-1990s. India and Pakistan tested nuclear weapons in 1998. North Korea tested its first nuclear device in 2006; its intercontinental delivery capabilities are still rudimentary, but the testing of more sophisticated ICBMs is widely anticipated.

Advancing Minorities and Women to the PhD in Physics and Astronomy

Keivan G. Stassun

Vanderbilt University and Fisk University, Nashville, Tennessee

Introduction
The under-representation of minorities in the space sciences is an order-of-magnitude problem, and is one of the major challenges facing the nation’s science, technology, engineering, and mathematics (STEM) workforce as a whole.1 Minority-serving institutions are important producers of domestic minority talent in the sciences. Roughly one-third of all STEM baccalaureate degrees earned by African-Americans are earned at Historically Black Colleges and Universities (HBCUs), and the top 15 producers of Black baccalaureates in physics are all HBCUs. Just 20 HBCUs were responsible for producing fully 55% of all Black physics baccalaureates in the U.S. for 1998 to 2007.2 Institutional partnerships with HBCUs are thus a promising avenue for broadening participation in the physical sciences.3 At the same time, recent research on the educational pathways of minority students in STEM disciplines indicates that these students are roughly twice as likely as their non-minority counterparts to seek a master’s degree en route to the doctorate.4 These facts motivate programmatic approaches aimed at deliberately preparing underrepresented minority students for success as they traverse the critical Masters-to-PhD transition.

Here we describe a program developed in partnership between Vanderbilt University, a PhD-granting R-1 university, and Fisk University, a research-active HBCU, both in Nashville, Tennessee. The Fisk-Vanderbilt Masters-to-PhD Bridge Program is for students who seek additional coursework or research experience before beginning PhD-level work. Students are not evaluated on the basis of GRE scores but rather on alternative metrics that are predictive of long-term success. The program provides a continuous path — a bridge — to the PhD that we have found is particularly effective for students whose baccalaureate degrees are from small, minority-serving institutions, and who may for a variety of reasons seek a master’s degree en route to the PhD. The program is flexible and tailored to the goals of each student. Courses are selected to address any gaps in undergraduate preparation, and research experiences are designed to pave the way for PhD-level work in the chosen area of study. While at Fisk, students enjoy regular interaction with Vanderbilt faculty, including access to Vanderbilt courses and, of critical importance, thesis research performed under the joint supervision of Vanderbilt and Fisk faculty. In all cases, we deliberately develop research-based mentoring relationships between students and faculty that will foster a successful transition to the PhD.

The Importance of Masters-to-PhD Transitions for Underrepresented Minorities
In the decade between 1990 and 2000, the total number of master’s degree recipients increased by 42%. During this same time period, the number of women earning master’s degrees increased by 56%, African Americans increased by 132%, American Indians by 101%, and Hispanics by 146%. A recent study5 provides critical new insight into the role of the master’s degree as underrepresented minority students proceed to the doctorate in STEM disciplines. Data from the NSED was used to examine institutional pathways to the doctorate, and transitions from master’s to doctoral programs by race and gender, for a sample of more than 80,000 PhDs.

As shown in Figure 1, the study identified six primary pathways to the PhD. Statistical analysis reveals that pathways are significantly different for underrepresented minorities (χ² = 49.1, df = 18, p < 0.001). The two major differences are that White/Asian students are more likely to forgo earning the master’s degree altogether (“No MS, BA≠PhD” in Figure 1), and underrepresented minority students are much more likely to earn all three degrees at three different institutions (BS≠MS≠PhD). Underrepresented minority students are thus more likely to use the master’s degree as a stepping-stone toward success at the PhD level. Unfortunately, very often the transition from master’s degree to PhD is one that students must navigate on their own.
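As a quick check of the reported significance level, a minimal sketch (assuming only the quoted values, χ² = 49.1 with 18 degrees of freedom) recovers the p-value from the chi-square survival function:

# Recover the p-value implied by the quoted test statistic.
from scipy.stats import chi2

chi_sq, dof = 49.1, 18
p_value = chi2.sf(chi_sq, dof)   # P(X >= 49.1) for X ~ chi-square with 18 d.o.f.
print(f"p = {p_value:.1e}")      # well below 0.001, consistent with the text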

The Fisk-Vanderbilt Masters-to-PhD Program
Admission begins with application to the Fisk MA program in physics, which includes undergraduate transcripts, letters of recommendation, and a personal statement. The applicant indicates on the application that they wish to be considered for the Bridge program and submits an additional Bridge program information form.

Officially speaking, admission to the Bridge program does not constitute pre-admission to the Vanderbilt PhD program, nor does it carry with it a formal guarantee of admission to Vanderbilt in the future. We did not want to create the appearance of a “back door” into the PhD program, and we did not want to encourage passivity in the students admitted or in the faculty mentors responsible for preparing them. But this does not mean that the program makes no promises. On the contrary, Bridge students are guaranteed support and mentorship in a number of concrete forms, described below, and receive an explicit commitment that they will get the personalized attention, guidance, and one-on-one mentoring relationships that will allow them to develop — and to demonstrate — their full scientific talent and potential. This philosophy is more than a platitude; the program has been formulated with oversight by the appropriate Deans at both universities, who hold the program’s directors accountable for its success.

Identifying and Evaluating Students with the “Right Stuff”
The continued use of standardized tests — in particular the GRE — as a filter for determining who gets into graduate school is a major factor in the ongoing, massive underrepresentation of minorities and women in STEM PhD programs.

As shown in Figure 2, GRE scores are not blind to the demographics of test-takers. Indeed, the correlations of GRE score with gender and ethnicity are among the strongest correlations in the exam (along with socio-economic status). Consequently, adopting a cutoff GRE score (a score of 700 on the quantitative portion is typical in STEM PhD programs) leads to only ~30% of all women in the physical sciences, and only ~5% of all African Americans in the physical sciences, “making the cut” for PhD admissions.
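To see how pass rates like these follow from the score distributions summarized in Figure 2, the sketch below estimates the fraction of a group clearing a cutoff from its median and quartiles, assuming an approximately normal distribution. The numbers used are placeholders for illustration, not the actual ETS data.

# Hypothetical illustration of how a cutoff translates into a pass rate,
# given a group's median and quartile scores (placeholder values, not ETS data).
from scipy.stats import norm

def fraction_above_cutoff(median, q25, q75, cutoff):
    """Estimate the fraction scoring at or above `cutoff`, treating the score
    distribution as normal with the given median and interquartile range."""
    sigma = (q75 - q25) / 1.349   # the IQR of a normal distribution is ~1.349 sigma
    return norm.sf(cutoff, loc=median, scale=sigma)

# Example: a group with median 650 and quartiles 570/720 would have roughly
# a third of its test-takers clearing a 700 cutoff.
print(fraction_above_cutoff(median=650, q25=570, q75=720, cutoff=700))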

An interview protocol with a scoring rubric designed to measure the seven facets of “grit,” which Angela Duckworth and others have demonstrated to be strong — and unbiased — predictors of student potential, is a much more robust and fair approach to identifying which students actually have “the right stuff” to succeed to the PhD and beyond.

Facilitating a Successful Transition to the PhD
The vehicle by which successful transitions to the Vanderbilt PhD program are realized is through carefully orchestrated student-faculty mentoring relationships focused on research. We have found that the extent to which a student is successful in developing one-on-one research-based relationships with faculty mentors — mentors who may very well become the student’s PhD advisor — is the single most reliable predictor of the student’s eventual admission into the Vanderbilt PhD program. Faculty mentors not only provide key guidance on course selection and research topics, they also become the student’s most important advocates in the PhD admissions process. The fact is that a student who is well known to the faculty of the admitting department is more likely to have their potential for success evaluated on the basis of direct faculty interaction, and not simply on how the student appears “on paper.”

It is thus the explicit goal of the Bridge program that its students will be well known by the Vanderbilt faculty by the time that they are ready to apply to the Vanderbilt PhD program of their choice. Indeed, fostering individual research-based mentoring relationships between Fisk students and Vanderbilt faculty is at the very heart of the Bridge program, and is the guiding principle for all other programmatic design considerations. To that end, the Bridge program includes the following key elements, requirements, and benefits:

  • Participation in supervised research, at Fisk or Vanderbilt (or both), during at least the second academic year of the program, and participation in supervised research at Vanderbilt (or at an affiliated research site) during at least each summer of the program. Students are required to produce a publication-quality master’s thesis.
  • Assignment of both a Fisk advisor and a Vanderbilt advisor. Joint mentoring allows tracking of student progress and helps to ensure student readiness for PhD-level work.
  • Scheduling of at least two meetings per year with the Bridge program steering committee to review progress and receive guidance, in addition to the day-to-day interactions with primary faculty advisers. This helps keep key personnel abreast of student progress, helps to keep each Bridge student on the PhD program’s “radar screen”, and helps PhD program directors in planning the needs of each year’s incoming PhD class.
  • Requirement of at least B grades in all graduate courses, with at least one of these courses being a core PhD course taken at Vanderbilt. This allows the student to demonstrate competency in a core PhD course, which is essential to demonstrating promise for PhD study. Typically, Bridge students take several core PhD courses at Vanderbilt. Together with a judicious selection of courses taken in fulfillment of the MA degree at Fisk, many Bridge students complete most of the course requirements for the PhD by the time they apply to the Vanderbilt doctoral program.

Underlying Programmatic “Theory”
Recognizing and nurturing unrealized potential in students: In formulating our admissions strategy, we have shifted from the usual mindset of filtering applicants on the basis of proven ability to one of identifying applicants with unrealized potential that can be honed and nurtured. Recognizing that potential takes a number of forms, and often plays out differently for each student. One student’s undergraduate transcript might show a low GPA that, on closer inspection, is the result of a slow start but a clear upward trajectory. Another may have an excellent GPA but be missing upper-level courses in the major because they were simply not available at the undergraduate institution. Still another may simply have made a strong positive impression on a faculty recruiter during a poster presentation at a national conference. At the same time, we have formed strong, positive relationships with colleagues at numerous minority-serving institutions. As we get to know these undergraduate programs better, we are able to make more informed evaluations about specific strengths and weaknesses of incoming students. A report studying strategies for building effective partnerships with minority-serving institutions3 found that undergraduate mentors at these institutions take a very active role in advising their students, and will actively steer their students away from graduate programs that they do not trust to nurture their students’ success.

Tracking the second derivative of student performance: We constantly monitor student performance and intervene as soon as we detect an inflection in trajectory. For example, we track the courses that Bridge students enroll in as part of the advising process, and then actively monitor their progress by asking their instructors to promptly notify us at the first signs of concern. One-on-one tutoring is provided, as needed, by advanced graduate students or postdocs, and course-load adjustments are made mid-stream if it is determined that remedial instruction is required before re-enrolling in the course. These mid-stream adjustments typically involve the student taking an incomplete in the course, to be completed in a subsequent semester, and instead either first taking a lower level course or participating in a directed study course custom-designed to fill preparation gaps to ensure eventual success in the required graduate course. At all times, full time enrollment status is maintained to ensure satisfactory progress and eligibility for financial support.

Outcomes
Since its inception in 2004, the Fisk-Vanderbilt Masters-to-PhD Bridge program has attracted nearly 120 students, 85% of them underrepresented minorities and 45% women. Of these, 82% have either already transitioned to the Vanderbilt PhD program or to another PhD program of their choice, or are making satisfactory progress toward that goal. In addition, our students have been awarded the nation’s top graduate research fellowships from NSF (GRF and IGERT) and NASA.

The program’s key design considerations can be summarized as follows:

  • Focus on retention. Direct programmatic efforts toward fostering one-on-one mentoring relationships between students and potential PhD advisers, through enrollment in core PhD courses and through research assistantships in PhD faculty labs. When faculty know a student personally, and can vouch for their performance in coursework and in the laboratory, they can effectively and persuasively advocate for the student based on a holistic evaluation of the student’s ability.
  • Focus on recruitment, not competition. Direct recruitment efforts on truly broadening participation by emphasizing potential instead of already proven ability. Be willing to take risks in admissions, and then erect scaffolds of support to ensure success. Competing with other selective institutions for the few highly sought applicants who stand out in traditional metrics does little to address the needs of the national STEM workforce.
  • Involve key decision-makers in programmatic design and oversight. Faculty who lead graduate admissions must be active stakeholders in the process of matriculating, supporting, and monitoring students. Deans who oversee academic units must commit to work with — and place accountability on — programs that fail to retain students.
  • Stop using the GRE as a filter. Instead, use metrics (such as “grit”) that have been shown to be less biased against minorities and women, and that have been shown to be far more predictive of the types of qualities we (should) actually care about in our graduate students — the promise and potential to succeed to the PhD and beyond.

Keivan Stassun

References

  1. National Science Board, 2003, “Report of the National Science Board Committee on Education and Human Resources Task Force on National Workforce Policies for Science and Engineering,” NSB 03-69
  2. Norman, D. 2009, “Underrepresented Minorities in Astronomy: Higher Education.” A Position Paper submitted to the Astro2010 NAS Decadal Survey in Astronomy and Astrophysics:
  3. https://arxiv.org/ftp/arxiv/papers/0903/0903.4506.pdf
  4. Stassun, K.G. 2003, “Enhancing Diversity in Astronomy: Minority-Serving Institutions and REU Programs: Strategies and Recommended Actions,” Bulletin of the American Astronomical Society, 35, 5, 1448
  5. Lange, S. 2006, “The Role of Masters Degree Transitions on PhD Attainment in STEM Disciplines for Students of Color,” PhD Thesis, University of Washington
  6. Syverson, P. (2003, March). Data Sources. Graduate School Communicator, XXXVI, 5
  7. Miller, C. “Admissions Criteria and Diversity in Graduate School.” American Physical Society. APS News, February 2013, 22.
  8. Miller, C., & Stassun, K.G. 2014, “A Test that Fails”, Nature, 510, 303
  9. Stassun, K.G., Sturm, S., Holley-Bockelmann, K., Burger, A., Ernst, D., & Webb, D. 2011, “The Fisk-Vanderbilt Master’s-to-Ph.D. Bridge Program: Recognizing, Enlisting, and Cultivating Unrealized or Unrecognized Potential in Underrepresented Minority Students”, American Journal of Physics, 79, 374

Showing Students That Science Is Everywhere and for Everyone

Brian Carter, Program Officer, Overdeck Family Foundation

Students spend over 80% of their waking hours outside the classroom; so, why do we expect all of a student’s learning to be confined to the less than 20% of their time spent in school? Think back to your own learning experiences as a child. What inspired you to pursue your current career? If you are like 75% of Nobel Prize winners in the sciences, it was likely something you experienced or did outside of school.

We know that not all students get to have high-quality learning experiences outside of school. In fact, there is an estimated 6,000-hour gap by the time students reach 6th grade between out-of-school learning experiences for low-income students and their middle class peers. The lack of these experiences prevents students from finding and exploring their interests.

It’s these authentic, student-directed out-of-school learning experiences that can have a profound effect on sparking and sustaining a student’s interest in pursuing a career in science, technology, engineering, and math (STEM). A study of students who participated in FIRST Robotics competitions found they were twice as likely to expect to pursue a career in science and technology and more than three times as likely to major in engineering. More than 70% of students who participated in afterschool STEM report positive gains in areas of science interest, science identity, science career interest, and 21st century skills, like critical thinking and perseverance.

For these reasons, Overdeck Family Foundation and the Simons Foundation partnered with DonorsChoose.org to launch the Science Everywhere Innovation Challenge in January 2017. DonorsChoose.org is a site where teachers request the materials and experiences they need for their classrooms and donors give to the projects that inspire them. Through Science Everywhere, the two foundations matched donations to projects that provided hands-on, engaging math and science activities for students to do outside the classroom, with the goal of showing students and teachers that math and science don’t stop when the school bell rings.

Since its launch, over 900 projects have been funded, reaching over 100,000 students. Among the projects supported was Mr. Shafer’s mock crime scene and forensic investigation, which he set up for his students at Skiles Test STEM Elementary School in Indianapolis. At Souderton Charter School Collaborative in Souderton, PA, teacher Jeannine Dunn used this opportunity to launch a Last Chance Repair Club. Thus far, these middle school students have been able to experience the scientific method first hand by working to diagnose and fix what is wrong with broken clocks, calculators, and even a CD player brought in by their fellow classmates.

Overdeck Family Foundation, the Simons Foundation, and DonorsChoose.org are very interested to understand the impact these projects have had on student learning. Thus, each teacher who received funding through Science Everywhere has been invited to participate in an evaluation study being conducted by Prof. Robert Tai at the University of Virginia. It was Prof. Tai’s groundbreaking 2006 Science paper that found that students who expressed interest in science-related careers by eighth grade were 2-3 times more likely to earn college degrees in STEM disciplines, showing that many students make decisions about their futures before high school and stick to them. In fact, the results showed that even STEM-interested students with weaker standardized math test scores were more likely than their non-STEM-interested peers with top math test scores to actually earn STEM degrees. Prof. Tai has recently developed and validated a new method to assess the impact out-of-school STEM activities have on student engagement in learning.

This evaluation examines types of commonly used learning activities. After an extensive examination of learning activities used in curricula at both national and local levels, Prof. Tai and his colleagues found these seven common types of learning activities: 1) collaborating, 2) competing, 3) discovering, 4) creating/making, 5) performing, 6) caretaking, and 7) teaching. A survey instrument was designed by Prof. Tai and his colleagues to gather data on students’ preferences for these seven types of learning activities. The survey is administered twice to the participating students and aims to capture their learning activity preferences before and after program participation. Prof. Tai is using his new methodology to understand the impact the Science Everywhere projects have had on different dimensions of students’ learning engagement.

Over the summer, a panel of 12 judges, comprising six national leaders in math and science and six exceptional teachers, will select five of the Science Everywhere projects, and the teachers who authored these projects will each receive a $5,000 prize. Projects will be evaluated based on emphasis of math and science core concepts, promotion of creativity and hands-on activities outside of school, ease of replication, and demonstration of student learning, as measured by Prof. Tai’s evaluation.

NFL wide receiver Victor Cruz is one of the 12 judges. He founded the Victor Cruz Foundation, which aims to increase the number of underrepresented kids interested in STEM-related career fields while simultaneously promoting positive change in the lives of today’s youth through innovative educational programs. Victor agreed to be a judge because he believes “math and science learning shouldn’t stop at the classroom door, and these projects will show kids that there’s so much more to explore.” Former NASA astronaut Leland Melvin agreed to judge in order to honor the legacy of his parents, both of whom were educators who inspired him and so many others to reach for the stars. Thus, it was fitting that one of the projects supported through Science Everywhere allowed 1st-grade teacher Josefina Rivera, who teaches at Maria Saucedo Scholastic Academy in Chicago, to obtain sky observation kits, which allowed her students to explore the night sky with their parents.

While the Science Everywhere Innovation Challenge has ended, DonorsChoose.org has many more hands-on, engaging math and science projects that need your support, over 150 of which involve physics. Go online today and support one that interests you. You can also take inspiration from the more than 900 projects that were submitted and funded through this challenge, and offer to volunteer at a local school to ensure math and science don’t stop just because the school day does.

About Overdeck Family Foundation
Demonstrating a passion and commitment to the future of American education, John and Laura Overdeck established the Overdeck Family Foundation in 2011. The foundation seeks to help all kids achieve their greatest potential by funding compelling, innovative programs and projects that have proven, quantifiable results.

About the Simons Foundation
Established in 1994, the Simons Foundation is a private foundation dedicated to advancing the frontiers of research in mathematics and the basic sciences. An initiative of the Simons Foundation, Science Sandbox supports and collaborates with programs that unlock scientific thinking in everyone, and advance the message that you don’t have to be a scientist to think like one. Science Sandbox is dedicated to inspiring a deeper interest in science among all people, especially those who don’t think of themselves as science enthusiasts.

About DonorsChoose.org
Founded in 2000 by a Bronx history teacher, DonorsChoose.org has raised $500 million for America's classrooms. Teachers come to DonorsChoose.org to request the materials and experiences they need most for their classrooms, and donors give to the projects that inspire them. To date, nearly 2.5 million people and partners have funded projects on the site, reaching 21.6 million students and making DonorsChoose.org the leading platform for supporting U.S. public schools.

Doctoral Pathways: URM and White/Asian
Figure 1: Comparisons between underrepresented minorities (URMs) and White/Asian students, based on different permutations of the educational pathway to the PhD. An equal sign indicates degrees earned from the same institution. The fourth and sixth comparisons from the left show the “traditional” paths to the PhD, in which the student earns the bachelor’s degree from institution A, and either receives both the master’s degree and the PhD from institution B or else forgoes the master’s degree entirely. The fifth comparison from the left shows the case of earning the bachelor’s degree at institution A, a “terminal” master’s degree at institution B, and the PhD from institution C. Minorities are much more likely to take this latter path than non-minorities. Based on analysis of 80,739 PhDs earned in science and engineering fields, 1998 to 2002. Adapted from Reference 3. Copyright 2009, K. Stassun.
GRE Quantitative Score, US Citizens
Figure 2: GRE Quantitative score distributions from 2006-2007 for US citizens whose intended graduate majors were in STEM (this is the most recent publicly available data). The tick is the median, and the top and bottom of each marker represent the 75th and 25th percentiles within each group; labels indicate the total number of test takers. The left axis is labeled with the old GRE scale and percentile; the right axis shows the corresponding scaled scores for the new exam. The blue horizontal line represents a typical “minimum acceptable” GRE score for admission to physics PhD programs. Adapted from Reference 7. Copyright 2013 APS.

Three Not Such Well-known Aspects of Solar Power and Climate Change

Wallace Manheimer
Retired from NRL

This work was not sponsored by any agency, public or private.

1. Introduction
The climate situation grabs more and more media attention these days. As this is written, there is a climate march occurring. This paper examines three not-so-well-known facts. First, one cannot turn on one’s TV these days without seeing claims that solar power (i.e., solar photovoltaic, solar thermal, wind, and biofuel) is now cheaper than fossil fuel and is rapidly overtaking it as a power source for our civilization. This definitely is not true. Solar power is nowhere near a point where it makes an important contribution to the world’s power budget, and at this point at least, it is reasonable to surmise that it never will be. Second, one often reads that 97% of climate scientists agree that excess CO2 in the atmosphere is causing destructive climate change and that ‘the science is settled’. Neither is true. Third, the excess CO2 in the atmosphere has not done any significant environmental harm up to now, and extrapolating present data, it is unlikely to do so any time soon.

2. The Role of Solar Power
There are two important issues in the climate debate. The first is whether we need fossil fuel or can get along without it; i.e. can solar power move in and play the role any time soon? The second is whether the use of fossil fuel is causing or will cause a major environmental problem. A large part of the debate focuses only on the second issue and ignores the first.

Fossil fuel still produces about 85% of the world’s power. There is no denying this. Nevertheless, below is the figure from Richard C. J. Somerville and Susan Joy Hassol’s article (Physics Today, October 2011), in which they give various scenarios for ending the use of fossil fuel:

Clearly Somerville and Hassol insist that the use of fossil fuel must be greatly reduced within 20 years and must end soon thereafter. Other organizations, such as the Sierra Club and 350.org, as well as Al Gore, insist on ending the use of fossil fuel even sooner.

How will we get the power we need? This is not a minor detail. Modern civilization depends critically on fossil fuel to power it. Without an abundant, inexpensive energy source, modern civilization simply vanishes. But they cannot be concerned with such trivia. They are too busy saving the planet; powering it without fossil fuel is someone else’s problem, it is not their department! It reminds one of the rhyme from the old Tom Lehrer song about Wernher von Braun:

Once the rockets are up, who cares where they come down? That’s not my department, says Wernher von Braun!

To reiterate, doing what Somerville and Hassol, Al Gore, the Sierra Club, and others insist upon would end civilization unless another power supply, available at about the same quantity and price, can replace fossil fuel. But as we will see shortly, solar power is nowhere near ready to do this.

We have to use fossil fuel responsibly, as cleanly and conservatively as possible, but use it we must. In a world with 7 or more billion people, it is directly or indirectly responsible for our prosperity, health, modern high-tech medicine, longevity, education, transportation, the possibility of large cities, large-scale international trade, a clean environment, and more. Those like 350.org and other like-minded groups and individuals wrongly think that civilization can thrive without it. But the truth is that without it, it is back to abject poverty for all but the privileged few, as has been humanity’s fate for almost its entire existence, i.e., the era when fossil fuel was not used.

Hence almost the entire moral argument is on the side of using fossil fuel, especially for the developing world. For instance, in the July 2016 issue of Physics Today, Michael Marder, Tadeusz Patzek, and Scott W. Tinker, in their article "Physics, fracking, fuel, and the future," presented a graph demonstrating the unbreakable link between fossil fuel use and prosperity; it is shown in Figure 2, along with its caption.

Roughly a billion people in the US, Europe, Russia, and Japan each use about 6 kW (i.e., about 6 terawatts in total), leaving about 1 kW for each of the other 6 billion people on the planet. Notice that, according to the chart, the average Chinese uses about 25% of the power of the average American; in 2000 this figure was about 10%. In 2009 I was at a scientific meeting where a high-ranking member of the Chinese Academy of Sciences remarked on this and said that they would not rest until their per capita power use was about the same as ours. They know that there is an unbreakable link between power and prosperity.
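To make the scale of these numbers concrete, here is a back-of-the-envelope check (my own arithmetic, using the figures quoted above and the power-law fit reproduced in the caption of Figure 2):

\[
10^{9}\ \text{people} \times 6\ \text{kW/person} = 6\times 10^{12}\ \text{W} = 6\ \text{TW},
\]

and, taking the US average of about 8 kW per person quoted in that caption,

\[
G \approx \$10{,}500 \times 8^{0.64} \approx \$10{,}500 \times 3.8 \approx \$40{,}000\ \text{per person per year},
\]

which is the right order of magnitude for US GDP per capita; individual countries of course scatter about the fit.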

What is important is that fossil fuel cannot and will not be eliminated until another power source becomes available in about the same quantity and at about the same price. The Chinese, Indians, Brazilians, Mexicans, Indonesians, Nigerians, and others understand this unbreakable link between fossil fuel and prosperity (no, not just prosperity, but human civilization), even if we do not. They are sick of poverty, and who are we to blame them? Who are we to condemn them for escaping poverty the only way anyone knows how, namely by using fossil fuels?

To illustrate how unlikely it is that renewable solar power can play any important role in the world energy budget anytime soon, and the fact that the less developed world will not heed our advice to move away from fossil fuel, consider Figures 3 and 4, taken from the BP Statistical Review of World Energy 2013. Clearly, renewables have a long, long way to go before they can supplant fossil fuel. Also, it is the less developed parts of the world that are increasing their use of fossil fuel; the use of fossil fuel by the more developed parts of the world has leveled off.

A major effort has been made to support renewable solar power; it has been heavily subsidized for at least a quarter of a century. American federal support for climate change research over the past 20 years is shown in Figure 5; it was ~$12B in 2014. The average over this period was ~$7B per year, meaning that ~$140B has been spent on climate change research over the past 20 years. For this we got the amount of solar power affecting the world economy shown in Figure 3.
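Spelled out, the total is simply

\[
20\ \text{years} \times \$7\text{B/year} \approx \$140\text{B}.
\]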

Some have argued that fossil fuel receives larger federal subsidies than renewables. While every industry, including fossil fuel, gets a variety of tax breaks (e.g., business expenses, depreciation, etc.), fossil fuel receives far less in direct subsidies than solar power. The U.S. Energy Information Administration publishes data on federal support for various energy options; their chart is shown in Figure 6. Renewable power is subsidized at about $15B (a bit more than Figure 5 indicates), and fossil fuel at about $3B. However, since renewable power produces only ~1% of the world's power, it receives about 500 times as much subsidy per unit of energy produced. Furthermore, fossil fuel pays taxes, as anyone driving up to a gas station to fill his or her tank knows.
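The arithmetic behind the "about 500 times" estimate, using the approximate figures quoted above (renewables at ~1% of the world's power and ~$15B in subsidies, fossil fuel at ~85% and ~$3B), is

\[
\frac{\$15\text{B}/1\%}{\$3\text{B}/85\%} = \frac{15}{3}\times 85 \approx 425,
\]

i.e., of order 500; the exact value depends on which production and subsidy figures one uses.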


An enormous effort has been made to bring solar power up to the point where it can contribute to the world economy; clearly, it has so far failed.

3. The ‘97%’ scientific consensus
Another question is whether 97% of climate scientists really believe that fossil fuel is causing, or will cause, great environmental damage. One interesting piece of data is the Oregon Petition, an effort led by the late Frederick Seitz, a former president of the National Academy of Sciences. It is a petition disputing the effect of humans on global warming, and it garnered 32,000 signatures from a large cross section of the scientific community (http://www.petitionproject.org). To justify the 97% figure, would there not have to be roughly a million scientists signing an opposing petition? (If those 32,000 represented the dissenting 3%, the agreeing 97% would have to number about a million.)

A recent op-ed by Steven Koonin in the Wall Street Journal (April 17, 2017) sheds further light. In it he proposed that "red" and "blue" teams of scientists separately evaluate the issue. He goes on to state:

“The public is largely unaware of the intense debates within climate science. At a recent national laboratory meeting, I observed more than 100 active government and university researchers challenge one another as they strove to separate human impacts from the climate’s natural variability. At issue were not nuances but fundamental aspects of our understanding, such as the apparent — and unexpected — slowing of global sea level rise over the past two decades.”

So much for the ‘97%’ and “the science is settled”.

4. CO2 has had little or no effect on the environment so far.
The next issue is whether CO2 is now doing harm to the environment. Before considering this, let us note a couple of simple, obvious facts about CO2. It is not a pollutant but a vital nutrient for plants. Over many of the hundreds of millions of years during which plant and animal life evolved, CO2 levels were much higher than today; while humans did not exist during this time, our distant ancestors did fine. Every carbon atom in our bodies, and in the food we eat, had its origin in the carbon in plants and in decayed organic matter in the soil, which in turn had its origin in the CO2 in the atmosphere and the oceans. There is even evidence that the added CO2 in the atmosphere since the dawn of the industrial age has aided agriculture and helped green the planet (http://co2coalition.org). Without atmospheric CO2, life on earth would not be possible.

But is the amount added since the start of the industrial age doing any harm? This author has examined the assertions of a variety of important people: President Obama, Hillary Clinton, Marcia McNutt (the editor of Science magazine), Al Gore, and others. All have made specific claims of imminent doom (e.g., more frequent and intense storms, loss of agricultural productivity, rising sea levels) if we continue to burn fossil fuel at current or projected rates. But how can the average person, or even the average scientist, independently check these claims? Is the only choice to read thousands of articles in dusty, obscure journals to which the average person has no access?

Fortunately, there is another way. A great deal of data is available online with a simple Google or Google Images search. This is something anyone can do, anywhere, anytime. There is no need for an expert to interpret the data; it speaks for itself. There are, of course, pitfalls to a Google-type search of which one must be cognizant; after all, one could undoubtedly "find" a miracle cure for cancer with a Google search. But generally this is no big deal; it is easy to avoid that sort of trap, and I am quite certain there are no significant errors in the data I presented.

This work is summarized in:

Wallace Manheimer, "Original Sin, Prophets, Witches, Communists, Preschool Sex Abuse and Climate Change," International Journal of Advanced Research, June 2016, page 280, available open access at http://www.journalijar.com/article/9945/original-sin,-prophets,-witches,-communists,-preschool-sex-abuse-and-climate-change/

In a nutshell, the conclusion of this data search, which anyone can easily check, is that claims of impending doom are, for the most part, either wildly exaggerated or else continuations of processes that have been occurring nearly unchanged for centuries. Not a single one of the specific assertions of gloom and doom mentioned above stands up to serious scrutiny. To this author's mind, it is amazing that the mainstream media does not make such a careful check of the data it is spoon-fed by alarmists. Any competent science reporter at any major media outlet could easily do what I did and would almost certainly come to the same conclusion. My guess is that the media's inability and unwillingness to do such a check will ultimately harm its reputation for decades to come.

Very briefly, the article shows that NOAA's ground-based temperature measurements have recently been changed so as to show recent warming, in contrast to their earlier measurements, which showed a nearly 20-year hiatus in warming. I believe NOAA has seriously damaged its credibility by publishing such changes, changes that please its political bosses, and then declining to publicize its new methodology. My article also shows that NASA's space-based temperature measurements give rather different results: less warming, and a temperature that oscillates in time with a variety of frequencies. The article shows that there has been no increase in hurricanes, tornados, wildfires, droughts, or loss of agricultural productivity. Furthermore, it shows that the retreat of glaciers is a 200-year phenomenon, showing no acceleration as atmospheric CO2 increases. It also points out that NASA's most recent satellite measurements show the ice sheets in Antarctica thickening, not melting (that is, the thickening of the ice in some places minus the melting in others is positive). The article points out the German experience, which indicates that, at least up to now, solar power is considerably more expensive than the alternatives, and that using it does not necessarily reduce CO2 input to the atmosphere. Regarding numerical simulations of climate, it shows that they have greatly overestimated the warming. As I mentioned, these are conclusions anyone can check out anywhere, anytime; the data speaks for itself and needs no expert to interpret it. I believe that is the main strength of my paper.

As a single illustration, the rate of sea level rise can be obtained by simply searching Google Images for 'graph of sea level rise'. Many graphs will pop up, all about the same; they all show that the seas have been rising at a rate of about 20-25 cm per century for many decades, with no particular recent increase. Figure 7 shows one such graph, based on IPCC data.
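For reference, this rate converts to a few millimeters per year:

\[
20\ \text{to}\ 25\ \text{cm/century} = 2\ \text{to}\ 2.5\ \text{mm/year}.
\]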

Some might think my paper has an odd-sounding name. The reason is that it also makes the case that those I call alarmists are rather like Old Testament prophets, accusing mankind of sin and warning of severe punishment, a sin and a punishment that only they can discern. They say that we must either change our ways or be destroyed. However, unlike their biblical predecessors, these modern-day 'prophets' have no direct pipeline to God. The paper compares the climate change alarmists to the other 'prophets' the title suggests, 'prophets' who caused general or localized panics and left only harm and chaos in their wake. Obviously this is not a scientific argument, but any reader can judge for himself or herself whether the comparison makes sense. To me it does.

5. Conclusion
Since solar power is so far from becoming an important player in the world energy budget, since the scientific community is far from united on whether excess CO2 in the atmosphere will have a significant environmental effect, and since it almost certainly has had no such effect so far, the question is whether we should be proceeding at a breakneck pace to reduce fossil fuel use in the hope that solar power can replace it. This author's answer is no. The cost to civilization would be astronomical if solar power were to fail, as it has so far.

A Most Improbable Journey: A Big History of Our Planet and Ourselves

By Walter Alvarez, W.W. Norton & Company, Inc., New York, NY, 236 pages, $26.95, ISBN 978-0-393-29269-5

This volume is written as Big History, which means not simply human history but also the story of the connections of humans to their changing environment. It describes the series of unusual events that have led to the development of the human race. In his introduction, Alvarez argues that Big History aims to recount a very long and complex version of history, drawing on science as well as on the historical accounts produced by human beings. It is focused on understanding the entire past by taking a panoramic view that necessarily involves more science than humanism. In addition to a prologue, which describes Big History as a discipline and relates a fascinating personal account of the discovery of evidence for the giant impact at Chicxulub by the author and his colleagues, and an epilogue, which focuses on how unlikely it is that the human race developed as it has, the book is divided into four large sections focused on the Cosmos, Earth, Life, and Humanity.

In addition to the personal account of doing geology in the introduction, my favorite part of this volume was the longest section, the four chapters focusing on Earth. In it Alvarez dwells lovingly on his area of expertise, Earth science. The chapters stress the ways the planet produced the materials that humans use and an environment in which they thrive. For example, Chapter 3 focuses on silicon, its use by humans, and the geology of its origin. Chapter 4 then explores the development of continents and oceans and provides a useful introduction to plate tectonics. Chapter 5 concerns the rise and fall of mountain ranges, and Chapter 6 uses a cross-continental Amtrak train trip to examine rivers and their importance to humans. In other words, this section is a complete and unconventional introduction to modern geology.

The section on the Cosmos focuses on modern astronomy and cosmology and does a good job of pointing out how fast the field is evolving. The third section, which deals with the origins of Life, begins with the first living cells and traces how human beings evolved. The discussion of DNA evidence for the development of humans from simpler life forms was particularly interesting. Finally, the section on humanity describes how humans spread across Earth and developed fire, stone tools, and eventually metals. The book takes you from the Big Bang to the origins of civilizations and written history and ends with man’s adventures into space.

Throughout the book, the author stresses the element of chance in all this. In his own words (page 105), Alvarez states that two of the main themes of the book are "how geologic history has influenced human history, and how easily things could have turned out very differently." He clearly illustrates the role of chance in all phases of human development up to today but, understandably, makes no attempt to predict the future.

Alvarez has clearly written this book for a general audience, although the endnotes provide technical citations for almost everything. It is especially appropriate for young people interested in science, since it is not only an easy read but also presents exciting problems from many areas of science. I loaned my copy to Shaunald Shende, a sophomore at Brown University studying mechanical engineering and applied mathematics, whom I met on a plane. Even though he had traveled through Dallas from Rapid City and made a connection across the airport in 10 minutes, he still enjoyed the book enough to pronounce it "good airplane reading" and said he had learned some science reading it. I enjoyed this volume as pleasure reading and learned from it. I recommend it to any physicist interested in Big History, or to anyone who wants to give a bright college or high school student a science-oriented birthday present that will be greatly enjoyed.

Ruth H. Howes
714 Agua Fria Street
Santa Fe, NM 87501

Nuclear Proliferation and Terrorism in the Post-9/11 World

By David Hafemeister (Springer International Pub. AG., Switzerland, 2016), 434 + xxiv pages, $45, ISBN 978-3-319-25365-7.

The proliferation of nuclear weapons is one of the most serious problems facing the world today. As the number of countries possessing nuclear weapons increases so does the probability of a nuclear exchange caused either deliberately or by accident. Also an increase in the number of nuclear weapons and the amount of weapons grade materials in the world increases the possibility that terrorist groups will obtain and use nuclear weapons. Hafemeister’s book is an extensive discussion of nuclear weapons and nuclear proliferation. It covers all aspects of the subject from Rutherford’s discovery of the nucleus to the current situation. It is intended as a textbook for an upper division undergraduate course. Hafemeister is particularly well qualified to discuss this subject, having taught such a course for over 40 years and having served on numerous governmental and other committees dealing with arms control matters.

Hafemeister considers three main issues: the major-power arms race, proliferation of nuclear weapons, and terrorism, especially after 9/11. He begins with a history of the atomic age. Nuclear research following Rutherford's discovery of the nucleus led to the discovery of nuclear fission. The fear that Germany would develop fission bombs led to the Manhattan Project and the development of the first atomic bombs by the United States. Fission bombs were soon followed by hydrogen fusion bombs. Within a fairly short time other countries, notably the UK, USSR, China, and France, also developed nuclear weapons. Delivery of nuclear weapons by airplane was followed by land-based and sea-launched ballistic missiles and cruise missiles. Attempts at ballistic missile defense have been unsuccessful and seem mostly to have been destabilizing. Finally, the arms race between the US and the USSR has been partly ended by various arms control treaties, along with methods of verification. At present, programs of arms control have been slowed by the cooling of relations between the US and Russia and by China's attempts to catch up with the US and Russia. Given the present political and diplomatic situation, Hafemeister finds it difficult to see how Russia, China, and the US can collaborate to reduce nuclear deployment.

Proliferation of nuclear weapons usually refers to the development of nuclear weapons and delivery systems by countries other than the US, UK, Russia, China, and France. Other countries having nuclear weapons are India, Pakistan, Israel, and North Korea. Countries that have had nuclear weapons but have given them up include South Africa, Belarus, Kazakhstan, and Ukraine. A number of countries started weapons development programs but terminated them: Algeria, Argentina, Brazil, Germany, Iraq, Japan, Libya, South Korea, Sweden, Switzerland, Syria, and Taiwan. Iran currently does not have nuclear weapons but appears to have had a development program, which was put on hold by the recent deal with Iran. Several treaties attempt to control proliferation, including the nuclear nonproliferation treaty and the comprehensive test ban treaty. To reduce proliferation, the US purchased large quantities of weapons-grade uranium and plutonium from states of the former Soviet Union. Control of proliferation is made difficult by the use of nuclear reactors for power, since it is possible to produce nuclear weapons from reactor-grade plutonium. To reduce this possibility, the US limits the reprocessing of spent fuel originating in the US, and the International Atomic Energy Agency maintains safeguards over both reactor- and weapons-grade plutonium.

One of the major concerns related to nuclear weapons is that terrorists might get their hands on them. Terrorists might use stolen nuclear weapons or improvised devices made with stolen nuclear materials. Terrorists would not need to have a fully functional nuclear bomb. For their purposes it would be enough to have a “dirty bomb” that would spread radioactivity even without a full nuclear explosion. So far there has been no evidence of terrorists trying to get nuclear weapons. This may be because of safeguards, because of the difficulty of making nuclear devices, or because terrorists can produce the same amount of destruction more easily with chemical or biological weapons.

Hafemeister's book is an excellent introduction to the history and current state of nuclear weapons. However, it is not an easy book to read. There is so much material, covering so many topics, that it requires very careful reading to gain the maximum understanding of the subject. Unfortunately, there are two serious flaws in the book. First, there is no index. This is bad enough for any non-fiction book, but it is especially bad for a textbook. This book particularly needs an index for its many acronyms; I often found it difficult to remember what an acronym stands for and could not look back to the original definition. There is a glossary, but it does not list all of the acronyms, and I did not find it particularly useful. The second flaw is the large number of errors in the text. These are mostly minor typographical errors, but some are more serious: there are a number of places where figures, tables, or definitions are not clear and are likely to cause serious confusion for students.

Kenneth S. Mendelson
Professor emeritus of physics
Marquette University

Figure 1: Global CO2 emissions. Various scenarios for ending the use of fossil fuel according to Somerville and Hassol. In all cases the use of fossil fuel must end in about 20 years.

Figure 2: The plot from Marder et al. (fraction of hydrocarbon from coal) and its caption (their Figure 1): The correlation between hydrocarbon-based power consumption and economic output for most countries on Earth. A power-law fit finds that annual GDP per person is G = $10,500 (C/kW)^0.64, where C is hydrocarbon-based energy consumption per second per person. The tight power-law relationship indicates that economic prosperity is not currently feasible without consumption of hydrocarbon fuels. The power law is reminiscent of scaling laws in biology; the flow of petroleum through economies resembles the flow of blood in mammals. On average, the hydrocarbon power consumed in the US is 8 kW per person, the same as 80 incandescent 100 W bulbs burning continuously. If the US were to rely only on its currently available renewables (biomass cogeneration, wood, hydropower, geothermal, wind, passive solar, and photovoltaics), power consumption would drop to four bulbs per person; eliminating hydropower and biofuels would reduce the number to one or two. The reduction would entail such a change in lifestyle as to make the US unrecognizable. (Data source: Central Intelligence Agency, World Factbook, 2015; DOE/Energy Information Administration, 2015.) Citation: Phys. Today 69, 7, 46 (2016); http://dx.doi.org/10.1063/PT.3.3236

Figure 3: World energy consumption by fuel. Clearly it is extremely unlikely that solar power can replace fossil fuel in 20 years, as Somerville and Hassol and many others insist. Simply ending fossil fuel use without a replacement would impoverish the world and set civilization back centuries. https://ourfiniteworld.com/2015/06/23/bp-data-suggests-we-are-reaching-peak-energy-demand

Figure 4: World energy consumption by part of the world. It is the less developed parts of the world that are increasing energy use as they struggle to end their persistent poverty. https://ourfiniteworld.com/2015/06/23/bp-data-suggests-we-are-reaching-peak-energy-demand

Figure 5: Since 1993 the Federal government has spent ~$140B on climate change funding. http://www.gao.gov/key_issues/climate_change_funding_management/issue_summary

Figure 6: It is solar, wind, and biofuels that received the lion's share of subsidies. https://www.eia.gov/todayinenergy/detail.php?id=20352

Figure 7: A plot of sea level rise. The rise has been at a rate of 20-25 cm/century since about 1920, with no recent increase. https://www.ipcc.ch/publications_and_data/ar4/wg1/en/figure-5-13.html


These contributions have not been peer-refereed. They represent solely the view(s) of the author(s) and not necessarily the view of APS.

Physics and Society is the non-peer-reviewed quarterly newsletter of the Forum on Physics and Society, a division of the American Physical Society. It presents letters, commentary, book reviews and articles on the relations of physics and the physics community to government and society. It also carries news of the Forum and provides a medium for Forum members to exchange ideas. Opinions expressed are those of the authors alone and do not necessarily reflect the views of the APS or of the Forum. Contributed articles (up to 2500 words), letters (500 words), commentary (1000 words), reviews (1000 words) and brief news articles are welcome. Send them to the relevant editor by e-mail (preferred) or regular mail.

Editor: Oriol T. Valls. Assistant Editor: Laura Berzak Hopkins. Reviews Editor: Art Hobson. Electronic Media Editor: Tabitha Colter. Editorial Board: Maury Goodman, Richard Wiener, Jeremiah Williams. Layout at APS: Leanne Poteet. Website for APS: webmaster@aps.org.