Archived Newsletters

P&S Type Too Small?

July 20, 1999

Greetings. The Newsletter print is so small that I am finding it very difficult to read. If I am the only person who is having this problem, then you may remove my name from the mailing list (and the Forum Membership) and thus save time and money. On the other hand, if there are other older folks like me facing this problem, you might consider a bigger font size for our Newsletter - as in the Forum on Education's Newsletter.

If you like, you might put the above as a "Letter to the Editor" and get some feedback from the other members.

Thanking you.

Sincerely

M.J. Ponnambalam
The University of the West Indies
Department of Physics
Mona, Kingston 7
Jamaica, W.I.

Editor’s Comment
Paper, like the rest of the environment, has constraints, and priorities must be set - hopefully with the aid of our readers. The Forum has a limited budget for P&S which keeps us to four sheets (15 to 15.5 pages of text) except for the elections issue, which usually has five sheets. In the past we allowed roughly 1000 words per page (15,000 words per issue), which gave us an open, easily legible appearance but limited the number of items in any one issue. Because of the many pressing issues which I think are important to our readers, I have tried to cram more material into each issue, recognizing that this would lead to a smaller typeface in our paper issues. (However, each issue is also available at our Web site.) I assumed that I would soon hear from our readers if this approach was objectionable. So far, this is the only letter I have received commenting upon my reduction of the type size. As a result of this letter, I have attempted to make this issue more like the previous open style, to see what further comments ensue. I would appreciate further guidance on this matter from the readers.

Physicist’s Responsibility for the CTBT Issue

The Comprehensive Test Ban Treaty (CTBT) involves physics in important ways: the extent to which the nation's stockpile of nuclear weapons can be maintained in the absence of nuclear explosive tests, and the degree to which adherence to the treaty by other states can be monitored by national and international technical means. Bearing these factors in mind, the APS Council passed a resolution in support of ratification; and the APS President, Jerome Friedman, joined by a score of Nobel Laureates in physics, wrote a letter to the same effect to the Senate. Given these facts, and the Senate's vote against ratification, American physicists have a responsibility to become engaged in this issue. They owe this to their fellow citizens, because statements by Senators on both sides of the issue, and by editors and correspondents of leading newspapers, displayed a remarkable degree of ignorance about the technical dimensions. There are various avenues to becoming engaged, provided by APS as well as by several independent organizations easily reached on the Web.

Kurt Gottfried
Physics Department, Cornell University
and Union of Concerned Scientists

Editor’s Comment
The previous issue of P&S, in an article by G. Holton, contained a long list of organizations via which physicists have carried out, and can carry out, their scientific and social responsibilities. A little searching of the Web will provide access to these groups and many more, on "all sides" of the pressing issues of the day. We urge you to take advantage of this easy access and do your duty.

Quantum Encryption and Arms Control

I just finished reading the October issue of Physics & Society, devoted primarily to Arms Control. It was an excellent issue. It also raised an important question for me, which is the reason for this e-mail.

Ambassador Paul Robinson's paper, in particular, flagged what I view as an important element of arms control: the confidence building and information exchange made possible through public-key encryption. This allows the host country to read what is being sent out, to determine that only agreed-upon data is being sent, while at the same time preventing tampering with the data. It seems to underlie just about all current arms control agreements.
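To make the mechanism concrete, here is a toy sketch in Python of this "authenticate without concealing" use of public-key cryptography. The tiny primes and the bare-bones signing protocol are illustrative assumptions only; real systems use keys hundreds of digits long and hash the data before signing.

# Toy RSA signature: the data stays readable (the host can inspect it),
# but the signature detects tampering. Illustrative parameters only.
p, q = 61, 53                      # small primes, for illustration
n = p * q                          # public modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (requires Python 3.8+)

data = 42                          # the agreed-upon datum, sent in the clear
sig = pow(data, d, n)              # monitored side signs with its private key

# The host, or anyone else, verifies with the public key (e, n); altering
# either the data or the signature makes this check fail.
assert pow(sig, e, n) == data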

My concern is this: present research on quantum computing and quantum encryption, sponsored not surprisingly in large part by NSA, appears poised to defeat RSA encryption, which is based on the difficulty of factoring large numbers, and to replace it with completely secure (and un-clonable) quantum encryption. Will this development (quantum computing is the one to be concerned about, and it will reasonably take at least 30 years to happen) undermine future arms control? Is this something FPS or POPA should be looking at?

I think this is an issue that at least deserves to be discussed within the FPS community.

Comments, reactions, or suggestions are welcome. Thanks,

Dr. Peter J. Reynolds
Atomic and Molecular Physics Program
Physical Sciences Division, ONR 331
Office of Naval Research
703-696-4205 (voice), 703-696-6887 (fax), 603-462-6385 (e-fax) <-- preferred

Nuclear Espionage and the Role of Scientists

The very existence of nuclear weapons is the biggest nuclear secret we possess. This secret has been public knowledge since the atomic bombing of Hiroshima on August 6, 1945, a point that has been lost in the contentious debate over nuclear espionage at U.S. Government laboratories. Once this fact is known, any competent physicist can build such weapons given the proper amount of technical, financial, and industrial support. With enough time and effort, scientists can independently discover the other so-called secrets that comprise nuclear weaponry. Dr. Edward Teller, the father of the American hydrogen bomb, pointed out in the 1970 Report of the Defense Science Board Task Force on Secrecy, "It is unlikely that classified information will remain secure for periods as long as five years, and it is more reasonable to assume that it will become known to others in periods as short as one year." This report adds: "In spite of very elaborate and costly measures taken independently by the US and the USSR to preserve technical secrecy, neither the United Kingdom nor China was long delayed in developing hydrogen weapons."

Espionage occurs for a number of reasons. It helps to dispel fear and doubt about the intentions of both friends and enemies. It can serve to abate insecurities about one's actual position - consider, for example, the myth of the missile gap between the US and Russia in the late 1950s and early 1960s. Spying saves money on prodigious research and development costs; thus it is necessary for developing countries, like China. The rewards from spying will ensure the future of espionage, despite our best efforts at curtailing it. Faced with the threat of continued nuclear espionage, how should we respond?

While many, like Dr. Teller, believe that the appropriate response to nuclear espionage is to accelerate our work on weapons, this response is futile and costly. Efforts to outpace others in weapons development become ever more difficult. Instead, we should heed his colleague, Dr. Hans Bethe, director of the theoretical division at Los Alamos during the Manhattan Project. In a 1997 letter to President Clinton, Dr. Bethe stated, "Since any new types of weapons would, in time, spread to others and present a threat to us, it is logical for us not to pioneer further in this field." In other words, we should not continue weapons design work because terrorists and other potential enemies of the United States would eventually acquire this information and employ it in weapons against us.

In a 1995 Atomic Scientists Appeal to Colleagues, Dr. Bethe "call[ed] on all scientists in all countries to cease and desist from work creating, developing, improving, and manufacturing further nuclear weapons - and, for that matter, other weapons of potential mass destruction, such as chemical and biological weapons." In a similar vein, this year scientists in Japan have initiated a pledge for Japanese scientists and engineers to sign, promising not to participate in work on weapons of mass destruction. They intend to expand their pledge movement to other countries. While American scientists and engineers committed to nonproliferation and disarmament should fully cooperate with this movement, they should not wait for this pledge to reach them but should commence their own pledge, dedicating themselves to non-weapons work.

Charles D. Ferguson, PhD, senior research analyst
Federation of American Scientists
307 Massachusetts Avenue, NE, Washington, DC 20002
Tel: (202) 675-1007, Fax: (202) 675-1010

Articles

The American public seems concerned with the potential environmental impact of nuclear power while remaining unaware of the "carbon problem". As indicated by these articles, physicists are concerned with both. Politicians’ prime concern often seems to be just getting elected, no matter what the present or future environmental or security problems may be. The guiding premise of our Forum has been that physicists have an obligation to help the public force their politicians to deal effectively with these problems. They can do so as individuals, via "non-partisan" educational groups such as the Forum, or via issue-oriented "pressure groups" (such as FAS, UCS, etc.). But they should be active!

The Science and Politics of Climate

Freeman J. Dyson

Talk given at American Physical Society Centennial Meeting, Atlanta, Georgia, March 25, 1999

Responding to the Joseph A. Burton Award Given by the APS Forum on Physics and Society

Three agencies of the US government have serious programs of climate research: NASA, NOAA and the Department of Energy. I shall talk mostly about the Department of Energy because that is my home territory. The Department of Energy program is the smallest of the three. Anybody who had been primarily involved with the NASA or NOAA programs could tell similar stories about them. My involvement began at the Oak Ridge National Laboratory in 1972. Alvin Weinberg, who was director of Oak Ridge for many years, started a program of climate studies there. He was running a major nuclear power development program, with a large effort devoted to studying the environmental and public health problems of nuclear power. He decided to broaden the environmental studies to include effects of power-plants burning fossil fuels. Weinberg is an interesting character in many ways. He is himself a strong pro-nuke. He helped to build the first nuclear reactors at Oak Ridge and spent most of his life promoting nuclear power. But he likes to listen to opposing views. He collected at Oak Ridge a bunch of brilliant people, including anti-nukes as well as pro-nukes, to study the environmental problems associated with all kinds of energy. One of the anti-nukes at Oak Ridge was Claire Nader, the sister of Ralph Nader. Weinberg liked her and always listened to what she had to say. Allan Poole was another unusual character in the group around Weinberg. Poole had been for some years a Buddhist monk in Thailand. He was an expert on tropical forests. Another member of the group was Jack Gibbons, who later became head of the Office of Technology Assessment and science advisor to President Clinton.

The practical advice that Alvin Weinberg gave to the Department of Energy was to increase the funding of field measurements: physical measurements in the atmosphere and biological measurements on the ground. The purpose of measurements in the atmosphere was to test the climate models with real data. The purpose of measurements on the ground was to explore the non-climatic effects of carbon dioxide on farms and forests. The department did not pay much attention to his advice. The lion's share of the budget for carbon dioxide research continued to be spent on computer models. The amount of money spent on local observations is small, but the money has been well spent.

Several successful programs of observation have been started in recent years. One of them is a Department of Energy program called ARM, Atmospheric Radiation Measurements. ARM's activities are mainly concentrated at a single permanent site in Oklahoma, where systematic observations of radiation fluxes in the atmosphere are made with instruments on the ground and on airplanes flying at various altitudes. Measurements are made all the year round in a variety of weather conditions. As a result, we have a data-base of radiation fluxes as a function of wave-length, angle and altitude, in clear sky and in cloud and between clouds. One of the most important measurements is made by two airplanes flying one above the other at different altitudes. Each airplane measures the fluxes of radiation coming up from below and down from above. The difference measures the local absorption of radiation by the atmosphere as a function of wave-length. The measured absorption of sunlight turns out to be substantially larger than expected. The expected absorption was derived partly from theory and partly from space-based measurements. The discrepancy is still unexplained. If it turns out that the anomalous absorption measured by ARM is real, this will mean that all the global climate models are using wrong numbers for absorption.
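The arithmetic behind the two-airplane technique is simple energy bookkeeping: whatever net radiative flux enters the layer between the aircraft and does not leave it must have been absorbed there. A minimal sketch in Python, with invented flux values standing in for real ARM data:

# Absorption in the layer between two stacked aircraft.
# All fluxes in W/m^2; the numbers are placeholders, not measurements.
def layer_absorption(down_top, up_top, down_bot, up_bot):
    # Net flux entering at the top, minus net flux leaving at the bottom,
    # equals the radiation absorbed within the layer.
    return (down_top - up_top) - (down_bot - up_bot)

# Hypothetical readings from the upper and lower airplanes:
print(layer_absorption(down_top=900.0, up_top=150.0,
                       down_bot=820.0, up_bot=130.0))   # -> 60.0 W/m^2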

The ARM program also has active sites in the south-west Pacific and on the north shore of Alaska. The south-west Pacific site made important contributions to the international TOGA program studying El Nino. The south-west Pacific is the place where sea surface temperatures are highest, and El Nino begins with a massive movement of hot surface water from west to east. If we consider the global climate to be a heat-engine, the south-west Pacific is the hot end of the engine and the north shore of Alaska is the cold end. The ARM sites were chosen so that we can study the hot and cold ends of the engine, with the Oklahoma site somewhere in the middle. The original plan for ARM had two additional sites, one in tropical forest and one in desert, but the funding for more sites never materialized.

Another successful program of local observation is measuring directly the fluxes of carbon dioxide moving between the atmosphere and the biosphere. This is done by putting instruments on towers above the local trees or other vegetation. Accurate anemometers (wind-speed meters) measure the vertical motion of the air, while infrared gas analyzers measure the carbon dioxide content at the same place and the same time. Both measurements are made instantaneously, four times a second, so that you are measuring the carbon dioxide carried by each local eddy in the atmosphere as it moves up or down. If, as usually happens in daytime in the summer, the trees are absorbing carbon dioxide, each packet of air moving down carries more carbon dioxide and each packet moving up carries less. You can derive the flux of carbon dioxide going into the trees by multiplying the vertical speed by the carbon dioxide abundance and averaging over time. This is called the eddy covariance method of measuring fluxes. It is remarkably accurate, because it turns out that the vertical speed and the carbon dioxide abundance are almost a hundred percent correlated. When you measure at night or in winter, you find that the flux is going the other way. Trees are then not photosynthesizing but giving off carbon dioxide by respiration. The soil also gives off substantial fluxes of carbon dioxide, mostly from respiration of microbes and fungi. The eddy covariance method does not distinguish between vegetation and soil. It measures the total flux leaving or entering the atmosphere.
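In code, the eddy covariance method reduces to averaging the product of the fluctuations in vertical wind speed and carbon dioxide density. A minimal sketch, with synthetic 4-Hz data standing in for the anemometer and gas-analyzer streams:

import numpy as np

# Synthetic hour of 4-Hz data; the correlation built into c mimics daytime
# uptake, where downdrafts carry more CO2 than updrafts.
rng = np.random.default_rng(0)
n = 4 * 3600                                       # one hour at 4 samples/s
w = rng.normal(0.0, 0.3, n)                        # vertical wind speed, m/s
c = 700.0 - 50.0 * w + rng.normal(0.0, 5.0, n)     # CO2 density, mg/m^3

# Covariance of the fluctuations gives the vertical CO2 flux; subtracting
# the means keeps mean vertical motion from biasing the result.
flux = np.mean((w - w.mean()) * (c - c.mean()))    # mg CO2 / m^2 / s
print(flux)    # negative: CO2 moving downward, into the vegetation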

For many years the eddy covariance measurements were made in only three places in the world: one over a temperate forest in Massachusetts, one over a tropical forest in Brazil, and one over a boreal forest in Canada. Steven Wofsy at Harvard was the pioneer who got the whole thing started at the site in Massachusetts (Wofsy et al., 1993). The results of the first measurements were startling. The Massachusetts forest was absorbing carbon at a rate of 3.7 tons per hectare per year, far more than was expected for a mature forest. If you supposed that all the temperate forests of the world were absorbing carbon at this rate, the result would be an absorption of 5 gigatons of carbon per year, which happens to be almost exactly the amount of missing carbon that disappears from the atmosphere. The Amazon forest shows an absorption of one ton per hectare per year, not so large but still more than was expected (Grace et al., 1995). The Canadian forest is emitting carbon at a rate of 0.3 tons per hectare per year, probably mostly from soil respiring more as the arctic climate grows warmer (Goulden et al., 1998). If these numbers are also representative of forests all over the world, the tropical forests and the boreal forests roughly cancel each other out, the tropical forests absorbing and the boreal forests emitting about a gigaton each. The total for all forests would then be 5 gigatons of absorption.
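The scaling step is plain arithmetic. Taking roughly 1.35 billion hectares of temperate forest (an area assumed here for illustration; the talk does not quote one):

# Scaling the Massachusetts uptake to all temperate forests.
uptake_per_ha = 3.7           # tons of carbon per hectare per year
temperate_area_ha = 1.35e9    # hectares; assumed figure for illustration
print(uptake_per_ha * temperate_area_ha / 1e9)   # -> ~5 gigatons per year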

Finally, during the last few years, a serious program of eddy covariance measurements has been started, with instrumented sites in many countries around the world, to see whether the results observed at the first three sites are really representative of forests in general. A consortium called Ameriflux has been organized with 24 sites in North America, and many other sites are operating in Europe and Asia. Results so far seem to confirm the earlier measurements. One temperate forest site in Italy measures 5 tons per hectare per year absorption, and one boreal forest site in Sweden measures half a ton per hectare emission. Within a few years, we will know for sure whether the temperate forests are really the main sink of the missing carbon. And the same technique of eddy covariance can be used to monitor the carbon fluxes over agricultural croplands, wetlands and grasslands. It will give us the knowledge required, so that we can use the tools of land management intelligently to regulate the carbon in the atmosphere. Whether we manage the land wisely or mismanage it foolishly, we shall at least know what good or harm we are doing to the atmosphere.

Besides ARM and Ameriflux, there is a third highly successful program of local measurements called ATOC, Acoustic Thermometry of Ocean Climate, the brain-child of Walter Munk at the Scripps Institution of Oceanography. ATOC uses low-frequency underwater sound to measure ocean temperatures, (ATOC Consortium, 1998). A signal is transmitted from a source on top of a seamount at a depth of 900 meters near San Francisco, and received at six receivers in deep water around the north Pacific. The times of arrival of signals at the receivers are accurately measured. Since the speed of propagation depends on temperature, average temperatures of the water along the propagation paths can be deduced. The main obstacle that Walter Munk had to overcome to get the ATOC project started was the opposition of environmental activists. This is a long and sad story which I don't have time to tell. The activists decided that Munk was an evil character and that his acoustic transmissions would endanger the whales in the ocean by interfering with their social communications. They harassed him with lawsuits which delayed the project for several years. Munk tried in vain to convince them that he also cares about the whales and is determined not to do them any unintentional harm. In the end the project was allowed to go forward, with less than half of the small budget spent on monitoring the ocean and more than half spent on monitoring the whales. No evidence was found that any whale ever paid any attention to the transmissions. But the activists are continuing their opposition to the project and its future is still in doubt.
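The inference from arrival times to temperature can be sketched in a few lines. The path length, nominal sound speed, and temperature sensitivity below are representative textbook values assumed for illustration, not ATOC's actual calibration:

# Acoustic thermometry in one dimension: warmer water -> faster sound ->
# earlier arrival.
path = 3.0e6     # path length, m (roughly 3000 km across the North Pacific)
c0 = 1500.0      # nominal sound speed in seawater, m/s
dcdT = 4.6       # sound-speed sensitivity, m/s per degree C (assumed value)

def warming_from_arrival(dt):
    # A travel-time change dt implies a sound-speed change dc = -dt*c0^2/path;
    # dividing by dcdT converts that to a path-averaged temperature change.
    return (-dt * c0**2 / path) / dcdT

# An arrival 61 milliseconds early corresponds to about 0.01 deg C of
# average warming along the path:
print(warming_from_arrival(-0.061))   # -> ~0.0099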

During the two years that the ATOC system has been operating, seasonal variations of temperature have been observed, giving important new information about energy transport in the ocean. If measurements are continued for ten years and extended to other oceans, it should be possible to separate a steady increase of temperature due to global warming from fluctuations due to processes like El Nino that vary from year to year. Since the ocean is the major reservoir of heat for the entire climate system, a measurement of ocean temperature is the most reliable indicator of global warming. We may hope that the activists will one day admit that an understanding of climatic change is as essential to the preservation of wildlife as it is to the progress of science.

It is time now to wind up this talk and summarize what we have learned. There is good news and bad news. The good news is that we are at last putting serious effort and serious money into local observations. Local observations are laborious and slow, but they are essential if we are ever to have an accurate picture of climate. The bad news is that the climate models on which so much effort is expended are unreliable. The models are unreliable because they still use fudge-factors rather than physics to represent processes occurring on scales smaller than the grid-size. Besides the general prevalence of fudge-factors, the climate models have other more specific defects that make them unreliable. First, with one exception, they do not predict the existence of El Nino. Since El Nino is a major and important feature of the observed climate, any model that fails to predict it is clearly deficient. Second, the models fail to predict the marine stratus clouds that often cover large areas of ocean. Marine stratus clouds have a large effect on climate in the oceans and in coastal regions on their eastern margins. Third, the climate models do not take into account the anomalous absorption of radiation revealed by the ARM measurements. This is not a small error. If the ARM measurements are correct, the error in the atmospheric absorption of sunlight calculated by the climate models is about 28 watts per square meter, averaged over the whole earth, day and night, summer and winter. The entire effect of doubling the present abundance of carbon dioxide is calculated to be about 4 watts per square meter. So the error in the models is much larger than the global warming effect that the models are supposed to predict. Until the ARM measurements were done, the error was not detected, because it was compensated by fudge-factors that forced the models to agree with the existing climate. Other equally large errors may still be hiding in the models, concealed by other fudge-factors. Until the fudge-factors are eliminated and the computer programs are solidly based on local observations and on the laws of physics, we have no good reason to believe the predictions of the models.

The bad news does not mean that climate models are worthless. Syukuro Manabe, who ran the climate modeling program at the Geophysical Fluid Dynamics Laboratory at Princeton, always used to say that the purpose of his models was not to predict climate but to understand it. Climate models are still, as Manabe said, essential tools for understanding climate. They are not yet adequate tools for predicting climate. If we persevere patiently with observing the real world and improving the models, the time will come when we are able both to understand and to predict. Until then, we must continue to warn the politicians and the public: don't believe the numbers just because they come out of a supercomputer.

References:

ATOC Consortium, 1998. Ocean Climate Change: Comparison of Acoustic Tomography, Satellite Altimetry and Modeling, Science, 281, 1327-1332.

Goulden, M. L. et al., 1998. Sensitivity of Boreal Forest Carbon Balance to Soil Thaw, Science, 279, 214-217.

Grace, J. et al., 1995. Carbon Dioxide Uptake by an Undisturbed Tropical Rain Forest in Southwest Amazonia, 1992 to 1993, Science, 270, 778-780.

Wofsy, S. C. et al., 1993. Net Exchange of CO2 in a Mid-Latitude Forest, Science, 260, 1314-1317.

Freeman J. Dyson

Institute for Advanced Study, Princeton, New Jersey

Nuclear Power and the Large Environment

David Bodansky

Talk given at American Physical Society Centennial Meeting, Atlanta, Georgia, March 25, 1999

1. Introduction
The development of nuclear energy has come to a near halt in the United States and in much of the rest of the world. The construction of new U.S. reactors has ended, and although there has been a rise in nuclear electricity generation in the past decade, due to better performance of existing reactors, a future decline appears inevitable as individual reactors reach the end of their economically useful lives.
An obstacle to nuclear power is the publicly perceived environmental risk. During this development hiatus, it is useful to step back and take a look at nuclear-related risks in a broad perspective. For this purpose, we categorize these risks as follows:

  • Confined risks. These are risks that can be quantitatively analyzed, and for which the likelihood and scale of possible damage can be made relatively small.
  • Open-ended risks. These are risks that cannot be well quantified by present analyses, but which involve major dangers on a global scale.

As discussed below, public concern has focussed on risks in the confined category, particularly reactor safety and waste disposal. This has diverted attention from more threatening, open-ended risks of nuclear weapons proliferation, global climate change, and potential scarcity of energy in a world of growing population. The rationale for this categorization and the connection between nuclear power and these open-ended risks are discussed below.

2. Confined risks.
a. Nuclear reactor accidents.

The belief that reactor accident risks are small is based on detailed analyses of reactor design and performance, and is supported by the past safety record of nuclear reactors, excluding the accident at Chernobyl in 1986. Defects in the design and operation of the Chernobyl reactor were so egregious that the Chernobyl experience has virtually no implications for present reactors outside the former Soviet Union. Chernobyl is a reminder, however, of the need for careful, error-resistant design if there is to be a large expansion of nuclear power in many countries.
At the end of 1998 there had been over 8000 reactor-years of operation outside the former Soviet Union, including about 2350 in the United States. Only one accident, that at Three Mile Island, has marred an otherwise excellent safety record. Even at TMI, although the reactor core was severely damaged, there was very little release of radioactivity to the environment outside the reactor containment. Subsequently, U.S. reactors have been retrofitted to achieve improved safety and, with improved equipment and greater attention to careful procedures, their operation has become steadily more reliable.

A next generation of reactors can be even safer, either through a series of relatively small evolutionary steps that build directly upon past experience or through more radical changes that place greater reliance on passive safety features--such as cooling water systems that are directly triggered by pressure changes (not electrical signals) and that rely on gravity (not pumps). It would in fact be remarkable if the accumulated past experience, both good and bad, would not improve the next generation.

b. Nuclear waste disposal

The second dominant public concern is over nuclear wastes. Current plans are to dispose of spent fuel directly, without reprocessing, keeping it in solid form. Confinement of the spent fuel is predicated on its small volume, the ruggedness of the planned containers, the slowness of water movement to and from a site such as Yucca Mountain, and the continual decrease in the inventory of radionuclides through radioactive decay.

Innumerable studies have been made to determine the degree to which the radionuclides will remain confined. One way to judge the risks is to examine these studies as well as independent reviews. An alternate perspective on the scale of the problem can be gained by considering the protective standards that have been proposed for Yucca Mountain.

Proposed standards were put forth in preliminary form by the EPA in 1985. These set limits on the release of individual radionuclides from the repository, such that the attributable total cancer fatalities over 10,000 years would total less than 1000. This target was thought to be achievable when the only pathways considered for the movement of radionuclides from the repository were by water. However, the development of the site was put in jeopardy when it was later recognized that escaping 14C could reach the "accessible environment" relatively quickly in the form of gaseous carbon dioxide. A release over several centuries of the entire 14C inventory at Yucca Mountain would increase the worldwide atmospheric concentration of 14C by about 0.1%, corresponding to an annual average dose of about 0.001 mrem per year for hundreds of years. The resulting collective dose to 10 billion people could be sufficient to lead to more than 1000 calculated deaths.
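The arithmetic behind the 1000 calculated deaths is worth making explicit. The risk coefficient below is the commonly used estimate of roughly 5 x 10^-4 fatal cancers per person-rem, and the 300-year duration is an assumption standing in for "hundreds of years":

# Collective-dose arithmetic for the 14C release scenario described above.
dose_mrem_per_yr = 0.001     # per person per year, from the estimate above
population = 10e9            # people exposed worldwide
years = 300                  # "hundreds of years"; assumed as 300 here
risk_per_rem = 5e-4          # fatal cancers per person-rem (common estimate)

collective_rem = (dose_mrem_per_yr / 1000) * population * years
print(collective_rem * risk_per_rem)   # -> 1500 calculated deaths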

It is startling that 14C might have been the show-stopper for Yucca Mountain. It appeared that this could occur, until Congress took the authority to set Yucca Mountain standards away from the EPA pending future recommendations from a panel to be established by the National Academy of Sciences (NAS). The panel issued its Report in 1995. It recommended that the period of concern extend up to one million years and that the key criterion be the average risk to members of a "critical group" (probably numbering less than 100), representing the individuals at highest risk from potentially contaminated drinking water. It was recommended that the calculated average risk of fatal cancer be limited to 10^-6 or 10^-5 per person per year. According to the estimates now used by federal agencies to relate dose to risk, this range corresponds to between 2 mrem/year and 20 mrem/year.
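The quoted dose range follows from the same kind of coefficient, run in the other direction:

# Converting the NAS panel's risk limits to annual doses, again assuming
# about 5 x 10^-4 fatal cancers per person-rem.
risk_per_rem = 5e-4
for risk_per_year in (1e-6, 1e-5):
    mrem_per_year = risk_per_year / risk_per_rem * 1000
    print(mrem_per_year)    # -> 2.0 and 20.0 mrem per year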

Taking the NAS panel recommendations into consideration, but not fully accepting them, the EPA in August 1999 proposed a standard whose essential stipulation is that for the next 10,000 years the dose to the maximally exposed future individual is not to exceed 15 mrem per year. This may be compared to the dose of roughly 300 mrem per year now received by the average person in the United States from natural radiation, including indoor radon.

Attention to future dangers at the levels represented by any of these three standards can be contrasted to our neglect of much more serious future problems, to say nothing of the manner in which we accept larger tolls today from accidents, pollution, and violent natural events. While we have responsibilities to future generations, the focus should be on avoiding potential disasters, not on guarding people thousands of years hence from insults that are small compared to those that are routine today.

c. Fuel cycle risks

Risks from accidents in the remainder of the fuel cycle, which includes mining, fuel production, and waste transportation, have not attracted as much attention as those from reactor accidents and waste disposal, in part because they manifestly fall into the confined-risk category. Thus, the September 1999 accident at the Tokaimura fuel preparation facility resulted in the exposure of many of the workers, including two cases of possibly fatal exposures. It involved an inexcusable level of ignorance and carelessness and may prove a serious setback to nuclear power in Japan and elsewhere. However, the effects were at a level of harm that is otherwise barely noticed in a world that is accustomed to coal mine accidents, oil rig accidents, and gas explosions. The degree of attention given the accident is a measure of the uniquely strict demands placed on the nuclear industry.

3. Open-ended risks
a. Nuclear weapons proliferation.

The first of the open-ended risks to be considered is that of nuclear weapons proliferation. A commercial nuclear power program might increase this threat in two ways:

  • A country that opts for nuclear weapons will have a head start if it has the people, facilities, and equipment gained from using nuclear power to generate electricity. This concern can explain the U.S. opposition to Russian efforts to help Iran build two nuclear power reactors.
  • A terrorist group might attempt the theft of plutonium from the civilian fuel cycle. Without reprocessing, however, the spent fuel is so highly radioactive that it would be very difficult for any sub-national group to extract the plutonium even if the theft could be accomplished.

To date, the potential case of Iran aside, commercial nuclear power has played little if any role in nuclear weapons proliferation. The long-recognized nuclear weapons states---the United States, the Soviet Union, the United Kingdom, France, and China---each had nuclear weapons before they had electricity from nuclear power. India's weapons program was initially based on plutonium from research reactors and Pakistan's on enriched uranium. The three other countries that currently have nuclear weapons, or are most suspected of recently attempting to gain them, have no civilian nuclear power whatsoever: Israel, Iraq, and North Korea.

On the other side of the coin, the threat of future wars may be diminished if the world is less critically dependent on oil. Competition over oil resources was an important factor in Japan's entry into World War II and in the U.S. military response to Iraq’s invasion of Kuwait. Nuclear energy can contribute to reducing the urgency of such competition, albeit without eliminating it. A more direct hope lies in stringent control and monitoring of nuclear programs, such as attempted by the International Atomic Energy Agency. The United States' voice in the planning of future reactors and fuel cycles and in the shaping of the international nuclear regulatory regime is likely to be stronger if the United States remains a leading player in the development of civilian nuclear power.

In any event, the relinquishment of nuclear power by the United States would not inhibit potential proliferation unless we succeeded in stimulating a broad international taboo against all things nuclear. A comprehensive nuclear taboo is highly unlikely, given the heavy dependence of France, Japan, and others on nuclear power, the importance of radionuclides in medical procedures, and the wide diffusion of nuclear knowledge---to say nothing of the unwillingness of the nuclear weapons states to abandon their own nuclear weapons.

b. Global climate change.
The prospect of global climate change arises largely from the increase in the atmospheric concentration of carbon dioxide that is caused by the combustion of fossil fuels. While the extent of the eventual damage is in dispute, there are authoritative predictions of adverse effects impacting many millions of people due to changes in temperature, rainfall, and sea level. Most governments profess to take these dangers seriously, as do most atmospheric scientists. Under the Kyoto agreements, the United States committed itself to bringing carbon dioxide emissions in the year 2010 to a level that is 7% lower than the 1990 level. Given the 11% increase from 1990 to 1997, this will be a very difficult target to achieve.
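The difficulty is clear from the numbers quoted: meeting a target 7% below the 1990 level from a starting point 11% above it requires a cut of about 16% from 1997 emissions.

# Kyoto arithmetic from the percentages above (1990 emissions normalized to 1).
e1997 = 1.11        # 11% above the 1990 level by 1997
target = 0.93       # 7% below the 1990 level by 2010
print(1 - target / e1997)   # -> ~0.16, a 16% cut from the 1997 level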

Nuclear power is not the only means for reducing CO2 emissions. Conservation can reduce energy use, and renewable energy or fusion could in principle replace fossil fuels. However, the practicality of the necessary enormous expansion of the most promising forms of renewable energy, namely wind and photovoltaic power, has not been firmly established. Additionally, we cannot anticipate the full range of resulting impacts. Fusion is even more speculative, as is the possibility of large-scale carbon sequestration. If restraining the growth of CO2 in the atmosphere warrants a high priority, it is important to take advantage of the contribution that nuclear power can make---a contribution clearly illustrated by French reliance upon nuclear power.

c. Global population growth and energy limits.

The third of the open-ended risks to be considered is the problem of providing sufficient energy for a world population that is growing in numbers and in economic aspirations. The world population was 2.5 billion in 1950, has risen to about 6 billion in 1999, and seems headed to some 10 billion in the next century. This growth will progress in the face of eventual shortages of oil, later of gas, and still later of coal.

The broad problem of resource limitations and rising population is sometimes couched in terms of the "carrying capacity" of the Earth or, alternatively, as the question posed by the title of the 1995 book by Joel Cohen, How Many People Can the Earth Support? As summarized in a broad review by Cohen, recent estimates of this number range from under 2 billion to well over 20 billion, centering around a value of 10 billion.

The limits on world population include material constraints as well as constraints based on ecological, aesthetic or philosophical considerations. Perhaps because they are the easiest to put in "objective terms," most of the stated rationales for a given carrying capacity are based on material constraints, especially on food supply which in turn depends upon arable land area, energy, and water.

Carrying capacity estimates made directly in terms of energy, in papers by David Pimentel et al. and by Gretchen Daily et al., are particularly interesting in the present context as illustrations of the possible implications of a restricted energy supply. Each group concludes that an acceptable sustainable long-term limit to global population is under 2 billion, a much lower limit than given in most other estimates. They both envisage a world in which solar energy is the only sustainable energy source. For example, in the Pimentel paper the authors conclude that a maximum of 35 quads of primary solar energy could be captured each year in the United States which, at one-half the present average per capita U.S. energy consumption rate, would suffice for a population of 200 million. For the world as a whole, the total available energy would be about 200 quads, which Pimentel et al. conclude means that "1 to 2 billion people could be supported living in relative prosperity."
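The chain of arithmetic can be reproduced directly. The present-day per-capita figure of roughly 350 million Btu per year is an assumption consistent with late-1990s U.S. consumption, not a number taken from the paper:

# Pimentel-style carrying-capacity arithmetic. One quad = 10^15 Btu.
QUAD = 1e15
us_solar_quads = 35
us_per_capita = 350e6 / 2          # Btu/person/yr: half the present U.S. rate
print(us_solar_quads * QUAD / us_per_capita / 1e6)    # -> 200 million people

world_solar_quads = 200
# At 100-200 million Btu per person per year ("relative prosperity"):
for per_capita in (200e6, 100e6):
    print(world_solar_quads * QUAD / per_capita / 1e9)  # -> 1.0, 2.0 billion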

One can quarrel with the details of this argument, including the maximum assumed for solar power, but it dramatically illustrates the magnitude of the stakes, and the centrality of energy considerations.

4. Conclusions.
If a serious discussion of the role of nuclear power in the nation's and world's energy future is to resume, it should focus on the crucial issues. Of course, it is important to maintain the excellent safety record of nuclear reactors, to avoid further Tokaimuras, and to develop secure nuclear waste repositories. But here -- considering probabilities and magnitudes together -- the dangers are of a considerably smaller magnitude than those from nuclear weapons, from climate change, and from a mismatch between world population and energy supply.

The most dramatic of the dangers are those from nuclear weapons. However, as discussed above, the implications for nuclear power are ambiguous. For the other major areas, the picture is much clearer. Nuclear power can help to lessen the severity of predicted climate changes and can help ease the energy pressures that will arise as fossil fuel supplies shrink and world population grows. Given the seriousness of the possible consequences of a failure to address these matters effectively, it is an imprudent gamble to let nuclear power atrophy in the hopes that conservation and renewable energy, supplemented perhaps by fusion, will suffice.

It is therefore important to strengthen the foundations upon which a nuclear expansion can be based, so that the expansion can proceed in an orderly manner — if and when it is recognized as necessary. Towards this end, the federal government should increase support for academic and industrial research on nuclear reactors and on the nuclear fuel cycle, adopt reasonable standards for waste disposal at Yucca Mountain, and encourage the construction of prototypes of the next generation of reactors for use here and abroad. Little of this can be done without a change in public attitudes towards nuclear power. Such a change might be forcibly stimulated by a crisis in energy supply. It could also occur if a maverick environmental movement were to take hold, driven by the conclusion that the risks of using nuclear power are less than those of trying to get by without it.

David Bodansky
Department of Physics, Box 351560
University of Washington
Seattle, WA 98195

  1. Joel E. Cohen, How Many People Can the Earth Support? (W.W. Norton & Co., New York, 1995).
  2. David Pimentel et al., "Natural Resources and Optimum Human Population," Population and Environment: A Journal of Interdisciplinary Studies 15, no. 5 (May 1994), 347-369.
  3. Gretchen C. Daily, Anne H. Ehrlich, and Paul R. Ehrlich, "Optimum Human Population Size," Population and Environment: A Journal of Interdisciplinary Studies 15, no. 6 (July 1994), 469-475.

Keeping Up With a Single Monthly Email

One of the best sources of information about issues in Washington that are important to the physics community is the American Institute of Physics "FYI". These 1-2 page summaries of developments in science policy cover everything from legislation affecting scientists to the federal budget, and they are extremely useful in constructing the News section of this newsletter. One can subscribe to FYI and get the summaries via e-mail. For those who don't wish to get an e-mail every couple of days (there are about 180 per year), FYI is offering a new service called "FYI This Month". Subscribers to this service will get a single monthly e-mail message summarizing the developments covered in more depth in "FYI". To subscribe, send the message:

add fyithismonth

to listserv@aip.org. There is no charge for this service.

The Budget - Finally

On November 22nd, seven weeks late, Congress finally passed the budget and adjourned. Although the magnitude of the science budgets had pretty much been agreed on many weeks earlier, the overall appropriations were held up by the usual fall political wrangling. In spite of the fact that the government has dipped into the Social Security surplus for decades, both parties insisted that they weren't going to do so this year. To achieve this, many accounting gimmicks were used. For example, the NIH budget was increased by 15% to $17.9 billion. However, $4 billion of this money can't be spent until September 29, 2000, and thus much of the actual spending will be put off until FY 2001. The difficulties of administering this requirement (and the resulting effects on continuing grants) are left as an exercise for the reader. In addition, the Republican congressional leadership pushed for an across-the-board spending cut, and finally settled for a 0.38% cut. Department heads will be able to decide how to apply this cut within their overall budgets. The savings from this cut is $1 billion (it is recognized that the statistical uncertainty in all budget projections is at least $20 billion). The numbers below do not include this particular cut.

Individual agencies fared as follows: (many more details can be found in FYI)

National Science Foundation---The NSF budget increases by 6.5% to $3.91 billion, with Research and Related Activities up 7.1%, Education and Human Resources up 5.3%, and Major Research Equipment up 5.6%. This is one of the largest increases in recent years, and is above the administration request. NSF Director Colwell praised the conferees, remarking that they "demonstrated extraordinary leadership and a clear understanding of the importance of investing in science and engineering". A detailed breakdown can be found in FYI #149 (www.aip.org/enews/fyi/1999/fyi99.149.cfm).

NASA---The NASA appropriation of $13.65 billion is above the original House, Senate, and Administration requests, but slightly less than the FY1999 appropriation. Space Science, Life and Microgravity Sciences and Applications, and Academic Programs all get more than the Administration requested, while Earth Sciences and the Space Station receive less than requested. The detailed breakdown can be found in FYI #150 (www.aip.org/enews/fyi/1999/fyi99.150.cfm).

DOE---Although most of DOE's physics-related programs saw small increases, the Spallation Neutron Source appropriation was slightly more than half of the $214 million requested. The Council of the American Physical Society, in November, issued a formal statement urging full funding of the SNS.

The Office of Science received a 4.3% increase. The biggest winner was fusion energy, with an increase of over 10%. High energy physics got a 1.6% boost, nuclear physics was up 5%, while Basic Energy Sciences was cut 3.2%.

Public Access to Research Data

Last year, a law was slipped into the Omnibus Appropriations Act which "requires Federal awarding agencies to ensure that all data produced under an award will be made available to the public through the procedures established under the Freedom of Information Act". This touched off a firestorm within the scientific community. As written, it appeared to require that anybody working under an NSF or DOE grant, for example, would have to make public all of their data, even if it had not been analyzed or peer-reviewed. It was up to the Office of Management and Budget (OMB) to implement this law. They received an extraordinary 12,000 comments on how to do so. The regulations have now been issued, and many (but not all) of the concerns of the scientific community have been satisfactorily addressed.

OMB commented: "OMB recognizes the importance of ensuring that the revised Circular (Law) does not interfere with the traditional scientific process... it needs to ensure that the changes do not interfere with cutting-edge science and the benefits that such science provides to the American people. During the revision process, many commenters expressed concern that the statute would compel Federally-funded researchers to work in a 'fishbowl' in which they would be required to reveal the results of their research and their research methods, prematurely... Accordingly, in light of this traditional scientific process, we have not construed the statute as requiring scientists to make research data publicly available while the research is still ongoing."

They then proposed their specific regulations. Only published research findings will be subject to a Freedom of Information Act request. "Research data is defined as the recorded factual material commonly accepted in the scientific community as necessary to validate research findings, but not any of the following: preliminary analyses, drafts of scientific papers, plans for future research, peer reviews or communications with colleagues. This recorded material excludes physical objects (e.g. lab samples). Research data do not include trade secrets, commercial information, personnel and medical information... Published is defined as either when (a) research findings are published in a peer-reviewed scientific or technical journal or (b) a federal agency publicly and officially cites the research findings in support of an agency action that has the force and effect of law."

The above does alleviate many of the concerns of the scientific community. It remains to be seen whether some of the other fears materialize (such as polluting companies paralyzing environmental researchers with a large number of Freedom of Information Act requests). The readers of this newsletter who were among the many who responded are to be commended, as is the American Physical Society, which played an active role in discussions with the OMB.

An Expensive Metric Problem

For decades, scientists have warned that America's stubborn refusal to embrace the metric system would be very costly, but few expected it to destroy a spacecraft. In October, however, the Mars Climate Orbiter (MCO) entered Mars orbit too low, went behind the planet, and was never heard from again. The problem was caused by a confusion over units.

The MCO had thrusters to make minor corrections to its orbit. The effect of sunlight hitting the single, off-center solar panel nudged the spacecraft off course, and the thrusters were needed to make the corrections. The JPL team would tell the Lockheed-Martin team how the trajectory needed correcting, and Lockheed-Martin would tell the navigators how much force was applied by the thrusters. Alas, Lockheed-Martin used units of pounds, and JPL assumed that the figures were in newtons. Thus, JPL navigators concluded that MCO was closer to its planned trajectory than it actually was. The small shifts couldn't be detected because they tended to lie in the plane perpendicular to the line of sight, and thus were difficult to see.
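The scale of the error is easy to see: one pound of force is about 4.45 newtons, so every thruster impulse reported in pounds but read as newtons was understated by that factor. A minimal sketch (the impulse value is illustrative, not mission data):

# The MCO units mixup in miniature.
LBF_TO_N = 4.448                  # newtons per pound-force

reported = 100.0                  # impulse figure delivered in pound-seconds
actual = reported * LBF_TO_N      # newton-seconds the thrusters really applied
assumed = reported                # newton-seconds the navigators believed
print(actual / assumed)           # -> 4.448: each correction understated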

NASA and JPL have used metric for decades, and the use of metric units for MCO is explicitly spelled out in the agreement between JPL and Lockheed-Martin. Unfortunately, some people in the propulsion industry have continued to use English units.

The only bright side of this fiasco is that tens of thousands of physics teachers around the country told their students about this snafu, and some lessons have undoubtedly been learned.

The Comprehensive Test-Ban Treaty Defeated

It was one of the darkest days in the history of the arms control movement. On October 12th, the Senate defeated the Comprehensive Test Ban Treaty (CTBT) by a vote of 48-51 (67 votes were needed). How did this happen?

Two and a half years ago, the APS adopted a statement on the CTBT. It read (in part): "On September 10, 1996 the United Nations overwhelmingly approved the CTBT, a treaty ending all nuclear testing, of any yield, at any location, for all time. The United States, all other declared nuclear weapons states, and a growing majority of the world's nations have now signed that treaty. Although the date at which the CTBT will enter into force is not yet certain, the treaty is of extraordinary importance to the United States and to the future of all humankind.

"The CTBT, the culmination of over 40 years of effort, ends the qualitative arms race among the nuclear states and is central to future efforts to halt the further spread of nuclear weapons. The promise to negotiate and put into force a CTBT was an essential pre-condition to achieving an indefinite extension of the Non-Proliferation Treaty in May 1995.......it is appropriate and imperative that the United States ratify the CTBT at the earliest possible date. The Council (APS) notes that detailed, fully informed technical studies have concluded continued nuclear testing is not required to retain confidence in the safety and reliability of the remaining nuclear weapons in the U.S. stockpile, provided science and technology programs necessary for stockpile stewardship are maintained. This conclusion is also supported by both the senior civilian and military officials responsible for U.S. national security"

When the treaty was submitted to the Senate, it went to the Senate Foreign Relations Committee. The Chairman, Sen. Jesse Helms (R-NC), an ardent foe of all arms control pacts, bottled it up and refused to permit hearings. The Democrats in the Senate urged him and the Republican leader, Sen. Lott, to hold hearings and schedule a ratification vote. There are a number of moderate Republican senators with a great deal of expertise on foreign policy (such as Sens. Warner, Lugar and Domenici), and it was believed that they would be likely, after extensive hearings, to support the treaty and make the required 2/3 majority achievable. But Senator Helms refused to budge. Then came other foreign relations issues, and finally the impeachment trial, and the CTBT sat in committee.

In September, Senator Helms, apparently certain that he had commitments from at least 34 senators to vote against the treaty, announced that hearings would be held and a vote quickly scheduled. Senator Lott, after over a year of insisting that the Senate should not rush to judgment on such an important matter, announced just such a rush. On October 1st, it was announced that a vote would be scheduled in 12 days. There were to be just two days of hearings. The American Geophysical Union and the Seismological Society of America, on October 6th, issued a joint statement declaring that the treaty's proposed monitoring system can be relied upon to detect cheating; 32 American Nobel prize winners in physics signed a letter to the Senate stating that "fully informed technical studies have concluded that continued nuclear testing is not required to retain confidence in the safety, reliability and performance of US nuclear weapons". Yet none of the Nobelists were asked to testify. The strongest treaty supporter to testify, Sid Drell of SLAC, testified before a virtually empty chamber at the end of a long day.

Treaty supporters immediately flocked to Washington, seeking moderate "swing senators", and found that there weren't any. Senator Warner announced (prior to hearings) his opposition to the treaty, Senator Lugar refused to meet with scientists and announced his opposition, and it was clear that the result was already determined.

The White House then tried to delay the vote. Most senators, including the moderate Republicans, knew that voting down the treaty would be a terrible blow to American prestige and leadership in arms control, and that a delay would be far preferable. However, the rules of the debate required unanimous consent to delay the vote, various parliamentary procedures failed, and the treaty was defeated on October 12th.

What next? The CTBT could be brought up again, but it would be necessary for 17 Senators to change their minds. It is highly unlikely that the turnover in the next election (or two, or three) will be sufficient. One possibility was suggested in a recent Op-Ed in the Washington Times by Arnold Kanter and Brent Scowcroft. They noted: "However one judges the merits of the CTBT, no one can deny that both the treaty's defeat in the Senate and the process by which that result was reached have done grave damage... The President and the Congress all share a full measure of responsibility for a debacle which reflected the triumph of partisan wrangling over responsible debate about the national interest. But it would only further compound the harm already done if we were to become preoccupied with CTBT post-mortems... The issues themselves, however, will not simply go away and time alone will not repair the damage. (Indeed, we now face the worst of both worlds: the continuation of a unilateral U.S. moratorium on our own testing without any of the constraints on others which the CTBT would impose). What is needed is an initiative that picks up the pieces and re-establishes U.S. credentials as leader of the community of nations."

Kanter and Scowcroft then propose: "CTBT supporters have argued that the treaty would make a vital contribution to slowing the spread of nuclear weapons. CTBT opponents are skeptical that the proliferators of greatest concern to us would ever ratify the treaty... they are also concerned that we do not know yet whether we can maintain the long term confidence we need to have in our nuclear deterrent without testing, and that it may prove to be too easy to cheat on a nuclear test ban... There is a straightforward way both to provide an opportunity to see whether the promised benefits of the treaty can be realized, and to assess whether the concerns expressed by CTBT opponents are well-founded. It is to renegotiate the CTBT---which is now of unlimited duration---for the SOLE purpose of limiting its initial terms to a fixed period (for example, five years) with the option for renewal for additional fixed periods. This one change would allow time to determine whether the stockpile stewardship program, with its reliance on computer simulations and indirect experiments, will prove sufficient... At the end of its initial term, we would be in a better position to determine whether the CTBT has been effective, whether some of its terms need to be changed... there is good precedent for limiting the duration of the CTBT. The 1970 nuclear Non-Proliferation Treaty was initially subject to review and renewal every five years until 1995... This proposal has many pros and cons, but the simple fact is that there will likely be no CTBT without some changes, and this change might be enough to sway a number of Senators."

Whatever the eventual outcome, the major candidates for President next year are staking out positions on the treaty, which might make arms control a significant campaign issue.

Kansas School Board - A Giant Leap Backwards

The scientific community was shocked and outraged last August when the Kansas Board of Education voted 6-4 to eliminate from its state education standards and assessments evolution, the big bang theory, all discussion of geologic time scales, radioactive dating, and anything that hinted at a Universe older than about 10,000 years.

These standards are supposed to be guidelines, based on the national guidelines put out by the National Academy of Sciences, for Kansas' public schoolteachers. Although they aren't mandatory, they will form the basis for statewide achievement tests starting in the spring of 2001, and teachers will generally "teach to the tests". The tests will then no longer contain references to evolutionary changes between species, the age of the Earth, the big bang theory, etc. Originally, a 27-member committee of teachers and scientists wrote a draft of the science standards, which was published in April; scientists who have looked at this draft generally find it to be an excellent standard for science education. However, the Board of Education replaced this draft with an alternative written by a fundamentalist group.

There was an immediate outcry following the vote. The Republican Governor, Bill Graves, said that the vote was "a terrible, tragic, embarrassing solution to a problem that did not exist". Six university presidents in Kansas warned that "it will set Kansas back a century and give hard-to-find science teachers no choice but to pursue other career fields". National organizations also got involved. The American Geophysical Union quickly put out a statement opposing the action and calling for scientists to get more involved politically. Several software companies let it be known that they were removing Kansas from their short lists for new centers.

The governor and legislators began talking about a constitutional amendment to bring the Board of Education back under the control of the legislature (it is now elected), although that will obviously take some time. But the Board remained firm, and announced that the new standards would go into effect next fall.

In November, the Council of the American Physical Society issued the following statement:

"The American Physical Society views with grave concern the recent Kansas State Board of Education decision to remove references to evolution and the Big Bang from its State Education Standards and Assessments. The decision to modify its previous draft of these standards is a giant step backward and should sound an alarm for every parent, teacher and student in the United States. On the even of the new millennium, at a time when our nation's welfare increasingly depends on science and technology, it has never been more important for all Americans to understand the basic ideas of modern science.

Biological and physical evolution are central to the modern scientific conception of the Universe. There is overwhelming geological and physical evidence that the Earth and Universe are billions of years old and have developed substantially since their origins. Evolution is also a foundation upon which virtually all modern biology rests.

This unfortunate decision will deprive many Kansas students of the opportunity to learn some of the central concepts of modern science."

In addition, the American Association for the Advancement of Science (AAAS) issued a statement on October 15th, which can be found at www.aaas.org, and many other national organizations have endorsed that statement.

So, what is next? The revised standards (as do many state standards nationwide) refer heavily to the science standards publications of the National Research Council, the AAAS and the National Science Teachers Association. These organizations have all formally denied the Kansas board permission to use parts of their publications in the new draft. This means that the new standards cannot be implemented until the Kansas board has removed all of this material, which will be costly and time-consuming. This could delay implementation of the new standards by up to a year.

In the meantime, next year Kansas will conduct the most heavily watched school board election in history. Four of the six Board members who voted for the new standards will be up for reelection, and opponents of the School Board decision will mount strong challenges to these incumbents. A recent poll by the Kansas City Star showed that only 32% of Kansans support the School Board decision, while 52% oppose it (in addition, 81% said that they thought dinosaurs lived millions of years ago). When asked whether they were likely to vote next year, the opponents of the decision were much more likely to say that they would be voting. Three years ago, a similar action by the New Mexico school board resulted in a flurry of activity, with several candidates running against the creationists on the Board (and defeating them); in October the new Board voted 14-1 to adopt a curriculum which includes evolution and explicitly excludes creationism from the science curriculum. Similarly, in Kansas, a growing group of educators, parents, scientists and students has formed the "Kansas Citizens for Science" to plan an education campaign to reverse the decision. The Web site for this new organization is at www.kcfs.org; they have many plans for brochures, outside speakers, mailings targeted to the appropriate school districts, etc., and they encourage all interested parties to join.

AIP/APS Congressional Fellowships

(From FYI#157) As the country approaches a new millennium, the need for scientists who can contribute technical knowledge to the lawmaking process has never been higher. The AIP/APS Congressional Science Fellowships provide a mechanism for making a unique, personal contribution by working as a staffer for a Member of Congress or congressional committee.

The federal government funds about 30% of the nation's R&D, and almost 60% of basic research. Lawmakers are rarely schooled in science and technology, yet their actions influence R&D in major ways, as discussed in these news items. Members of Congress frequently rely on their staffs for scientific and technological know-how in addressing these issues, but those staffers are often no better versed than their bosses. Providing this much-needed expertise is the purpose of the Congressional Science Fellowship programs run by the American Institute of Physics and the APS. The Fellowships enable qualified scientists to spend a year on Capitol Hill, learning about the legislative process while applying their knowledge to science-related policy matters.

Yet Fellowship applications have fallen in recent years; in fact, APS is not sponsoring a Fellow for the 1999-2000 term. AIP and APS need good candidates who want to serve their government by analyzing and contributing to national science policy. The two programs are now accepting applications for the 2000-2001 Fellowships.

While some Science Fellows choose to stay in the policy arena, others return to industry or academia to share what they've learned about lawmaking with scientific colleagues. As Rep. Sherwood Boehlert (R-NY) commented, the science community "in essence doesn't know diddly about shaping public policy". He strongly urged scientists to participate in programs like the Fellowship to learn how Capitol Hill works so they can "help shape policy in the right way".

Those interested in applying should have a PhD in physics or a closely related field. Other requirements include U.S. citizenship and membership in APS or AIP at the time of application. While a Fellow must have the scientific qualifications to be a credible representative of the science community on Capitol Hill, he or she should also have demonstrated an interest in broader societal concerns and the application of science to their solution.

The application deadline is January 15, 2000. Further information on the programs and how to apply can be found on the AIP site at www.aip.org/pubinfo/ or the APS site at www.aps.org/public_affairs/fellow.cfm

Leonard M. Rieser Research Fellowship

The Educational Foundation for Nuclear Science, publisher of the Bulletin of the Atomic Scientists, is pleased to announce the establishment of the Leonard M. Rieser Research Fellowship. The Fellowship honors Leonard Rieser (1922-98), an accomplished physicist, activist for the peaceful resolution of conflict, and professor emeritus of Dartmouth College, who was deeply committed to investing in the ideas and the potential of young people. The Leonard M. Rieser Research Fellowship will afford research and professional development opportunities for undergraduate students by providing them with funds to support unique research projects, internships, or related travel expenses. The Fellowship is targeted toward students seeking to explore emerging or critical issues at the juncture of science, public policy, and international affairs, and is notable because so few opportunities of this kind exist for talented undergraduate students. More details and application information can be accessed at http://www.thebulletin.org/fellowship.html. We ask that you post this announcement at your institution and promote it among students who you feel would benefit from the Fellowship.

The Bible According to Einstein

Jupiter Scientific Publishing
Columbia University Station, P.O. Box 250586
New York, NY 10025

In recent years there has been a resurgence of discussions of the relation between science and religion. We offer as evidence the award of the Templeton Prize to physicist Ian Barbour (and earlier to Paul Davies), and the recent debate between Steven Weinberg and Anglican priest/physicist John Polkinghorne. Into this arena comes a peculiar book, The Bible According to Einstein (1999, Jupiter Scientific Publishing), subtitled A Scientific Complement to the Holy Bible for the Third Millennium, whose author(s) choose to remain anonymous. The dust jacket quotes a reviewer as saying, with not just a little hyperbole, "[It] promises to do for science what the Holy Bible has done for Judeo-Christian religions." The majority of the book is about physics and astronomy, and it appears to be sound, up-to-date, and thorough, at least to the extent that a book at this level can be. The dust jacket carries the praise of two Nobel Laureate physicists (Glashow, Ting) and one Nobel chemist (Seaborg), as reassurance. The parts of the book that are not physics or astronomy are mostly devoted to geological and biological evolution, from the beginnings of life on earth to modern man, and those parts conform to what we believe a majority of workers in those fields would accept (and the dust jacket carries the blessing of a leading anthropologist). If this were all, the book would be a welcome addition to the field of science popularizations, one of the few to attempt such sweeping coverage.

We must, however, examine the text more closely, because its title raises more than one yellow flag of caution, and these cautions are substantiated by its content as well as its unconventional style. First, we object to the abuse of poor Einstein's name and/or image as a marketing gimmick, whether by a local bank, by Apple Computer, or by the author(s) of this book. Despite one reviewer's comment that "...Einstein would have been proud of this book," we doubt that he would have allowed his name in the title of a book he had not written. Certainly there are parts of the text to which he would have objected. In fairness, the author(s) do disclaim any connection with Einstein.

Secondly, there is the word "Bible." The Anonymous Author(s) (AA, which we shall construe as plural) rather disingenuously say that they are using the word "bible" (lower-case) in its original meaning of "a collection of books," yet they ape the Bible (upper-case) in many ways. After all, the stated goal of this book is "to present Nature's laws, as currently understood, in a style and format that is similar to the Holy Bible." The book has two main divisions, which, like the Christian Bible, are called "New Testament" and "Old Testament." Many of the chapters have titles like Genesis, Exodus, Deuteronomy, etc. Numerous passages are related to the Bible, or at least to religion, and this forces us to scrutinize the work in that context. In the preface, AA liken religion and science to "oil and vinegar" which do not mix, yet they attempt to blend the two ingredients into a "palatable mixture." The result, however, may not be to everyone's taste. Furthermore, they describe science and religion in ways so parallel that the reader may be left with the feeling that science is a religion, with "Nature" substituting for a deity and the word "natural" for "holy." Indeed, AA say "...science operates to a large extent on faith." Physical laws are described as "dogmas." This view, we feel, is dangerous, fueling both the social constructivists' side of the current controversy on the nature of science and the religious fundamentalists' arguments against the theory of evolution as a faith-based "mere theory," and against Secular Humanism as another religion.

In BAE the "New Testament" comes before the "Old Testament" (although AA say that in a future edition they may reverse the order). The main focus of this part is to present current knowledge of physics, from quarks and superstrings to macroscopic objects, from Newton's Laws to general relativity and quantum mechanics, to the big-bang (including inflation and dark matter, but not the cosmological constant). There are three main sections: "The Books Containing the Chronicles," "The Books of Physics" (followed by two short chapters on chemistry and biology), and "The Books of the Solar System". It begins, however, somewhat mysteriously with "The Twenty Second Book of Creation and the First Book of the New Testament, called Homogenesis." (The Twenty-first Book is located deep within the Old Testament portion.) "Homogenesis" traces the evolution of humankind from the australopithecines to homo sapiens sapiens. The following Books of Physics and Books of the Solar System treat in considerable detail our present understanding of these subjects, and apart from style, are well done. The Books Containing the Chronicles however is a mish-mash of biographical sketches (of Newton, Darwin, and Einstein, as well as of four religious leaders: Moses, Buddha, Jesus, and Muhammed), together with discussions of units, histories of electromagnetism and elementary particles, tables of quarks, an accounting of many natural catastrophes, etc.

The sketch of Moses we found especially troublesome. Based on the Biblical account, AA attempt to give "scientifically plausible explanations" of Moses' magic and of the ten plagues upon Egypt. This treatment echoes the efforts of the Falasifa, the Islamic and Jewish scholars of Golden-Era Spain, and their intellectual heirs, the medieval Scholastics of Western Europe. Both groups attempted to reconcile Aristotelian science with the texts of Divine wisdom. BAE treats Moses as a scientist: "...he was trained in science and taught of natural phenomena" [italics theirs]. He is a conveyor of scientific knowledge to the Hebrews: "...he explained how Nature worked. And the Hebrews became the wisest people anywhere." But at the same time, in this effort, AA have reduced Moses to a deceitful trickster, who uses his scientific knowledge (which he learned from the Egyptians) to fool the Pharaoh. AA portray Moses as no more powerful than Pharaoh's magicians (who are apparently unaware of the very knowledge the Egyptians imparted to Moses). Like the Connecticut Yankee in King Arthur's Court, Moses knows that a solar eclipse is to occur, and threatens the Pharaoh with the plague of darkness. More disturbingly, AA have Moses and his followers purposefully poisoning the cattle in the fields, collecting rat lice to plague the populace, and finally, spreading "a disease like small-pox" to kill many of the innocent of Egypt. Moses has become a terrorist! This is all too close to the medieval anti-Semitic canards of Jews poisoning the wells and bringing on the bubonic plague. It should be noted that, by contrast, Jesus' "miracles" are accepted uncritically: he raises the dead, walks on water, etc. He is clearly not presented as a scientist-magician. Unlike Moses, Jesus is allowed to speak to God, and much of the treatment has a flavor of New Ageism. (More New Ageism can be found in the "must read" last chapter called "The Last Commandment.") Throughout their treatments of Moses and Jesus, AA make gratuitous additions to the stories which are without apparent purpose (Moses is "baptised" by his mother's tear, Jesus builds three tabernacles, etc.). We leave it to others to comment on the treatments of Buddha and Muhammed, although it should be mentioned that AA accept for Muhammed (like Jesus) what they deny to Moses -- that he spoke with God. We would, however, recommend that AA delete all four of these biographical sketches in any future edition, both to avoid controversy and gratuitous offense and because they are not relevant to the avowed purpose of the book.
The "Old Testament" half of BAE is a straightforward chronological history. It begins at the Planck epoch, progresses through to the formation of the Earth, the appearance of life, (where it loops back to the opening of the New Testament part), the geological ages, etc.

Written in a kind of imitation King James style, with some degree of poetry but without the majesty, BAE becomes tedious to read at length, since the reader is soon saturated with sentences beginning with "Now..." or "And..." and sprinkled liberally with "...shall be...". One example chosen randomly from several hundred (p. 178) should suffice: "Now there shall be three fundamental laws of classical mechanics, known as Newton's Laws of motion. And they shall govern the movement of macroscopic bodies." This style, coupled with the prefatory remarks cited previously, has the tendency to make the reader feel that what is presented is revealed truth, and not the result of centuries of labor by countless minions of science.

All in all, it is regrettable that this highly pretentious yet remarkably complete survey of science at the end of the millennium cannot be recommended without many caveats.

Michael Lieber, Dept. of Physics,
University of Arkansas; Fayetteville, AR 72701
Rabbi Laura Lieber, Divinity School, University of Chicago

Beyond Malthus: Nineteen Dimensions of the Population Challenge

by Lester R. Brown, Gary Gardner, and Brian Halweil, Norton, New York, 1999, 167 pages, $13.00. ISBN 0-393-31906-7.

This review was published in the Teachers Clearinghouse for Science and Society Education Newsletter, in the Fall 1999 issue, and is reprinted here in condensed form by permission.

This slim volume, in the Worldwatch Environmental Alert Series, examines and provides the latest data on factors, or "dimensions," affected by the insensate growth of the world's population. It is a reasoned, detailed, sobering presentation.

"The Population Challenge," first of the book's 21 short chapters, discusses the "demographic fatigue" of many countries. The pace of population growth staggers the imagination. More people were born since 1950 than in the preceding millions of years since our ancestors stood upright. In the next 50 years another 2.8 billion people will be added. The UN has reduced its predictions of population twice in the last ten years, not because of a decline in births, but because of an increase in mortality. Thirty-two countries, mostly European, have achieved population stability. Another thirty-nine (including China and the U.S.) have achieved replacement level, but the population will continue to grow because of their young average age and (in the U.S.) immigration. Another 118 may double or triple before 2050, although population growth will be cut short for many of these because of increasing deaths caused by resource and other shortages.

The book examines 19 environmental and social dimensions that interact with population growth, any one of which could trigger a "demographic train wreck." Here are a few of these dimensions.

Grains: Since 1984 the grain harvest per person has dropped by 9%, owing to a lack of new land and slower growth in fertilizer use and irrigation. A poor country such as India uses less than 200 kilograms of grain per person per year, virtually all of it consumed directly. The U.S. uses 900 kilograms, mostly to produce meat and dairy products. The question "how many people can the Earth support?" cannot be answered without specifying the level of consumption. If the world grain harvest were expanded to two billion tons, it would support 10 billion Indians or 2.2 billion Americans.
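
The arithmetic behind that last claim is simple; as a rough check, using the consumption levels quoted above (two billion tons is 2 x 10^12 kg):

\[
\frac{2\times10^{12}\ \mathrm{kg}}{200\ \mathrm{kg/person}} = 10\ \text{billion people},
\qquad
\frac{2\times10^{12}\ \mathrm{kg}}{900\ \mathrm{kg/person}} \approx 2.2\ \text{billion people}.
\]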

Water: Spreading water scarcity may be the most underrated resource issue in the world today. The authors predict that the amount of available water per person will fall by 73% between 1950 and 2050! Water tables are being depleted on every continent, and aquifers are being emptied. As water is rationed, agriculture will not be favored, since 1000 tons of water can produce either one ton of wheat worth $200 or industrial output worth $14,000. As industrial and urban demands for water rise, countries typically cut back irrigation water and import grain instead. In 2050 a billion people will be living in countries facing absolute water scarcity--but this means that we also face a future of food scarcity!
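
Put in terms of the economic return per ton of water (using the authors' figures), the pressure to shift water away from agriculture is easy to see:

\[
\frac{\$200}{1000\ \mathrm{t}} = \$0.20/\mathrm{t}\ \text{(wheat)}
\qquad\text{versus}\qquad
\frac{\$14{,}000}{1000\ \mathrm{t}} = \$14/\mathrm{t}\ \text{(industry)},
\]

a seventy-fold difference.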

Biodiversity: We live in the greatest extinction period since the disappearance of the dinosaurs. The reasons for the extinctions are all a function of human activities.

Energy: Energy use is increasing twice as fast as population, a disparity that will continue as developing nations try to emulate the industrialized nations. Energy consumption of developing countries will increase by 336%, while that of the industrial nations will double despite a population decrease. Per capita, the U.S. uses twice as much energy as other industrial nations and 13 times as much as developing nations.

Fish: Between 1950 and 1988 the oceanic fish catch increased from 19 to 88 million tons, a per capita increase from 8 to 17 kg. Overfishing has now become the rule: of 15 major oceanic fisheries, 11 are in decline. The cod catch has dropped 70%, and Canada and the U.S. have had to curtail their fishing fleets. The competition led to more than 100 international disputes in 1997 alone. The only solution in sight is aquaculture, which increased from 7 to 29 million tons during 1984-1997. But aquaculture competes with livestock and poultry for grain, soybeans, and fishmeal.
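
As a consistency check on those figures (total catch divided by per capita catch gives the implied world population):

\[
\frac{19\times10^{9}\ \mathrm{kg}}{8\ \mathrm{kg/person}} \approx 2.4\ \text{billion (1950)},
\qquad
\frac{88\times10^{9}\ \mathrm{kg}}{17\ \mathrm{kg/person}} \approx 5.2\ \text{billion (1988)},
\]

in good agreement with the actual world population in those years.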

Infectious Diseases: Steadily increasing urbanization in developing nations is accompanied by increased exposure to disease. The basic problem is urban overcrowding. Human activities linked to population growth, such as forest clearing and dam building, favor disease vectors. In many countries, most of the population lacks access to basic health care.

Cropland: In fast-growing countries, per capita cropland is diminishing and food self-sufficiency will soon be impossible. Since 1950, global grain area has grown 19% while population has grown 132%! In crowded industrial countries such as Japan and Taiwan, the per capita grain area is smaller than that of a tennis court, driving these countries to import 70% of their grain. And population growth itself reduces cropland productivity and removes cropland from production.
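
Those two growth rates together imply that per capita grain area has roughly halved since 1950:

\[
\frac{1.19}{2.32} \approx 0.51 .
\]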

Forests: 75% of forest losses have occurred in this century, as has 75% of the population growth. The forest loss is correlated with rising per capita consumption. Global paper consumption, 63% of it from Europe, Japan, and North America, has nearly tripled since 1961. Global forest product use is near or beyond sustainability, with enormous consequences for greenhouse warming and erosion control.

Climate change: All major scientific bodies accept that the world is warming due to greenhouse gas buildup in the atmosphere. Possible effects include more intense heat waves, more severe droughts, more violent storms, and more forest fires. The range of tropical diseases will be extended. Although the industrial nations are presently the chief emitters, developing countries will catch up by 2020.

Conflict: Conflict within and among nations stems primarily from competition over shared natural resources, especially water. As population increases, so will the number of people who face resource scarcity, and therefore the potential for conflict.

Nine other population-related dimensions covered in the book but not included here due to space constraints are: materials, urbanization, protected natural areas, education, waste, meat production, housing, jobs, and income.

In their conclusion the authors wonder whether the expected population increases can actually materialize. Because of "demographic fatigue," countries are unable to respond to infectious diseases, aquifer depletion, deforestation, and the like. Thus, countries might reach population stability or decline because of rising death rates--an undesirable way to solve the population problem!

As governments' inability to handle social stresses becomes more evident, religious, ethnic, and tribal differences will be exacerbated, and conflicts will result. In all this, it is important to remember that our world is more environmentally and economically interdependent than ever. There are no longer "their problems." There are only "our problems."

As can be seen from this review, the book examines every major aspect of world population growth in very succinct fashion. We can only hope that the myriad problems facing both developing and industrialized countries can be tackled successfully--in time!

Irma S. Jarcho
The Trevor Day School
1 West 88th St., New York, NY 10024

Which World? Scenarios for the 21st Century

by Allen Hammond, Island Press/ Shearwater Books, Washington, DC, 1998

Hardcover, about $17 if purchased from amazon.com.

Allen Hammond is senior scientist and director of strategic analysis at the World Resources Institute and a prolific author. In this insightful and challenging book about likely futures he condenses the results of Project 2050, a 5-year international and multi-disciplinary "visioning" study co-sponsored by the World Resources Institute, the Brookings Institution, and the Santa Fe Institute, and coordinated by the Stockholm Environmental Institute. The book acknowledges contributions from numerous study participants, and is copiously referenced and annotated for readers interested in delving in greater depth into its assumptions, projections, analysis, or findings. On the verge of the new millennium, futurism and strategic planning through scenario-building are a fashionable and timely endeavor, both for private business planning and for federal agencies responding to Government Performance and Results Act (GPRA) requirements.

After a brief introduction in Part I, describing the scenario-building process, its value and limitations, the book sets up in Part II the three future world scenarios considered most plausible. These are (1) Market World, ruled by free-market economics, in which prosperity for all is the promise of progress; (2) Fortress World, marred by conflicts and instability, in which secure boundaries and deterrence maintain an unstable equilibrium; and (3) Transformed World, a utopian fusion of new-wave capitalism, democracy, and prosperity, enabled by instant communication via a GlobalNet and by enlightened technological progress.

These scenarios differ drastically in the degree of realism, optimism, or pessimism underlying their projections. They also differ in the extent to which the distributed benefits of technology-based economic growth, rational social engineering, and institutional progress prevail over environmental neglect, poverty, social instability and factionalism. The scenarios are admittedly oversimplified, and the author acknowledges that our world is a very inhomogeneous, unpredictable and complex system. Yet, this is a useful attempt to structure and discipline unfettered futurism and to capture some general traits of the world half a century hence.

It is not until Part III that the trends critical to shaping an uncertain future are discussed. Part IV explores regionally-specific characteristics, the inertia in social and governance traditions, and culturally dictated choices. Only in the final chapter are the daunting challenges of "Choosing our future" posited.

As a strategic planner, I found the book interesting and informative, but rather simplistic and predictable in extrapolating recent trends into the far future. In my view, the book is an inverted pyramid: I would have preferred to turn the top-down chapter order around, so as to build a more compelling case for the three alternative worlds from the ground up. The forecast would have been more convincing had it started with projected trends and with social, cultural, economic, and other regional differentiators, and then shown how they lead to the three common world scenarios. What is really missing is some sort of decision tree leading to actionable choices, which are implied but never made explicit by the author.

Many questions came to my mind and remained unanswered in reading the book: Is a homogeneous "global village" a bane or a boon? Will there always be winners and losers, rich and poor countries, or will there be a more uniform distribution of wealth across nations? Since there is no world governance, how can there be consensus on a new world order? How can we optimize a system currently sub-optimized even at the local level? Even if we perceive and agree on the desired final state of the world in 2050, how do we get from here to there, manage the change, and overcome transition turbulence? Can the world we live in evolve smoothly from nationalism and factional rivalries and forsake religious extremism for an enlightened self-governance? Will the UN, the World Bank, and other global entities play a role, and if so how might they evolve? How can the identified positives (the information age, the "greening" of global corporations, effective and militant citizens groups, and mega-philanthropists) turn our future into a better place? What choices and decision points exist, and for whom?

The best thing about this thought-provoking book is that it gives the reader an opportunity to get on the Internet to browse trend data, regional scenarios and sources, and to interactively exchange views on "What will the world be like in 2050?" in a well-organized "HyperForum on Long Term Sustainability," maintained by Caltech at http://www.hf.caltech.edu/whichworld, with a portion on "Which World" at http://mars3.gps.caltech.edu/hf/b3.

Dr. Aviva Brecher
Transportation Strategic Planning and Analysis Office
John A Volpe National Transportation Systems Center, Cambridge, MA 02142

American Association Of Physics Teachers Statement On The Teaching Of Evolution And Cosmology

The following statement was adopted by the Executive Board of the American Association of Physics Teachers at its meeting in College Park, Maryland on 16 October, 1999.

The Executive Board of the American Association of Physics Teachers is dismayed at the action taken by the Kansas State Board of Education to eliminate the most significant portions of the subjects of evolution and cosmology from the science standards which define educational objectives in the state.

Evolution and cosmology represent two of the most sweeping and unifying concepts of modern science. There are few scientific facts more firmly supported by observations than these: Biological evolution has occurred and new species have arisen over time, life on Earth originated more than a billion years ago, and most stars are at least several billion years old. The overwhelming evidence comes from so many and diverse sources - biological knowledge of the structure and function of DNA, geological examination of rocks, paleontological studies of fossils, telescopic observations of distant stars and galaxies - that no serious scientist questions these claims; we do our children a grave disservice if we remove from their education familiarity with the evidence and the conclusions. The "Big Bang" theory of the origin of the universe is, to be sure, not quite so firmly established, and some scientists still consider alternatives. Here, too, however, the framework for skepticism and challenge is examination of scientific observations and the proposal of testable alternatives, not simply the rejection of the conclusions that have been reached by most scientists.

No scientific theory, no matter how strongly supported by available evidence, is final and unchallengeable; any good theory is always exposed to the possibility of being overthrown by new observational evidence. That is at the very heart of the process of true science. But to deny children exposure to the evidence in support of biological evolution and of cosmology is akin to teaching them that atoms do not exist, that the Sun goes around the Earth, and that the planet Jupiter has no moons.

The Kansas State Board of Education has a responsibility to ensure that all Kansas children receive a good education in science. The American Association of Physics Teachers urges the Kansas Board to rescind its action which removes a significant portion of good science from the Kansas standards for science education.

American Association of Physics Teachers

"Evidence puts Dolphin in New Light as Killers"

The Science Times (New York Times, July 6, 1999) blared out this headline, and more: "Smiling mammals possess unexplained darker side" and "wild dolphins are seen as friendly, but swimmers report being bitten and bumped." Then, to add insult to injury, the NYT/Science tells us that the Dolphins are not rocket scientists but rather D-minus students, stating: "Language use is unlikely" and that, compared to man's great left-right neural brain, "the two are not comparable in areas like problem solving." This is demoralizing news. I was one of those who believed that it was the Dolphins who brought the Berlin Wall down, scrapped 1/2 of the SS-18s and all Minuteman-IIs, and created an inspection regime of 13 parts, including counting RVs. As most of you know, Leo Szilard wrote "The Voice of the Dolphins," which showed the way for the Dolphins to do their good work. And now we are told that the Dolphins are neither "nice" nor "bright." Did Szilard miss the character flaws of the Dolphins? Should the Forum's Szilard Award be represented by a different symbol? What should we do? If not a Dolphin, then a ladder, or a teddy bear? But then it came to me. Szilard was very bright indeed. He knew about Darwin's "On the Origin of Species by Means of Natural Selection." He knew that for the fittest to survive, they had to knock heads and have a tad of aggression. He also knew that the dolphins aren't rocket scientists or string theorists. Szilard knew that the Dolphins could not instruct the communist and capitalist scientists in the ways of RVOSIs (re-entry vehicle on-site inspections). Thus the common view--that the Dolphins are very bright and mighty nice, and that with these attributes they saved society from nuclear Armageddon--is wrong. We need a new theory, law, axiom, or paradigm to carry us beyond this catastrophe.

Here is THE alternative explanation of the truth, which I am sure Szilard understood but didn't tell anyone, not even his wife Trudy. Szilard knew that both the communist and capitalist societies sometimes lose their way in political-macho debates. Szilard knew that 30,000 + 40,000 = 70,000 total nuclear warheads at one time, and 2 x 10,000 targets, were beyond comprehension. Szilard knew that he had to build up a false image of the Dolphin and place it in the spotlight. Such a paper or straw Dolphin allowed the dedicated peaceful scientists quiet time to work behind the scenes on the real RVOSI plan. And that, Ladies and Gentlemen, is the truth. Good old Leo S.--he knew that the false image of brilliant, nice Dolphins would be the "fig leaf" behind which those good old START negotiators could do their handiwork. So my conclusion is that the Dolphin should be praised even more, not less. We should realize that the Dolphins agreed to play this role, demeaning their dignity as honest dolphins, so that the Berlin and US/USSR walls could come tumbling down. So let's keep that shiny Dolphin as the symbol for the Szilard Lectureship, since we know that the dolphins gave so much to all of us.

David Hafemeister,
Physics Department, Cal Poly State University