Archived Newsletters

Do Lay Science Books Help Lay Science Knowledge?

I read with great personal interest Al Saperstein's review of Science and Religion: An Evolutionary Perspective, by Walter Isard, in Physics and Society, Vol. 27, No. 2. I am personally acquainted with people who have been seriously misled by such books. Usually, a superficial resemblance between physics and other disciplines is presented as profound, or a superficial explanation is presented as complete. I believe there may be serious consequences which should be investigated.

As anecdotal evidence I offer the following: At least one person has wasted 20 years believing that the derivative of t with respect to t proves that time goes in one direction because the answer is +1. A Unitarian minister believes that all elementary particles are made of love; he presented this to his congregation and to the larger community in an interview on local television news. The worst example was the Kentucky Department of Education's presentation of Margaret Wheatley's book, Leadership and the New Science, as a source of insight into the latest developments in science and their application to leadership issues. This book implies, among other things, that an experimenter can choose the outcome of experiments in quantum mechanics (in the Schrödinger cat example). I took the initiative and debunked the KY DOE material in a formal presentation to about 100 people, but the material had probably been recommended to 10 to 50 times that number of professional educators and parents in the state.

So I want to challenge the Editors to take seriously the objective of the APS: "... the advancement and diffusion of the knowledge of physics." Let's find out if popularizations are working. Create and fund a group to formally study the knowledge of physics and its presentation to the public. Let's make real measurements. Who reads these books? What do they get from them? Is there a noticeable improvement in their knowledge after exposure? Is there a measurable change in their behavior after exposure? What needs to be done to effectively communicate to the public? What does it mean to be educated well enough in science to participate meaningfully in our society? Do these books help or harm the public in a democracy like ours?
I have seen investigations of whether people know what Evolution and the Big Bang mean. I want to know if they understand enough about their car or furnace to evaluate claims made by sales persons or service representatives. Do they understand that physics has something fundamental to say about biology and that biological organisms, including humans, are subject to physical laws?

Robert M. Riehemann
Cincinnati Bell Information Systems

Editor's Comment (MS): Editors do not do studies; individual members do; POPA does. We have often wondered what impact books such as those relating quantum mechanics to Eastern mysticism have on the public understanding of physics; it may be significant and detrimental. So studies by our readers are very welcome. But what can we do about such books, aside from some sort of APS rating system (after all, there is the First Amendment to worry about)?

Low Level Radioactive Waste Repositories

In the April issue of Physics & Society, John Ahearne suggests that a solution to the problem of siting Low Level Radioactive Waste (LLW) repositories might be not to build them for nuclear reactor wastes. In coming to this solution John quotes the anti-nuclear opposition, and wonders whether they would then allow repositories to be built for non-reactor radioactive wastes.

Is Dr. Ahearne too narrow in his approach? Would not a broader solution be to shut down all nuclear plants and outlaw nuclear power? And with organizations such as Greenpeace and the Sierra Club, and non-technically-knowledgeable politicians such as Senator Barbara Boxer, whom Ahearne quotes, we could go even further in the energy field by supporting them in eliminating hydroelectric dams and fossil fuel burning. With one exception we would then have solved the world's energy problems and would likely get the support of the antis for living in a wonderful world of conservation, solar, and wind power.

Of course, the one exception would be the deaths and devastation in the world due to lack of needed energy!
We in the technical community have done a poor job of bringing technological information on energy to the public in a sound manner. All technologies from gas lighting to steam heat to electricity have had problems, and have scared the public at their initiation. But with time, the public was educated about their risks and benefits. In the case of nuclear energy we haven't done this. Even our President and Vice President appear ignorant when they express great concern over global warming while ignoring, or opposing, the only available solution - a major expansion of nuclear energy. It might in passing be noted that nuclear energy, meeting U.S. and Western standards (Chernobyl did not), has not harmed a single member of the public.

The Ward Valley LLW repository has been studied by California State organizations, by federal organizations, and by the National Research Council in a 15-month study. Its safety and suitability have been upheld. And indeed, if one reads the articles on low level radiation in the same Physics and Society issue as Ahearne's article, one can conclude that Ward Valley risks may be a thousand or more times smaller than the negligible risks projected by the past studies.

If this nation surrenders to the irrational objectives of the anti-technology, anti-nuclear organizations, we will be failing our generations to come. This is an irresponsible course. We have reached the point where those with scientific and technological knowledge and expertise, and their organizations, should work aggressively to provide lawmakers and the public with the information needed to allow them to choose a sound, vital energy course. On our present course we threaten the future of our nation's children, and the welfare of the world.

Bertram Wolfe
Retired VP & General Manager of GE Nuclear Energy
15453 Via Vaquero, Monte Sereno, CA 95030
Phone and Fax: (408) 395-9039

John Ahearne suggests that opposition to development of new low-level radioactive waste disposal facilities might be muted if use of these facilities were restricted to non-reactor wastes. ("A Partial Solution to LLW Siting?", Physics and Society, April 1998.) Ahearne writes, "...it would be useful to find out if those objecting to current siting do so because of opposition to nuclear power or because of fear of radioactivity of any kind."

Exploration of opposition attitudes for the purpose suggested by Mr. Ahearne would cost years of additional delay while implementation of state and federal laws was held in abeyance. And for what?

Opposition groups act on the theory that if they can block safe disposal of low-level waste, they can block all uses of radioactive materials. The opposition to the proposed Ward Valley disposal facility in California, and to proposed disposal facilities in other states, is driven by ideological opposition to any beneficial use of radioactivity by society - not by a concern for safe disposal of the waste byproducts. Some opposition groups admit to this objective. For example, Greenpeace, in a solicitation letter from its Executive Director in the spring of 1995: "There is only one answer -- turn off the faucet; stop producing nuclear waste. Period."

The Washington, DC-based Nuclear Information Resource Service, in a fundraising letter from its Executive Director dated October 1994: "We've helped stop every proposed radioactive waste dump for the past several years. Federal law said that every state should have had a dump to send its waste to by December 31, 1992. Federal law was wrong and grassroots activists were right. No new dumps are operating, and, as long as radioactive waste generation continues, forever exacerbating the problem, we will work to ensure that no new dumps ever operate."

And Greenpeace, again, in a handout distributed while picketing a Cal Rad conference in San Diego in 1992: "Developing alternatives to radionuclides currently used in medicine and industry must be a priority."

Actions also reveal motivations. Groups opposed to the Ward Valley project tried to convince California's Courts that the California Department of Health Services followed inadequate procedures when licensing organizations to use radioactive materials. Fortunately, the Courts rejected this argument. Had these efforts succeeded, not only would the Ward Valley license have been invalidated but so would over 2,000 radioactive materials licenses in California for hospitals, universities, industries, etc.

Opposition groups might profess agreement with the Ahearne proposal. But if it were ever seriously pursued, we would soon hear demands for yet another restriction based on the wastes' institutional pedigree -- creating another category of orphan waste.

Mr. Ahearne eloquently presents sound arguments against his own proposal! It enables a public policy issue -- supposedly settled by Congress, state legislatures, and federal and state regulatory agencies -- to be dodged. It increases disposal costs for the non-reactor users of radioactive materials (universities, industries, medical centers, etc.). And it merely postpones decisions about disposal of the LLW generated by reactors.

The laws on the books are more than adequate; it is political will that is in short supply. We should demand that federal and state officials have the courage to "take care that the laws be faithfully executed."

Alan Pasternak
Technical Director, Cal Rad Forum
P.O. Box 1638, Lafayette, CA 94549-1638
(510)283-5210; FAX 283-5219

Editor's General Comment (ES): Scientists and the technologically literate do themselves a disservice when, in their attempts to educate the public and enter the public debate on issues such as nuclear energy, they disparage the public for its very real fears. Whether the fears are based on facts, misinformation, or misunderstanding, they cannot be erased by assertions which might be interpreted as implying the stupidity of the people who hold and promote them. Perhaps compromise does not provide the most elegant solutions, but proof by assertion is similarly flawed.

Wanted: Interesting Memorabilia

At the March 1999 Centennial meeting of the APS, the Forum on Physics and Society will have a display reminiscing over our actions of the past quarter century. The Forum has done much to be proud of, and we would like to tell the world (or at least the APS). We would appreciate your sending interesting photos or other items from our sessions, short courses, P&S articles, and Forum books to D. Hafemeister, Physics Dept., California Polytechnic State University, San Luis Obispo, CA 93407. Thanks.

David Hafemeister

Burton-forum And Szilard Award Nominations

Nominations for the Burton-Forum Award (formerly the Forum Award) and the Szilard Award are due July 1, 1998. The call for nominations has gone out to the general membership but has not yet resulted in any nominations. I would like FPS, which has been closer to the activities for which the awards were created, to think of possible nominations. Please send me any that you generate at the address below.

To remind everyone:

The Burton-Forum Award is to recognize outstanding contributions to the public understanding or resolution of issues involving the interface of physics and society. Previous winners can be found at http://www.aps.org/units/fps/fawards.html

The Szilard Award is to recognize outstanding accomplishments by physicists in promoting the use of physics for the benefit of society in such areas as the environment, arms control, and science policy. Previous winners can be found at http://www.aps.org/units/fps/szilard.html

Looking forward to some good nominations. Thanks!

Mark Sakitt
Department of Advanced Technology
Building 197C
Brookhaven National Laboratory
Upton, N.Y. 11793-5000
Phone (516) 344-3812
Fax (516) 344-5266

Members of the Forum are also reminded to think about, and submit, nominations for Fellows of the APS via our Forum. Contact our Fellowship Chair, Priscilla Auchincloss, at the Univ. of Rochester, BITNET: PSA@UORHEP.

armd@physics.wm.edu

Articles

The "cold war" is now officially over - what happens to physics now?. While it was "on", it provided great professional challenges, and ample economic rewards, to the physics community - on both sides. The major investments in science also had great impact upon the general society, as sketched in the recent talk by Burton Richter, reprinted below. But, in spite of its apparent growing disinterest in science (and increasing fascination with pseudo-science) society still needs us: there are important, challenging problems which society needs us to address. Some are the result of past "hot" wars - as illustrated by the Kosta Tsipis article on the clearing of land mines (recently a major item in the news due to U.S. refusal to sign a treaty banning such mines). Some utilize the competencies and tools we have developed while waging the "cold war", e.g., the piece by Goldstone, et al, sketching the use of weapons lab computer facilities to address major environmental questions. And some represent the continuation of "cold war attitudes", as illustrated by the secrecy article by Aftergood. This latter problem has recently manifested itself, in the pages of this journal, via the unresolved dispute between Miller and von Hippel - claiming that useable nuclear weapons can be made from reactor-grade plutonium, which thus should limit its civil/commercial use - and DeVolpe - arguing that such materials are not a source of significant weapons and hence their commercial use presents little proliferation threat. Until classified files on this matter are opened, the debate - important for our future energy/environmental concerns - is a dialogue of the deaf.

(Editor's Note: The following piece is a record of the remarks made by Burton Richter, Director of the Stanford Linear Accelerator Center, and past president of The American Physical Society (1994), to the Senate Forum on "Research as an Investment.")

Long-Term Research and Its Impact on Society

Burton Richter

It is a privilege to participate with this distinguished group in this forum on Research as an Investment. My perspective is that of a physicist who has done research in the university, has directed a large laboratory involved in a spectrum of research and technology development, has been involved with industries large and small, and has some experience in the interaction of science, government and industry.

Science has been in a relatively privileged position since the end of World War II. Support by the government has been generous, and those of us whose careers have spanned the period since World War II have, until recently, seen research funding increase in real terms. Support for long-term research really rested on two assumptions: science would improve the lives of the citizens, and science would make us secure in a world that seemed very dangerous because of the US/USSR confrontation. The world situation has changed radically, both politically and economically. The USSR is no more, and economic concerns loomed much larger as our deficit grew and as our economic rivals became much stronger. It is therefore no coincidence that federal support for long-term research peaked in the late 1980's (according to the National Science Foundation's science and engineering indicators) and only biomedical research has grown in real terms since that time.

The emphasis of this forum, the economic value of public investment in long-term research, looks at only one of the many dimensions in the impact of research on our society. In examining that dimension, it is important to understand the time scale involved. Product development, the province of industry, takes technology and turns it into things which are used in the society. Typically, these days, the product development cycle runs for three to five years. However, the research that lies behind the technologies incorporated by industry into new products almost always lies much further back in time--twenty or more years. I'd like to make four brief points and illustrate them with a few examples:

  • Today's high-tech industry is based on the research of yesterday.
  • Tomorrow's high tech industry will be based on the research of today.
  • The sciences are coupled--progress in one area usually requires supporting work from other areas.
  • Federal support for research has paid off and will be even more important in the future.

Technological Innovation in Humanitarian Demining

Kosta Tsipis

Introduction
In 1993 the U.S. State Department publication "The Hidden Killers, The Global Landmine Crisis" pointed out that there are about 120 million landmines still buried in 62 countries, potentially lethal remnants of armed conflicts over the past half century. Even though the combatants in these wars did not generally intend to harm the civilian population, abandoned landmines now kill or maim about 30,000 persons globally every year, 80 percent of them civilians.

Four hundred million mines were deployed between 1935 and 1996; of these, 65 million were emplaced during the last 25 years. Various nations currently manufacture 7.5 million mines annually. In 1993 alone, according to U.N. estimates, 2-5 million mines were laid. In the same year, only 80,000 were removed. It is estimated that 100 million mines are currently stockpiled around the world ready for use.

Mines are durable objects that can remain active for decades. They are manufactured in large numbers by many nations including the U.S., Russia, China, Italy, and a hundred other suppliers. Their cost varies mostly between $3 and $15 each; a few may cost as much as $50 each. A mine usually consists of a casing (metallic, plastic, or even wood); the detonator, booster and main charge; the fuse; and sensors that range from a simple pressure plate or a trip wire to more sophisticated triggers: a collapsing circuit, pressure-distorted optical fiber, pneumatic pressure fuse, or various influence sensors -- acoustic, seismic, magnetic, thermal. The most common triggering mechanisms depend on pressure -- 5 to 10 kg of force -- applied on the top of the mine.
In addition to the human toll landmines claim in many, mainly poor, underdeveloped areas (in Cambodia, land-mine accidents have produced an incidence of one amputee per 250 people), their negative effects are multidimensional. Landmines can, over the long term, disrupt normal economic activities, such as travel and transport, and deny land to farmers, in turn often causing malnutrition, hunger, or migration of agrarian populations to urban centers. Clearance is not only a safety issue, but an economic and social issue as well.

Demining Operations; Current Practices
Demining operations differ sharply according to their purpose. Tactical demining, including minefield "breaching," aims at rapidly clearing a corridor for combat use through a minefield during battle, often in hours. "Tactical countermine" operations aim at the removal of most mines by military personnel from areas occupied by the military over days or weeks. "Humanitarian demining," the subject of this paper, involves the peacetime detection and deactivation over a considerable time of all mines emplaced in an area.

Because most mines have metallic casings or contain at least a few grams of metal (usually the firing pin and associated spring), the standard method of detecting mines either buried or hidden in overgrowth is a pulsed-induction eddy-current sensor that can unambiguously detect the presence of less than a gram of metal buried in nonmetallic soils to a depth of 10-20 cm. The pulsed-electromagnetic induction (PEMI) detector applies a pulsed magnetic field (τ ~ 0.5 msec) to the soil. The magnetic field propagates into the soil and diffuses into buried conducting materials. Eddy currents are induced in the conducting material, which in turn produce an opposing magnetic field (τ ~ 200 μsec) as the applied field collapses. This opposing field disturbs the magnetic field produced by the detector. Perturbations in the detector field indicate the presence of a metallic object buried in the soil and are signaled by an audible sound. In effect, such detectors can reliably detect most of the smallest antipersonnel mines buried close to the surface. But they cannot detect mines that are totally, or almost completely, metal-free.
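As a rough check on the timescales quoted above, the decay of the induced eddy currents is governed by the magnetic diffusion time, roughly τ ≈ μσL². The sketch below is a back-of-the-envelope estimate only; the conductivity and part size are assumed values for illustration, not data from any particular detector.

    # Order-of-magnitude estimate of the eddy-current decay time for a small buried
    # metal part: tau ~ mu * sigma * L^2 (magnetic diffusion time).
    # The numbers below are illustrative assumptions.
    import math
    mu0 = 4e-7 * math.pi        # vacuum permeability, H/m
    sigma = 6e6                 # conductivity of carbon steel, S/m (approximate)
    L = 3e-3                    # characteristic size of a firing pin, ~3 mm
    tau = mu0 * sigma * L**2
    print(f"decay time ~ {tau * 1e6:.0f} microseconds")
    # ~70 microseconds, i.e. tens to hundreds of microseconds, consistent with the
    # ~200-microsecond response quoted above.  A ferromagnetic part (permeability
    # much greater than mu0) would have a correspondingly longer decay time.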

But this method suffers from a major disadvantage. Metal detectors detect not only mines but all metallic objects in the ground. Since quite often mines are laid in or near battlegrounds, metallic detritus -- shrapnel, bullets, pieces of metal, screws, wires, etc. -- causes a false alarm ratio often higher than 1000 to 1. Therefore, one of the major technical challenges in humanitarian demining is discriminating between false alarms and real mines.

Discrimination is currently accomplished in a very slow and dangerous fashion. The metal detector locates the buried metallic object to within five cm. Such detection takes about a minute per square meter of terrain. Demining personnel then probe the spot with a rod (metallic, plastic, or wood) about 20-25 cm long to determine whether the detected object has the characteristic size and shape of a mine or is instead a mere piece of scrap metal. Depending on the soil type and condition (hardened, overgrown, etc.), discrimination by probe can take anywhere between two and 20 minutes. Once a mine is confirmed, it takes about 10 minutes to dig it out, another 10 to explode it in situ (creating additional metallic clutter), and 10 more minutes for the de-miners to walk away from the explosion and back. All detected objects must be identified, or even dug up, to assure that no explosives have been left in the ground.

This is clearly too time-consuming a method of discrimination. With current equipment and practices the process can take about 30 minutes for every metal detector signal. Moreover, the use of a probe to determine the nature of an object detected by a PEMI detector does not tolerate carelessness or boredom. The resulting average casualty rate for this work is one injured or dead deminer per 1000 mines detected. But rates as high as an injury per 100 mines have been encountered.
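To see how these figures compound, here is a back-of-the-envelope estimate of the effort needed to find and destroy a single mine. It is illustrative only: the per-signal times are mid-range values taken from the paragraphs above, and a 1000-to-1 false alarm ratio is assumed.

    # Rough estimate of deminer effort per real mine, using the figures above.
    # All inputs are representative mid-range values, assumed for illustration.
    false_alarms_per_mine = 1000        # false alarm ratio quoted in the text
    probe_minutes_per_signal = 10       # probing takes 2-20 minutes; use ~10
    dig_and_destroy_minutes = 30        # ~10 min dig + 10 min detonate + 10 min walk-back
    probing_hours = false_alarms_per_mine * probe_minutes_per_signal / 60
    total_hours = probing_hours + dig_and_destroy_minutes / 60
    print(f"~{total_hours:.0f} hours of probing and disposal per mine found")
    # ~170 hours per mine, not counting the ~1 min/m^2 detector sweep itself --
    # which is why discrimination, rather than detection, dominates the cost.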

An additional problem is that many types of mines are designed and constructed with very little metallic content; some are completely metal-free. Metal detectors are of little use for these types of mines.

The laboriousness and riskiness of the current canonical method of discriminating mines from false alarms, and the existence of non-metallic or low-metal-content mines, have led to new technologies for use in humanitarian demining. Some of these are evolutionary versions of older approaches; some are quite novel. Here I describe first the detection/discrimination methods based on the mine's explosive contents, in its solid or vapor state:

  1. Thermal neutron activation of the element nitrogen in explosives
  2. Back scatter of X-rays from plastic landmines based on the lower-Z contents of such mines compared to the Z of average soils.
  3. Nuclear Quadrupole Resonance properties of nitrogen nuclei in crystalline structures like explosives.
  4. High speed gas chromatography that detects explosives vapors (or particles?) emanating from buried mines.
  5. Arrays of organic polymers that can sense and identify vapors.

A second class of detection technologies is based on the fact that a buried mine represents a discontinuity of dielectric or conductivity properties in the soil. This approach uses:

  1. Ground penetrating radar or its cousin
  2. Microwave Impulse Radar to detect mines on the surface or buried in the ground.

A third class of detectors, based on the differences in dielectric and diamagnetic properties of materials, uses:

  1. Magnetoquasistatic detectors
  2. Electroquasistatic detectors

These two types of detectors detect and discriminate metallic and non-metallic mines, respectively, from the clutter presented by the ground and its contents.

An entirely different approach is to detonate mines without detecting them, instead of using detection and discrimination methods to locate mines (which are subsequently destroyed). This capital-costly "brute force" approach involves the use of vehicles equipped with rollers or treads that detonate anti-personnel mines by riding over them. Its application is limited by terrain, by the potential presence of antitank mines that can destroy the vehicle, and by the difficulty of assuring that all mines in a given area have been destroyed (on uneven ground, the equipment may not apply the needed pressure everywhere).

Before either approach -- detection/discrimination or brute-force neutralization -- can be used, it is necessary to find a minefield and determine its boundaries. Perhaps even more important is the confident identification of areas that are free of mines. This is the second major hurdle for humanitarian demining: developing rapid and efficient area-search methods that will reliably determine the presence or absence of mines. Currently, finding minefields and determining their approximate boundaries, as well as declaring areas mine-free, depends on visual observation, the history of mine accidents, or records of laid minefields. Specially trained dogs or simple metal detectors are now used for area surveillance, a slow and risky method.

Several technological avenues are being followed in pursuit of a satisfactory method for remote, rapid, and reliable minefield detection.

  1. Several passive IR systems relying on thermal images of mines or "scars" in the soil resulting from excavation to bury mines. Some of these systems are airborne (fixed-wing aircraft, helicopter) and some are vehicle-based.
  2. Multi-spectral and hyperspectral systems based on the resulting imagery.
  3. Airborne active laser systems based on detecting the reflected light.
  4. A helicopter-borne system using an active laser (1.06 μm) and a passive long-wavelength IR sensor (8-12 μm). The system collected reflectance and polarization image data, and thermal emission data. The system incorporated real-time imaging and data analysis that automatically detected minefields.

This latter system was the most successful: it detected 99% of conventional surface-laid minefields, but only 66% of scattered minefields and 34% of minefields with buried mines.

Thus, the problem of rapid minefield surveillance remains open and is being addressed. A fusion of synthetic aperture radar and hyperspectral imager data shows good promise.

To summarize the current state of humanitarian demining technologies: even though humanitarian demining has the dual advantages of time and low-wage local labor ($1500 - $3000 per person per year), the currently used method is unacceptably slow, expensive, and dangerous. More important, it has insufficient impact on the global landmine problem. More specifically, research and development in humanitarian demining needs to focus on four key areas:

  1. Efficient methods to survey large tracts of land to identify confidently mined and mine-free areas, roads, etc.
  2. Improvement by an order of magnitude in the speed and safety of the equipment and methods used in the current labor-intensive approach.
  3. Rapid and efficient methods to neutralize discovered mines.
  4. Development of advanced detection/discrimination technologies that can be mechanized and used with automated or even robotic systems -- which can replace the existing labor-intensive demining practices no later than five years from now.

The Next Steps
Humanitarian demining technologies on the near horizon fall easily into two groups: technologies approaching maturity that can be applied to increase the efficiency, speed, and safety of the current labor-intensive demining methods, and those, perhaps more promising, that won't be ready for field operations for half a decade or so.

In the first category I include the Meandering Winding Magnetometer (MWM) and the various configurations of the Interdigitated Electrode Dielectrometer (IDED), the air knife (already in multiple uses), the explosive foam "Lexfoam," and a family of smart probes -- acoustical, thermal, or magnetic. In the second category I put the Nuclear Quadrupole Resonance explosives detectors, the rapid gas chromatograph, the carbon-black polymer composite arrays that detect explosives vapors, and the family of remotely controlled, automated, or robotic vehicles that can perform both the detection and the neutralization of landmines. The wide-area surveillance system that uses fused SAR/hyperspectral imager data naturally falls in this second group.

The MWM can reduce false alarm rates by a factor of 10 within a year or so and consequently reduce the time spent on detection to 5-20 sec/m2 of searched terrain. The air knife, which uses high-pressure air as a hand-held probe to uncover buried metallic objects (false alarms or mines), could replace the simplest manual probes and speed up discrimination of mines from metallic fragments by a factor of 5-10 while improving safety at the same time. The use of Lexfoam ($9 per pound) to blow up the exposed mine would speed up the overall demining process by a factor of 2 to 5.
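Combining these factors on the rough estimate given earlier in this article suggests the size of the gain. This is a sketch only: it multiplies the mid-range improvement factors quoted above and assumes they act independently.

    # How the near-term improvements could compound, using mid-range factors from
    # the text (assumed independent of one another for this illustration).
    probing_hours_per_mine = 1000 * 10 / 60   # ~1000 false alarms x ~10 min probing
    false_alarm_reduction = 10                # MWM: roughly 10x fewer false alarms
    probe_speedup = 7                         # air knife: 5-10x faster discrimination
    improved = probing_hours_per_mine / (false_alarm_reduction * probe_speedup)
    print(f"probing effort drops from ~{probing_hours_per_mine:.0f} "
          f"to ~{improved:.1f} hours per mine")
    # ~167 -> ~2.4 hours per mine: a large gain, yet still short of the
    # order-of-magnitude improvement in overall throughput called for above.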

I shift now to the more advanced, more promising, but still untested in the field, detectors of explosives. Their advantage is clear: only mines and other UXO would trigger the detectors, fusing the tasks of detection and discrimination into a single step and therefore speeding up humanitarian demining decisively. One key advantage of this approach is that its efficiency of detection is not dependent on the metallic content of the mine. Plastic and low metal-content mines can be detected and identified as well as metallic ones.

Nuclear Quadrupole Resonance (NQR) depends on the fact that some atomic nuclei, such as nitrogen, are not spherically symmetrical, i.e., they possess electric quadrupole moments. Depending on the crystalline structure in which nitrogen nuclei find themselves, their non-sphericity produces a unique set of energy states characteristic of that structure when the nuclei are precessed magnetically. Using this property, an explosive in its solid phase can be identified by its nitrogen radio-frequency absorption lines. The explosive RDX can be readily detected by NQR, but TNT (the most common explosive in mines) and PETN are more difficult to detect. Although TNT detection will require longer detection times -- many seconds, or even several minutes -- NQR detectors can be used in discriminating mines from false alarms.

Two-sided nuclear quadrupole resonance explosive detectors have been tested in airports where they detect quantities of RDX comparable to those in a mine in six seconds. But applications of NQR to mine detection will first require the satisfactory solution of several problems: a) A way must be found to improve the detectability of TNT by exploiting more absorption lines, and by improving the electronics and the detector coil; b) The possibility of interference from stray radio signals at the relevant frequencies must be reduced; c) Some method must be found to deal with the inhomogeneities caused by the one-sided detection geometry that mine detection dictates.

The fact that trained dogs can detect mines unerringly indicates that mines emit vapors that characterize them uniquely, though this may depend on how long a mine has been buried. In all probability these are explosives vapors that either escape from the interior of the mine or come from traces of explosives "smeared" on the outside of the mine during manufacture, storage, or emplacement. It is not clear whether these vapors emanate in real time from the mine or are vapors from particles that have stuck to dirt or vegetation directly above the mine.

Arrays of sensors, each with some specificity to a particular molecule or compound, are quite commonly used in the food and perfume industries to identify constituent compounds of the product. One such sensor is under development at Cal Tech. It uses physical-chemical properties of carbon-black organic polymer composites to develop vapor-sensing elements, each sensitive to different molecules. The collective response of an array of such elements can identify the type of vapor. DARPA is actively pursuing an array sensor for explosives detection intended for airport use, but probably adaptable for humanitarian demining work.

An electronic vapor detector has been developed that claims a detection sensitivity one order of magnitude better than what dogs achieve (i.e., 10-20 picograms of TNT). The detector initially collects particles to which the vapor is attached and then performs the vapor recognition by rapid gas chromatography and chemiluminescence. In principle, this technology could be adapted to a probe that could be inserted in the ground near a suspected mine to "sniff" the vapor aura of the buried mine. It is not clear to what degree explosives vapor remnants in old battlefields will generate unacceptable clutter for the electronic vapor detector. Detailed field measurements of the presence and behavior of explosives vapors will have to be conducted in support of the development of such a detector.

Conclusions and Recommendations
Several of the technologies described in this paper appear to promise decisive improvement in humanitarian demining operations. If these technologies are to mature into useful, affordable, field equipment, I believe we have to follow four general guidelines.

First, since there is no indication that a single entirely new, revolutionary approach to humanitarian demining lies just beyond a near horizon, efforts should focus on incremental improvements of the various demining operations, starting with the field use of better tools in the current deminer-intensive method, and gradually introducing new, sophisticated mine detectors. Since no single "silver bullet" will solve all demining problems in all cases, a spectrum of new detection and neutralization technologies should be developed, field-tested and applied flexibly.
Second, efforts and funding should focus on technologies that lead to systems that are easy to operate and maintain in countries infested with land-mines. Power sources for demining instruments must be portable, detectors must be rugged, and associated electronics must be impervious to humidity, dust, and temperature extremes.

Third, the magnitude and complexity of a systematic humanitarian demining campaign are so large that its goals cannot be achieved by earnest, even ingenious, efforts that remain un-coordinated. A coherent, systematic progression -- from measurements of the physical and chemical properties of mines, through experimentation, equipment development and laboratory testing, field testing in realistic conditions, modification, engineering development, production, and distribution to users around the world, to training of operators and creation of a central but easily accessible data bank of mine and soil properties and of the latest results of demining research -- has to be carefully organized, guided, supervised, and evaluated.

Fourth, the entire effort to develop demining equipment of gradually increasing sophistication and efficiency must be centrally coordinated, guided, and overseen. A central agent is needed to set research priorities, assign technical tasks, coordinate their implementation, and evaluate the results. In addition, such an entity could act as an advocate for humanitarian demining within the U.S. government. This proposed coordinating agency will need high-level technical and scientific advice. Such a need can be satisfied by the establishment of a properly constituted Science Advisory Board that will advise and provide information about relevant scientific and technical developments in academic and high-technology industrial laboratories.
Meanwhile, stable, long-term, adequate funding for these tasks must be secured. This implies the need for a persistent effort, in parallel with the scientific and technological work, to inform and educate decision makers, opinion makers, and, through them, the tax-paying publics of developed democracies. I believe that several of the technologies examined here will work well in the field and therefore can be politically attractive to governments wishing to assume leadership roles in humanitarian activities.

Kosta Tsipis is with the Program in Science & Technology for International Security

Massachusetts Institute of Technology, Cambridge, MA 02139 USA

tsipis@athena.mit.edu

Stockpile Stewardship, Breakthroughs in Computer Simulation, and Impacts on Other Complex Societal Issues

Philip D. Goldstone, Donald R. McCoy, and Andrew B. White

Stockpile Stewardship is continuing to develop and mature as a national effort. While stewardship of the nation's nuclear weapons without nuclear testing has been a de facto reality since the last test in 1992, we recently marked the second year in which the safety and reliability of the stockpile was certified to the President through Stewardship activities (this new annual certification requirement is tied to U.S. pursuit of a CTBT). An earlier article in Physics and Society described the need for the Stewardship program and its general outlines. Here we note some recent progress in the transition to a sustainable science-based capability for weapon safety and reliability assurance, and then focus on a key element of the Stewardship effort: the development and use of breakthrough computational simulation capabilities through the DOE's Accelerated Strategic Computing Initiative, or ASCI. We will also discuss why we believe these developments, driven by a need to support national security while reducing nuclear dangers, can have major effects on our ability to address other issues of great importance to society -- for example, to more accurately and predictively model global climate and the consequences of human-caused changes.

Since Rocky Flats Plant ceased operation in the late 1980's, the U.S. has been without a functioning manufacturing capability for stockpile-qualified plutonium "pits." Even with arms reductions, reestablishing a small capability has remained necessary. Efforts to demonstrate that stockpile-quality pits for existing weapons types can be built and that rebuilt weapons with these components can be certified (for example to replace surveillance samples or aged weapons) recently achieved a milestone: the fabrication of the first demonstration pit for this purpose.

Subcritical experiments conducted at the Nevada Test Site, and other measurements, are beginning to provide improved data on plutonium properties to aid with assessments of aging phenomena and to support certification of remanufactured weapons. To provide a range of complementary studies of hydrodynamics and radiative interactions in high-energy-density conditions, the National Ignition Facility laser and the Atlas pulsed power facility are being built to replace their smaller predecessors, and the reconfigured "Z" pulsed power machine began experiments. (Atlas and "Z" use high-current induced magnetic pressure to provide precision microsecond hydrodynamics or multinanosecond x-ray experiments, respectively.) February 1998 marked first occupancy of the structure to house the Dual-Axis Radiographic Hydrodynamic Test (DARHT) facility, an x-ray radiography tool for dynamic experiments and hydrodynamic testing, and parts of the first-axis accelerator system are being installed for anticipated operation in 1999. Meanwhile an emerging technique, dynamic radiography using high-energy protons rather than x-rays, provided new data on high explosive behavior.

In addition to the mostly experimental activities just described, new simulation codes have been run on ASCI-derived computer platforms at Sandia, Livermore, and Los Alamos. While experiments, analytic models of underlying phenomena, and computer-based simulation are all necessary under a CTBT, simulation is now the only means to integrate our knowledge of nuclear weapons science and engineering to evaluate safety, reliability, and performance as the weapons age, are repaired, or are remanufactured. While it is not truly correct to call this "testing in the computer," as the press occasionally does, simulation is the only way all the pieces can be connected. The challenge of making serious predictions with serious consequences based on complex computer simulations, without a full-up experimental test of these predictions, stresses the simulations, as well as the rigor with which they are tied to experimental reality and the degree to which their users appreciate that a misplaced faith in an inadequate simulation would be a failure mode. We do not belittle or underestimate this challenge, though we have confidence it can be surmounted. Hence the need for the accelerated advancement of almost all aspects of complex simulation science and technology through the ASCI effort, as well as for continuing experimental capabilities and activities and the preservation of nuclear-test data. We will focus on simulation and computing for the rest of this article.

ASCI WILL DRIVE BREAKTHROUGHS IN SIMULATION. ASCI's "accelerated" nature is driven by several real-world factors. The U.S. faces aging of the technical skill base that has direct experience in weapon design, development, and testing. Demographics imply that about half of that skill base will be lost to retirement by 2004. In this same time frame, the average age of more than one stockpiled weapon system will begin to exceed its intended design lifetime; therefore, aging concerns may begin to occur more frequently. Based on these timescales, we will need to materially complete the transition from the era of nuclear-test-based certification to one of sustainable science- and simulation-based certification by about 2010.

To do that, we will need to have a substantively complete and accurate simulation treatment of weapon performance and safety that can be used with high confidence to help inform weapon assessments even after the directly-experienced design staff is gone. This in turn implies new physics and engineering models in new three-dimensional simulation codes, and roughly three to five years of effort to validate these new codes. The availability of extensive past nuclear test data for the systems in the continuing nuclear stockpile is crucial to this validation process, as new experimental data will be, but it will also be vital to engage individuals who designed and tested the current stockpile, as well as new scientists, in the process. Given the time required for validation, to make the transition we have just described by 2010 we will need this "substantively complete and accurate" code capability -- and the platforms to run it on, and the visualization and communication systems to interact with it -- in hand by about 2004.

Stewardship's requirements place far greater demands on the physical and numerical accuracy of the simulation models, and on overall attention to scientific rigor. Three-dimensional codes are required because aging effects (and accident scenarios) will more often than not break symmetries designed into the weapon--and such changes will have to be assessed to know whether they require corrective action, either at the time or at some future date. (Similarly the as-built characteristics of newly manufactured components will be assessed, to ensure that weapon replacement does not cause unsuspected problems.) The codes must include improved physics and engineering models, since earlier approximations "tuned" to a specific region of parameter space, or set to produce agreement with certain data, will no longer be a sufficient basis for safety and reliability assurance now that nuclear tests are not available to check the code predictions. For example, more physically accurate models for high explosives initiation and burn, material behavior including spall and ejecta formation, and fluid hydrodynamic turbulence and instabilities are being developed through a combination of theoretical and experimental efforts, and techniques for modeling materials behavior simultaneously across many length and time scales are being developed.

Running predictive three-dimensional simulations with these more complete, accurate physics and engineering models -- and adequate spatial resolution to assess stockpile aging effects -- in a reasonable time will require immense computational resources, because we will be handling about 100,000 times more information than today: roughly a billion zones, 100-teraOps peak operational speed, and 30 terabytes of computer memory. Were we to stagnate at today's levels of speed, for example, the kind of problem we will need to run in several hours or days would require centuries to complete. Achieving 100-teraOps-level capability by 2004 implies that we must exceed the rate of capability doubling (every 18 months) in the computer industry that is known as "Moore's Law." If this is to occur, ASCI must drive it, in concert with the industry.
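A quick calculation illustrates why industry trends alone would not get there. This is a sketch under stated assumptions: a ~1-teraOps baseline in 1996 (the benchmark figure cited later in this article) and straight Moore's-law doubling every 18 months applied to delivered capability.

    # Would Moore's-law growth alone reach 100 teraOps by 2004?
    # Assumptions: ~1 teraOps delivered in 1996, doubling every 18 months.
    baseline_teraops = 1.0
    years = 2004 - 1996              # 8 years
    doublings = years / 1.5          # ~5.3 doublings
    projected = baseline_teraops * 2 ** doublings
    print(f"Moore's-law projection for 2004: ~{projected:.0f} teraOps (goal: 100)")
    # ~40 teraOps -- well short of the 100-teraOps goal, which is why ASCI must
    # push faster than the industry's historical doubling rate.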

These requirements drive new hardware, software, and infrastructure capabilities to be developed jointly by the DOE weapons laboratories, the computer industry, and a number of universities. New numerical methods and algorithms will have to be developed, as will "terascale" visualization tools to help analyze, interpret, and interact with these massive amounts of information. The labs have to develop independent models and codes so that there can be effective peer review of weapon assessments, and each is examining different hydrodynamic, transport, and other modeling methods to evaluate their suitability and efficiency. For example, both structured and unstructured grid methods are being used, with alternative interprocessor communications strategies.

In working toward a 100-teraOps-level simulation capability, a series of hardware platforms is being developed and used in a phased-evolution approach, with different industrial sources and technologies. One such platform (the "Option Red" Intel machine at Sandia National Laboratories) uses a massively parallel processor (MPP) architecture and achieved a speed of 1 teraflop on a benchmark problem in 1996. It is expected that achieving 100 teraOps effectively will require a different architecture, known as Shared-memory MultiProcessor (SMP), built on clusters of processors. As we write this, the Blue Mountain machine by SGI/Cray is operational at 400 gigaOps at Los Alamos, while the Blue Pacific machine by IBM is operational at 920 gigaOps at Lawrence Livermore National Laboratory. Both are to achieve a sustained 3 teraOps in 1999 (plus a 1-teraOps companion to Blue Mountain for unclassified research). 10- and 30-teraOps platforms are planned for 2000-01.

Peak operational speeds greater than 3 teraOps require software and hardware improvements for high-bandwidth communication between processor clusters. A high-performance communications corridor between the computer platform and the users is also needed, and access from distant weapons laboratories to these platforms requires high-speed encrypted networks. All are being pursued -- e.g., Sandia National Laboratories is leading a multilab-industry initiative to develop technology for secure gigabyte-per-second long-distance data transmission and communication.

We noted the engagement of the university community in ASCI. This takes two forms: individual collaborations on specific computer science and mathematics issues such as visualization or algorithms, and a set of "strategic alliances" which pursue unclassified, multi-disciplinary thrusts that involve appropriate science and help drive advances in areas of computer or computational science. There are five such alliances: with Stanford, Cal Tech, the University of Chicago, the University of Utah, and the University of Illinois at Urbana-Champaign.

Beyond the record-breaking platform speeds, there are other markers of note from the ASCI effort so far. At Los Alamos, one code project has run a 3D hydrodynamic simulation with 60 million cells in parallel across 16 of the Blue Mountain SMP multiple-CPU boxes -- more than 1,000 processors. Another has demonstrated parallel execution of codes on all three architectures (the Blue Mountain and Blue Pacific SMP as well as the Red MPP platforms), and linear scaling to over 3,000 parallel processors on the Red platform. 3D calculations of nuclear one-point safety have been performed and cross-compared with a previous 3D code. A factor of 16,000 speedup in performing Monte Carlo simulations has been achieved relative to a couple of years ago -- about half from computer speed and half from improvements in algorithms. And ASCI tools have already contributed significantly to the ongoing revalidation of a stockpile weapon system, and to designing and interpreting experiments.

Stewardship And Other Complex Problems: Climate Prediction.
Stockpile stewardship is a challenge with certain characteristics. It is a problem of extreme technical complexity. Decisions of high consequence -- in this case whether the stockpile is safe and reliable, whether repairs are needed, or even whether a nuclear test is necessary -- must be made. How wisely they can be made depends in part on the quality of the complex technical assessments that inform them, since full-system experiments (nuclear tests) are not possible. The ability to perform more precise and predictive scientific simulations is therefore one key technical need. And there is some urgency to establish these capabilities -- because both the stockpile and the experienced skill base are aging.

There are other problems of societal importance that share similar characteristics, including the inaccessibility of controlled full-system experiments, and the need for scientific simulations of great complexity. Global climate prediction, and the understanding of human-caused effects on climate, is an obvious example. There is general scientific consensus that continued unchecked growth of worldwide greenhouse emissions is unwise. Yet we still lack the capability to confidently assess the outcomes of specific human strategies on the climate, or the regional effects of possible climate change. It is apparent that simulation and computing capabilities comparable to those of ASCI are needed, and that synergism between these efforts will be valuable.

We make these comments guided by recent comparisons of global simulations with observation, and from a perspective of interaction with the climate-modeling community. Under the sponsorship of two DOE programs -- the Climate Change Prediction and High-Performance Computing and Communications Programs -- our Laboratory has applied massively parallel computation to high-resolution simulations of the global ocean. Our colleagues are engaged in many collaborations in the community, for example with the National Center for Atmospheric Research (NCAR) on the development of a coupled climate model based on NCAR's atmospheric and land-surface models and the Los Alamos Parallel Ocean Program (POP) ocean and sea-ice models.

Great strides are being made by the community, yet present-day coupled climate models are still constrained to insufficient resolution and physical simplifications. Current global coupled climate models employ horizontal resolutions of about 3° with roughly 20 vertical levels in the atmosphere, and about 1° and 30 levels in the ocean. The resolution in the atmosphere (~300 km at the equator) is too coarse to evaluate regional climate effects, and the resolution in the ocean (~100 km) is too coarse to adequately resolve mesoscale eddies and western boundary currents such as the Gulf Stream. Even so, a century-long simulation with a model including component models for the atmosphere, ocean, sea ice, and land surface requires nearly a month of dedicated time on a Cray C90. Many such runs are needed to investigate various climate-forcing scenarios and parameter sensitivities.

A high-resolution simulation of the Atlantic Ocean recently completed at Los Alamos demonstrated, through comparison with measurements from the TOPEX/Poseidon satellite, that adequate resolution of mesoscale eddies and boundary currents requires about ten-fold finer grid spacing. The POP global-ocean model, using realistic bottom topography and observed surface winds, was run on a massively parallel CM-5 to extend such simulations to the highest resolution achieved. In global simulations, while mesoscale eddies and boundary currents like the Gulf Stream are somewhat resolved, there are discrepancies between the sea-surface height variability simulated by the model and satellite measurements.

Since higher resolution was not yet feasible on the global scale, to test the resolution requirements POP was used for a simulation limited to the Atlantic Ocean. Forty vertical levels were used; horizontal grid spacing was 0.1° (11 km at the equator). At this resolution, the behavior of the simulated Gulf Stream is much more nearly accurate: it separates from the coast at Cape Hatteras and includes a branch around the Grand Banks, both characteristics in agreement with observations. Furthermore, the energy spectrum of the mesoscale eddies is much better resolved, as indicated by the close quantitative agreement between the simulated and measured sea-surface height variability.

From this and similar results one can estimate the computational scale needed to do adequately resolved simulations and to complete a century-long integration in a month of computing. A global ocean simulation at 0.1° and 40 levels would require about 0.25 terabytes of memory and a dedicated 10-teraOps platform. Increasing the resolution in the atmosphere by a similar factor would provide a grid spacing of ~40 km, sufficient to evaluate regional climate effects -- another important threshold -- and would also require a dedicated 10-teraOps platform to do a century in a month, with about a terabyte of memory. Incorporating a more comprehensive treatment of physical processes in the atmosphere and ocean would increase these requirements further, as will coupling the atmosphere and ocean.
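The scaling behind these estimates is simple to sketch: refining the horizontal grid by a factor r multiplies the number of grid columns by r² and, through the time-step (CFL) limit, the number of time steps by roughly r, so the cost grows roughly as r³. The sketch below reproduces the flavor of the argument; it is illustrative only, and the r³ scaling and the treatment of vertical levels are assumptions, not the authors' actual performance model.

    # Rough scaling of the ocean-model cost with resolution, as described above.
    # Cost ~ (horizontal points) x (time steps) x (vertical levels); refining the
    # horizontal grid by r increases points by r^2 and time steps by ~r (CFL).
    r = 10                      # ~1 degree (~100 km) -> 0.1 degree (~11 km)
    vertical_factor = 40 / 30   # 30 -> 40 vertical levels
    cost_factor = r**3 * vertical_factor
    print(f"relative cost: ~{cost_factor:.0f}x the current-resolution ocean run")
    # ~1300x more work to keep the same "century in a month" throughput -- broadly
    # consistent with the jump from today's vector machines to a dedicated
    # ~10-teraOps platform estimated above.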

It appears that an integrated, global simulation at adequate resolution (regional in the atmosphere, resolving mesoscale eddies in the ocean) and with more comprehensive physical treatments will require ~40-teraOps platforms and ~20 terabytes of memory, a capability quite similar to that being sought under ASCI. A number of such runs to investigate various climate scenarios, model sensitivity, and natural variability would require about a year, bringing within reach ensembles of "what-if" century or multi-century simulations hypothesizing different futures of human activity. Such simulations could add greatly to the information and insight the scientific community can provide to society to further develop, and fine-tune, its response to the crucial question of global change.

Furthermore, other issues that ASCI must address -- terabyte-scale information flow and visualization, high-speed communication, model validation, and understanding issues of predictability -- will be similarly faced by such climate simulations. Aspects of the accelerated advancement of scientific and technical capabilities under ASCI, initiated to address one major policy issue, can benefit the scientific community's approach to other complex issues.

"Stewardship of the Nuclear Stockpile Under a Comprehensive Test Ban Treaty," in the April 1997 issue of Physics and Society

"Stewardship of the Nuclear Stockpile Under a Comprehensive Test Ban Treaty," in the April 1997 issue of Physics and Society

DOE report "Enhanced Surveillance Program: FY97 accomplishments" issued as LA-13363-PR, October 1997

J. D. Mahlman, "Uncertainties in Projections of Human-Caused Climate Warming," Science 278, 1416 (November 21, 1997)

M.E. Maltrud, R.D. Smith, A.J. Semtner, and R.C. Malone, "Global Eddy-Resolving Ocean Simulations Driven by 1985-1994 Atmospheric Winds," accepted for publication in Journal of Geophysical Research - Oceans.

L.-L. Fu and R. D. Smith, "Global Ocean Circulation from Satellite Altimetry and High-Resolution Computer Simulation," Bull. Am. Meteor. Soc. 77 (1996) pp. 2625-2636

R. D. Smith, M. E. Maltrud, M. W. Hecht, and F. O. Bryan, "Numerical Simulation of the North Atlantic Ocean at 1/10°," in preparation

Philip D. Goldstone, Donald R. McCoy, and Andrew B. White are with the Los Alamos National Laboratory, Los Alamos NM 87545

pgoldstone@lanl.gov

Government Secrecy after the Cold War

Steven Aftergood

"Contrary to perhaps what is the most common belief about secrecy," Enrico Fermi once wrote, "secrecy was not started by generals, was not started by security officers, but was started by physicists."1
Actually, of course, there has always been a measure of secrecy in American government. Some secrecy may in fact be indispensable to the performance of certain government functions. But the physicists of the Manhattan Project helped create a secrecy system of unprecedented scope and impact, which eventually metastasized throughout the entire national security bureaucracy.

"One of the consequences of the depth and breadth of the active participation of many top US. academic scientists in this very secret wartime Project was that the subsequent peacetime control of scientific and technical information did not seem as unusual or unacceptable to those academic scientists as similar measures would have been prior to World War II."2

In the Cold War, some of the most fateful realms of government decision-making--matters of war and peace and life and death--were declared to be beyond democratic norms of public knowledge and debate. But as secrecy was bureaucratized, it also extended into more mundane areas. If scientists were partially responsible for the new secrecy system, they quickly came to rue the fact, as secrecy became increasingly burdensome.

"The story is told that in the days of the Manhattan District, a scientist was summoned to Washington and reprimanded for having mentioned in public a physical constant which was still secret. The accused fingered through the Smyth Report and pointed to a number there. 'Yes,' said the security officer, 'but that is in pounds per square inch, while you gave the figure in kilograms per square centimeter. Why make it easier for the Russians?'"3

Less amusingly, an internal security apparatus came to focus on scientists as a potential threat to the nation. The whole weight of the government's investigative bureaucracy was brought to bear on numerous individual scientists who, in the exercise of their constitutional freedom of expression, had caught the attention of wary security officers.

In a 1956 book called The Torment of Secrecy, sociologist Edward A. Shils wrote that "An official of the Federation of American Scientists, given to moderation in his judgments, estimates very tentatively that somewhere in the neighborhood of a thousand qualified scientists have encountered security difficulties."4 Similarly, Pugwash and other scientist-based organizations were singled out for official scrutiny, particularly during the 1950s and 1960s.

Throughout the decades of the Cold War, the secrecy system became ever more entrenched. By the 1980s, science and national security sometimes seemed to be on the verge of open conflict, as the Reagan Administration practiced an aggressive classification policy and even pressed for new limits on the dissemination of certain unclassified scientific information.
It is all the more remarkable, then, to observe that the entrenched secrecy bureaucracy has now been rolled back-- only partially, but to a real and measurable degree. The evidence that Cold War secrecy is in retreat includes the following:

  • In the last two years, an almost unimaginable 400 million pages of historically valuable documents have been declassified, according to the Information Security Oversight Office.5 This represents a significant dent in the estimated backlog of 1.8 billion pages of 25-year-old documents awaiting declassification under President Clinton's Executive Order 12958.
  • New declassification programs have been initiated in the most secretive corners of the national security bureaucracy, including the Central Intelligence Agency, the National Security Agency, and the National Reconnaissance Office (the very existence of which was officially acknowledged only in 1992).
  • The size of the U.S. intelligence budget, an icon of secrecy for its own sake, was declassified last year for the first time in 50 years in response to a Freedom of Information Act lawsuit brought by the Federation of American Scientists.6
  • The creation of new secrets has reportedly "decreased to historic lows."7
  • A broad-ranging Fundamental Classification Policy Review conducted by the Department of Energy (DOE) resulted in the declassification last year of some 70 categories of information previously restricted under the Atomic Energy Act. Since former Energy Secretary Hazel O'Leary undertook her "openness initiative" in 1993, DOE has declassified far more information than during the previous five decades combined.
  • An unprecedented quantity of government information is now easily available on the Internet, including vast resources on military and intelligence structures, functions, organizations, budgets and operations that previously would have been classified or very difficult to obtain.

All of this is impressive and quite novel. Nevertheless, one could conclude from another point of view that the glasnost is still half empty. Thus:

  • The fact that hundreds of millions of pages have been declassified does not necessarily mean that they are now accessible to researchers. Many are, but many others must still undergo a painstaking screening and review to address privacy and other concerns. The continuing eruption of declassified documents has generally overwhelmed the ability of archivists to process them for public access.
  • Many agencies are still complying half-heartedly-- or not at all-- with the declassification requirements of the President's executive order. The mandatory annual declassification quotas established by the President have not been met for the last couple of years by the Army, the CIA, and several other agencies. In the absence of effective oversight, there is no real incentive for compliance and no meaningful penalty for non-compliance.
  • Most agencies are also not in compliance with the Electronic Freedom of Information Act, which requires them to post certain information about the agency and its records in electronic form.

Election Of Forum Officers

The election of officers proceeded smoothly. There were 470 votes cast via the Web, and 175 cast via paper ballots. Of the Web ballots, there were 22 duplicates (people who forgot that they'd voted weeks or months earlier), and 4 votes from non-Forum members. There were 10 people who voted both via paper and via the Web, and (unfortunately) the votes were generally different! This could have been a problem, since we hadn't stated which ballot would be discarded in this event. Fortunately, the results would not have changed whichever solution (keep the paper, keep the Web, keep the earlier of the two, keep the later) was adopted. Next year, the Web ballot will state explicitly that the Web ballot will not count if a paper ballot from the same person is received. Elected as vice-chair was Priscilla Auchincloss, and the new executive committee members are Philip Goldstone and Steve Fetter.
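As a minimal sketch of the duplicate-ballot rule just announced for next year (a paper ballot quietly overrides any Web ballot from the same member), the illustration below may be useful; the function name and member identifiers are hypothetical, and this is not the Forum's actual tallying software.

    # Minimal sketch of the announced rule: a Web ballot does not count
    # if a paper ballot from the same person is received.
    def reconcile(paper_ballots, web_ballots):
        """Return one ballot per member, with paper taking precedence."""
        final = dict(web_ballots)     # member -> ballot choice
        final.update(paper_ballots)   # a paper ballot overrides any Web duplicate
        return final

    web = {"member_17": "Candidate A", "member_42": "Candidate B"}   # hypothetical data
    paper = {"member_42": "Candidate C"}                             # also voted on the Web
    print(reconcile(paper, web))
    # {'member_17': 'Candidate A', 'member_42': 'Candidate C'}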

FPS at the April APS/AAPT Meeting in Columbus, OH

About 1200 people attended the April "Washington Meeting" of the APS/AAPT, held in Columbus, OH this year; it is estimated that fewer than 100 of them were the students whom this "wandering Washington" meeting was designed to attract. At the Forum Awards Session, Robert Park received the 1998 Joseph Burton-Forum Award "For 'telling it like it is' with his widely-read What's New and through other means on physics-related aspects of science and public policy issues", Howard Geller and David Goldstein were the 1998 Szilard Award recipients "For their significant contributions to enhancing efficient energy use, particularly for applying physics and economics to optimize energy-efficient appliance standards", and, unfortunately, no Forum APS-Fellow Awards were made. (This should be an urgent reminder to our readers to submit nominations for these Fellow Awards to our new Fellowship Chair, Priscilla Auchincloss; details can be found in the Awards Section of the APS Home Page.) The Forum-sponsored Symposium in Celebration of the 100th Anniversary of the Birth of Leo Szilard was very well attended, whereas the Forum's sessions on Current Technology Transfer Issues and Current Issues in Arms Control, Energy Policy, and Science Policy drew disappointingly small audiences. Our readers can look forward to seeing some of the very good papers from these Forum Sessions in future issues of this journal.

At the Forum's Executive Committee meeting, we learned that FPS membership has dropped from 12% of the APS to 11% in the last year; this is quite disconcerting, though such drops have occurred for all the other Forums as well. We also learned that there are now 4726 subscriptions to this journal (including 500 foreign). There was considerable discussion of how to deal with the costs of the 346 non-member subscriptions, which cost us about $780/year. It was decided to keep the library subscriptions at the current voluntary rate; the problem of individual non-member subscriptions was left in the air, given the growing accessibility of our Web edition and uncertainty as to whether the Forum could enforce fees upon non-members. Finally, a great deal of discussion was devoted to the Centennial Meeting of the APS, to be held next March in Atlanta, and the anticipated roles of FPS in that huge and exciting meeting. We have been warned that accommodations might be hard to get, so anybody who intends to go should make hotel reservations as early as possible.

Report on APS Council Meetings of November 23, 1997 and April 17, 1998

The APS Council met in San Francisco on Sunday, November 23, 1997 as part of the annual meeting of the APS Division of Fluid Dynamics; and in Columbus, Ohio, April 17, 1998 as part of the annual "Washington" General Meeting. The following reports and actions from those meetings may be of particular interest to members of the APS Forum on Physics and Society.
Major reports at these meetings focused on the economic health of science, physics, the APS, and APS publications. The economic health of science and physics was boosted in the last Congressional session as a result of a concerted effort by many scientific societies to persuade the White House and Congress of the value of science. The resulting Federal budget increase for science and technology was close to 7%. This exercise was so successful that 110 scientific societies have now banded together in an effort to get Congress to double scientific funding over the next ten years. The budget prognosis for this year resembles an opaque crystal ball, but the societies are lobbying hard.

The American Physical Society itself is also healthy, with a ratio of 1.4 in Net Assets to Annual Expenditures projected for after next year's Centennial celebration.

APS publications make up 2/3 of the APS budget, and hence are always a major item of concern. The "electronification" of the APS publications is proceeding rapidly; only REVIEW OF MODERN PHYSICS and PHYSICAL REVIEW D are not yet on-line. A new purely on-line journal, PHYSICAL REVIEW SPECIAL TOPICS - ACCELERATORS AND BEAMS (PRST-AB), will be launched this spring. Who knows, maybe even PHYSICS TODAY may someday go on-line? The electronic PHYSICAL REVIEW ONLINE ARCHIVE (PROLA) will soon be available - free until the end of 1998. Amuse yourself by using PROLA to find all mentions of your name in PR and PRL back to 1985. But U.S. submissions to APS journals are still decreasing at about 3% per year, although foreign submissions are making up for it. There will be a 7% increase in the cost of the bundled print/electronic journal package next year. In the long run the hope is to sustain APS activities from the income generated by APS Reserves, so that the journal endeavors only have to pay for themselves.

Work on the APS Centennial in Atlanta in March of 1999 is progressing. The cost to the APS is expected to be between $2.6 and $3 million. The preparations for the Centennial celebrations are already inducing reviews of the relationship of the APS to the public. A public relations firm has been hired, with Eugen Merzbacher as senior advisor. A media effort at the Los Angeles March meeting was quite successful in producing newspaper articles and op-ed pieces, TV spots, etc.

A membership survey done for the APS showed that members felt that the APS should be active in outreach - e.g. in education. Unfortunately the survey also showed that members were unaware of what the APS is actually doing in these areas.

The recent TIMSS report (Third International Mathematics and Science Study) showed that the performance of U.S. 12th-grade students in physics is rated the lowest of the 16 participating developed countries (not including Japan, whose students were too busy preparing for their national tests to take this examination). The Council's discussion of the reasons and cures for this problem was inconclusive. Might it be useful for the AAPT and APS to ask high-school teachers about the reasons for this problem? In the end the Council voted to authorize a statement about the current debate in California on physics school standards.

The APS Task Force on Prizes and Awards produced a recommendation that "APS units may allocate revenues from meetings for award travel expenses or to supplement prize and award funds." At the same time the Task Force recommended the goal that all major APS prizes should be raised to $10,000. It has recommended that no one should receive more than one major prize from APS units for the same work. And if more than one person shares an APS award, this sharing must be justified on grounds of "unique, original and indispensable contributions."

The APS Panel on Public Affairs (POPA) proposed a statement on "What is Science":

"Scientific results and theories have created new stores of knowledge, stirred public imagination, and brought great benefit to individual life and to human civilization. The American Physical Society wishes to affirm the crucial core values and principles of science. (1) Science is a disciplined quest to understand nature in all its aspects. It always demands testing of its scientific knowledge against empirical results. It encourages invention and improvement of reliable and valid methods for such testing. (2) Science demands open and complete exchange of ideas and data. Science cannot advance without it. Part of this exchange is to insist on reproducing, modifying, or falsifying results by independent observers. (3) Science demands an attitude of skepticism about its own tenets. It acknowledges that long-held beliefs may be proved wrong, and demands experimentation that may prove its theories false. This principle has built into science a mechanism for self-correction that is the foundation of its credibility. Scientists value other, complementary approaches to and methods of understanding nature. They do ask that if these alternatives are to be called "scientific," they adhere to the principles outlined above.......The American Physical Society stands ready to work with other scientific, engineering, and educational organizations, the media, and interested public bodies to define and communicate the principles and methods of science."

The resulting Council discussion showed a general sentiment supporting the idea of such a statement, but it also revealed widespread disagreement about the wording. The Council ultimately passed a motion that a revised statement is to be developed by POPA, taking into account various comments and criticisms. The revised statement is then to be discussed with other scientific societies for their input and for their possible agreement with the revised statement.

Inspired by the FPS success, the APS is heading toward electronic balloting. It is rewriting the constitutions of various divisions and sections to allow such balloting.

Dietrich Schroeer, FPS Councilor

schroeer@physics.unc.edu

Washington News - Research Funding Dependent On Tobacco?

In a show of support for civilian R&D, President Clinton's FY 1999 budget request proposed an eight percent increase for programs in his Research Fund for America. The Fund includes research at the NSF, DOE, HHS, NASA, USDA, USGS, EPA, VA, and the Depts. of Commerce and Education. The proposed increases for the NSF and DOE were about 11 percent. Furthermore, Clinton's request projects a 32 percent increase for the Fund by the year 2003. Congress also seems favorable to significant increases in the budget for civilian R&D. Just before leaving town for a long spring break, the Senate passed an amendment introduced by Jeff Bingaman (D-NM) and cosponsored by Phil Gramm (R-TX) and Joseph Lieberman (D-CT) which states that "It is the sense of the Senate that the assumptions underlying the function totals in this budget resolution assume that expenditures for civilian science and technology programs in the Federal Budget will double over the period from FY 1998 to FY 2008". This is just a resolution, of course, since appropriations are passed year by year. Members of the House are also generally supportive; however, the House Science Committee Chair, James Sensenbrenner (R-WI), called such legislation "very well-intended, but premature", stating that a long-term science policy must be in place before such large increases can occur (such a policy is under development by Vern Ehlers (R-MI)). He did say that with an established science policy, he would "absolutely" consider larger increases in R&D spending. For this coming fiscal year, however, it seems as if the House, Senate and White House are all supportive of significant increases in the NSF and DOE research budgets (of approximately 10%).

Unfortunately, the way the White House intends to pay for the increase seems to be from the expected tobacco settlement. Should this settlement not materialize, it is unclear what the source of the funds will be. At the hearing for the NIST budget, House Commerce Appropriations Chair Harold Rogers (R-KY) asked NIST Director Ray Kammer, "suppose the tobacco funds don't show up"; Kammer responded that, in his understanding, if a tobacco settlement was not reached, funds could be taken from the projected budget surplus. Rogers noted that the President had designated the surplus to go toward fixing Social Security, "so what are we to do?" At the NSF hearing, it was clear that there is widespread support for a 10-11% increase, but there are grave misgivings about the likelihood of using tobacco settlement money; subcommittee chair Jerry Lewis (R-CA) said that few people in Washington are confident that the settlement will occur, and asked what NSF's priorities would be if the increase were smaller. Lewis said NSF enjoys great support among Members of Congress on both sides of the aisle. What is beyond his control, that of NSF Director Lane, and that of the research community, is whether the tobacco money is going to materialize. Administration representatives echoed the statements of former Science Advisor John Gibbons that there was not a "one-to-one correlation" between science increases and tobacco money. He said, "It is not the position of the White House that if we don't get a tobacco settlement, we don't get additional research funding." But if tobacco money doesn't contribute to the research increases, funds would have to come from the budget surplus, the breaking of last year's budget caps, or offsets from other discretionary programs. All of these options have political problems. (Note--this is being written in late April. By the time it appears in print, much more information will be available. It can be found at www.aip.org/enews/fyi/1998.)

Communicating With Congress

The AIP has put out, via FYI (#71), a brief description of the most effective ways to correspond with Congress. They point out that most Members receive very little mail about science and technology issues, and that well-written letters from scientists can make a big difference. They suggested the following guidelines:

TIMING: A letter sent months before an issue is considered is likely to be forgotten; one sent after congressional action is wasted.

BREVITY: Legislative staffs are severely overworked. Limit your letter to one page and one subject. Resist the temptation to include many enclosures--they will, in all likelihood, not be read.

SCIENTIFIC JARGON: Most Members and their staffs freely admit that they know little about most scientific issues. As appropriate, use a few sentences to offer a non-technical overview.

YOUR IDENTITY: Ensure that your letter is legible by typing it. Include your name, home address and telephone number.

E-MAIL: While some congressional offices are equipped to handle E-mail, others are not. The safest course is to "snail" mail it.

FAXES: Many offices resent a fax campaign--it clogs their machines and uses their paper. Look next to any congressional fax machine and you will find a congressional waste basket. Unless there are severe time constraints, avoid faxing.

BE SPECIFIC: Congressional offices revolve around legislation. If there is a bill number, cite it. If you do not know it, or if the bill has not been introduced, be specific: " I write about the FY 1999 appropriation for ..." Check the AIP web site (http://www.aip.org/gov/) under "Budget Information".

USE THREE PARAGRAPHS: Paragraph 1: Explain your reason for writing. Briefly describe your "credentials" or experience. Paragraph 2: Describe the importance of the issue. Cite relevant facts and avoid emotionalism. Frame your discussion from a national, rather than a personal, perspective. Paragraph 3: Request--not demand--a specific action. Thank the Member for his/her consideration of your views. Offer assistance.

ADDRESS STYLE: The post office prefers that you do not use office numbers. The correct address style is

The Honorable _________

United States House of Representatives

Washington DC 20515

Dear Representative________

(For the Senate, the zip is 20510)

DOE Secretary Pena Resigns

DOE Secretary Federico Pena will leave the Dept. of Energy on June 30th. In a hastily called news conference, Pena announced his resignation "for personal and family reasons" saying that he and his wife "have three wonderful children, and it is now time for us to focus on their futures". Pena has been at the DOE for about one year, having formerly served as Secretary of Transportation. He said that Deputy Energy Secretary Elizabeth Moler was on the short list of replacements (at the time of this writing, no replacement has been announced). Pena said that his replacement's biggest challenge would be nuclear waste disposal. A detailed statement from Pena, dealing with issues such as Brookhaven, the NIF, global climate change, the CTBT, etc., can be found at the AIP site, under FYI #52.

Russian Agencies Denied Aid

The State Department has declared 20 Russian agencies and research facilities ineligible to receive millions of dollars in US government assistance because they may have provided missile technology to Iran. The list of agencies was sent in March to managers of American programs that finance commercial ventures for Russian institutions formerly involved in weapons work. Since then, funding has been denied for at least three Russian projects because they were on the list, and future such projects will also be denied funding. Despite longstanding concerns about the exodus of weapons technology from Russia to rogue states, the move marks the first time specific institutions have been penalized. The list includes Russian institutions ranging from universities to government agencies.

Among the projects denied funding: a proposal from Baltic State Technical University in St. Petersburg to apply rocket motor technology to the high-temperature destruction of chemical wastes; a project using aerospace technology to develop high-tech plastic joints for industry; and a project led by the Moscow Aviation Institute to develop new methods for evaluating the thermal properties of composites.

Global Warming (A) Won't Happen And (B) Is Good For You

In April, many physicists received a petition card in the mail opposing the Kyoto global-warming accord. It included a note from former APS president Fred Seitz (1961) and an 8-page "paper" explaining why carbon dioxide is good. It explains how burning hydrocarbons "moves them from below the ground and turns them into living things...No other single technological factor is more important to the quality, length and quantity of human life than the expanded and unrationed use of hydrocarbons". The article was in a format identical to that of Proceedings of the National Academy of Sciences articles, although it had never been published (or even submitted). Some members of Congress assumed that the NAS supported the petition, leading the Academy, on April 24, to take the unusual step of disavowing any link to the petition, saying "The petition does not reflect the conclusions of expert reports of the Academy". The author said that the style of the PNAS was used because "he liked the way it looked".

A few days later, the New York Times reported that industrial opponents of the treaty (including Exxon, Chevron, Southern and others) had drafted a multi-million dollar proposal to recruit a cadre of scientists who share their environmental views and to train them in public relations. They wish to provide "a one-stop resource on climate science for members of Congress, the media, industry and all others concerned" and to develop "sound scientific alternatives" to the Intergovernmental Panel on Climate Change. They call for recruiting scientists to argue against the Administration's views on climate change. One of the scientists mentioned is Fred Seitz.

CTBT

Last April, the Council of the American Physical Society unanimously passed a statement urging the Senate to ratify the Comprehensive Test Ban Treaty. Starting last November, the APS Washington office began addressing the issue: organizing grassroots teams in key states, assembling briefing packets, marshaling Nobel laureates, etc.

Unfortunately, the Treaty sits in legislative limbo. Senator Helms opposes the CTBT, and he is blocking it in his Foreign Relations Committee. Senator Lott promised the Republican Caucus that he would not let CTBT come up for a vote this year. The Administration is currently occupied with other foreign relations issues.

U.S. Students Near Bottom In Math And Science

The Third International Mathematics and Science Study (TIMSS), the "largest, most comprehensive, and most rigorous international study of schools and student achievement ever conducted", recently reported its results for students in their last year of secondary school (twelfth grade in the U.S.). The performance of American students was compared with that of students in 20 other countries in general math and science, and in 15 other countries in advanced math and physics. In general math and science, American students outperformed only two countries, Cyprus and South Africa; in advanced math and physics, only the same two countries were outperformed. Looking at just physics, TIMSS found that the U.S. did not outperform any other country. NSF Director Neal Lane, noting that results for fourth-graders were more favorable, said that "the majority of twelfth-graders tested in 1995 would not have been exposed to the initiation in the early 90's of math and science education reform efforts across the country". The results for the fourth, eighth and twelfth grade testing can be found at http://www.nces.ed.gov/timss/

Therapeutic Touch

Therapeutic touch (TT) has 40,000 practitioners in North America who can "palpably sense an energy field that extends 10 centimeters beyond the surface of the skin, and can manually smooth the field". It is endorsed by major nursing organizations and offered by 70 hospitals in the U.S. For two decades, TT therapists have concentrated on which diseases can be treated, without first demonstrating that such a field, if it exists, can even be detected. The James Randi Educational Foundation has offered a million dollars to anyone who can demonstrate that they can detect such an energy field. Now, in a beautifully simple experiment designed as a science fair project, a 9-year-old girl (Emily Rosa, now 11) persuaded 21 TT practitioners to submit to a simple test in which they were asked (without looking) which of their hands was near Emily's hand. In 280 trials, they scored 44% (random chance would give 50%). The paper was published in the Journal of the American Medical Association, making Emily one of the youngest (the youngest?) authors ever of a scientific paper. It thus appears that there is nothing for therapeutic touch practitioners to touch.
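For readers who want to check the statistics, the short sketch below estimates how far the quoted 44% lies from chance, assuming the figure corresponds to about 123 correct calls out of the 280 trials; the count is inferred from the percentages quoted here, not taken from the JAMA paper itself.

    # Rough significance check: are ~123 correct calls out of 280 trials
    # consistent with pure guessing (p = 0.5)?  Counts inferred from the
    # 44% figure quoted above, not from the published paper.
    import math

    n = 280
    correct = round(0.44 * n)     # ~123 correct identifications
    mean = n * 0.5
    sd = math.sqrt(n * 0.5 * 0.5)
    z = (correct - mean) / sd
    p_value = math.erfc(abs(z) / math.sqrt(2))   # two-sided, normal approximation
    print(f"correct: {correct}/{n}, z = {z:.2f}, p = {p_value:.2f}")
    # z is about -2: the practitioners did no better than chance
    # (if anything, slightly worse), consistent with the article's conclusion.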

Changes In Graduate Education

The House Science Committee recently held a hearing, intended to provide input to the National Science Policy Study, on the progress graduate science and engineering education has made in preparing students for a wider choice of careers. The Committee Chair, physicist and Congressman Vern Ehlers, said that "the era of perpetual expansion of the academic enterprise is over, yet we continue to train scientists at the same rate and in the same way---which is to focus them on careers in academia". Testimony indicated that graduate programs in engineering have adapted more effectively than those in the sciences.

David Goodstein, Vice-Provost of CalTech, noted that while the fraction of undergrads continuing on to graduate school in physics has declined, the influx of foreign grad students has allowed us to pretend nothing's changed. He called the U.S. system a "mining and sorting operation that selects the best and discards the rest". This results in a surplus of highly trained Ph.D.s and scientific illiteracy in everyone else. A student from Johns Hopkins, Catharine Johnson, added that "the system is designed to replenish the ranks of academia...and is too narrowly focused on training specialists in a market that increasingly needs generalists". Philip Griffiths, Director of the IAS, said that the system "promulgates a misalignment between how graduate schools train students and what employers are seeking--skills in communication and teamwork, experience in applied and multidisciplinary research, and adaptability". He blamed the misalignment on the way graduate education is funded: research grants to faculty members support most grad students, but force them to work on specific projects. He praised the NSF for offering a variety of new and interesting programs to give graduate students more varied experiences. When the committee members questioned whether anything should be done about the number of foreign graduate students, none of the witnesses disagreed with the statement that "when international students don't want to come to the U.S. anymore, that's when we have to worry."

Commission On The Advancement Of Women In Science

Rep. Connie Morella of Maryland, chair of the Technology Subcommittee of the House Committee on Science, has introduced legislation (HR3007) called "the Advancement of Women in Science, Engineering and Technology Development Act", which establishes a commission to study what factors have contributed to the relative lack of women in science, engineering and technology, and to issue findings and recommendations to improve practices related to recruiting, retaining and advancing women scientists and engineers. On March 10th, the Association for Women in Science (AWIS) testified in support of this legislation. The bill was marked up in the Technology subcommittee. According to a committee staff member, "The subcommittee marked up a bill establishing a commission.... However, the mention in the bill of the need to promote workforce diversity, the problem of under-representation of women in these fields compared to men, and other specific mentions of the problems the commission was to study caused problems with conservative Members of the subcommittee so these problems were struck from the bill. We now have a commission studying women in science, engineering and technology development that never states what the problem to be studied is. This should not be a difficult report to write...." It is quite possible that the bill will be strengthened in the full committee. The bill can be found at http://thomas.loc.gov by entering HR3007 into the 'search bill number slot'. The testimony of AWIS and others can be found at http://www.house.gov/science/hearing.html#Technology

Westinghouse Science Talent Search Now The Intel Science Talent Search

In the last issue, we reported the possible demise of the Westinghouse Science Talent Search, a 57-year-old institution often considered the "Nobel Prize" competition for U.S. high school seniors. A new title sponsor, the Intel Corporation, has been found. The Intel Science Talent Search is administered by Science Service, a nonprofit organization that promotes public understanding and appreciation of science. Craig Barrett, Intel's president, commented: "as scientists, we understand the importance of the Science Talent Search--the importance of challenging our students to come up with the next inventions, the next great ideas, the next important scientific discoveries. It is critical to encourage students' spirit of discovery and enthusiasm from their earliest days in school".

The United States and Arms Control: The Challenge of Leadership

Allan S. Krass, Praeger Publishers, Westport, CT, 1997, ISBN 0-275-95947-3

Although the pace of arms control has slowed since 1993, there has been some good news: The weapons reductions under START, the indefinite extension of the Nonproliferation Treaty (NPT), the Russian and U.S. ratification of the Chemical Weapons Convention, and the agreement of the five nuclear weapons states to sign a Comprehensive Test Ban Treaty (CTBT) banning all nuclear tests. But beyond these bright spots there is a sense that both the Congress and the Duma have other agendas on their minds. Krass's book comes at a good time to help us sort our way through the reasons for inaction.

The United States And Arms Control is organized around several themes: A history of arms control during 1946-86, developments since 1986, the U.S. arms control bureaucracy and its problems, costs and benefits of implementing treaties, persistent noncompliance problems, implications for the stability of existing arms control regimes, and prospects for new agreements.

The first four decades of arms control treaties did not have on-site inspections (OSIs) but instead relied on national technical means, such as satellites, to determine compliance. The book covers all treaties, but for brevity we will confine ourselves to nuclear strategic arms treaties, nuclear testing treaties, and the NPT. Krass emphasizes the implementation aspects of the treaties, a topic usually avoided because of its mind-bending complexity.

First the strategic treaties: The 1987 Intermediate Nuclear Forces (INF) Treaty paved the way for on-site inspections, allowing inspectors to be more intrusive. As part of this increased scrutiny, the criteria for the quality of verification changed from "adequate" to "effective." During INF Treaty ratification, Ambassador Paul Nitze defined "effective" by stating: "...if the other side moves beyond the limits of the Treaty in any militarily significant way, we would be able to detect such violation in time to respond effectively and thereby deny the other side the benefit of the violation." During the ratification of START, Secretary of State James Baker added that "the verification regime should enable us to detect patterns of marginal violations that do not present immediate risk to the U.S. security." Of course, more verification can lead to more trust, leading to more verification, and so on. At some point the U.S. and Russia must determine how much verification is enough. If one calculates "draw-down curves" to determine the survivable forces, it is clear that the U.S. nuclear force will survive. Nobody seriously disputes the invulnerability of the Trident force.

But having cops on the beat sets a tone for safer streets. In my opinion, the political atmosphere pushes us towards overly complex verification regimes that dampen possibilities for further cuts. Krass's earlier book on verification nicely discusses this issue. There are wise verification measures and there are superfluous ones. An excellent one in the START treaty is the re-entry vehicle OSI. The idea of actually counting the number of warheads on SS-18s and Peacekeepers was and is a great breakthrough. If, indeed, we are to go to lower numbers of warheads, this will be the most important compliance tool.

Nuclear Testing Treaties: The expensive Corrtex measurements for the Threshold Test Ban Treaty (TTBT) allowed the U.S. and the former Soviet Union to calibrate each other's test sites for tests at 150 kilotons. This act was more political than technical, since the charges of "likely violation" of the TTBT were known to be untrue by the seismologists and others--a fact that the Reagan policy community didn't want to believe. With the 1996 signing of the CTBT at the U.N., new verification and implementation tasks will need to be fulfilled. Most understand that the CTBT is both an arms control treaty and a non-proliferation treaty. But implementation will be difficult because of the requirement for 44 nations to ratify and because of the complex data gathering and analysis. Nonetheless, it is worth this hassle to strengthen the NPT regime, since many non-nuclear weapons states see the NPT regime as favoring the five nuclear-weapons states.

Nuclear Nonproliferation: The NPT was indefinitely renewed in 1995. As part of the bargain, the nuclear weapon states stated that they will commit to the CTBT. These five nations also gave assurances not to use nuclear weapons against non-nuclear states, assurances which we now know have possible exceptions for certain actions by chemical-weapon states. The International Atomic Energy Agency cannot be blamed for the failure to detect Iraq's nuclear program, since the IAEA inspectors were confined to declared sites. However, in the wake of the 1991 findings on Iraq, the IAEA has instituted challenge inspections, inspections of North Korean nuclear facilities, and environmental inspections looking for clues. All these new tasks, plus the increasing number of facilities in the former Soviet Union and elsewhere, make for a tight IAEA safeguards budget. The challenge will be to save money by shifting the more mundane inspection tasks to automated technology.

The "Bureaucratic Evolution" of the U.S. arms control process: The wiring diagrams between the various U.S. leadership bureaus is complicated greatly by the many component parts of each agency. The dynamic seems "anti-synergistic," because the whole can be less than the sum of the parts. Of course, one needs "effective" verification, but the verifiers often raise arguments against a provision that goes beyond mere logic. By and large, the process is made up of good people, but then with a clever flip of the wrist the issue is stalled. Only with a hard-driving NSC that uses a science-court approach of hard-hitting questions and answers is there going to be a smoother and happier process. Krass lays out the internal process of data exchanges, notifications, inspections, analysis, compliance decisions, and more.

Estimated costs of arms control treaties: The General Accounting Office estimated inspection costs for the INF Treaty at $7.5 M, and for the two continuous perimeter-portal monitoring operations at Magna and Votkinsk at $12 M per year. These costs are trivial considering that the former Soviet Union destroyed 1,846 missiles and the U.S. destroyed 846 missiles. These and other costs will ultimately add up to a billion dollars, and are certainly much less than the annual savings of more than $100 billion from lowered post-Cold-War defense budgets.

I found this book to be both a welcome analysis of the current crop of arms control treaties and a useful guide to some of the troubled implementation areas that will probably constrain arms control in the future. Only by getting into the pesky details of treaty implementation can we learn to overcome the pitfalls of the past. The book is well written, even lively for such a drab but important subject. If The United States And Arms Control comes out in a paperback version, I will use it as a required text on arms control matters. If it remains expensive, I will assign it as library reading.

David Hafemeister

Physics Department

California Polytechnic State University