Archived Newsletters

Letters

The letters pages are dedicated to free expression on societal topics of interest to the physics community. As a forum for all physicists we welcome all views, but of course the Forum on Physics and Society does not necessarily endorse any particular view found in these pages. Readers are most heartily invited to respond to letters, comments, or other items in Physics & Society.

Against Fiction in P&S

Your decision to publish the fps via the web is probably a good one, since it will be much cheaper and you will therefore be able to include more items. You might also include a table of contents for the solely electronic issues in the following hardcopy issue, so that it is a matter of written record just what was "published". Your decision to include fiction, however, strikes me as extremely unwise. As I see it, the fps is a serious discussion of, and commentary on, important questions confronting our society. These questions (e.g., global warming, nuclear stockpiling, SDI, decontamination of former nuclear materials sites) will involve billions of dollars, questions of public health and safety, and questions of national security. As physicists we are uniquely equipped to deal in hard facts as well as informed opinions. Thus we are qualified, for example, to be cited by, say, the New York Times, by congressional reports, or by other first-rate sources for the general public. Any respectable journal that mixes in fictional material will inevitably be viewed less seriously than one that does not. When I read fictional material about these important questions, I tend to believe that I am being fed pure propaganda. The next step is to wonder just how much propaganda is contained in the rest of the journal. A mix of fiction and what purports to be fact produces a slightly hysterical tone, reminiscent of many underground publications that we all saw during the late nineteen-sixties. What you are doing is degrading the status of the fps in the eyes of the general reader. Indeed, one then wonders how many of the supposedly serious articles contain some fiction or fantasy.

Before you conclude that the inclusion of fiction should be a policy of fps, I hope you will call for a written ballot by the full membership of fps. This is a major policy shift and I cannot believe that a single person is making it unilaterally!

Carl Iddings

Editor’s Comment

As our readers should know by now, there were two innovations in our October issue: its publication exclusively on the web and its inclusion of fiction. Neither "policy shift" was made "unilaterally" by the editor - I was initially against the first and dubious about the second. The first was made by the Forum's Exec Board; the second was suggested by the Forum Chair and concurred in by the Editorial Board and my editorial colleagues. So far, we have had a number of comments from our readers on the first shift, and only the one above on the second. Most comments have been laudatory - the only common complaint being that it required too many steps to print out the entire issue, section by section. (We have taken steps to eliminate that problem - there is now a "one-step button".) Counter to my expectations, there was only one complaint about the absence of a hard-copy issue - most, if not all, of our members apparently have ready access to a web-capable computer. The only difficulty I anticipate now is the possible loss of non-subscriber readers who ordinarily get to us via library shelf browsing. I hope to alleviate this problem by incorporating the Table of Contents of the web editions into the following paper editions; library browsers can then use the library's web access to read those previous web-issue pieces to which they have been alerted by the library hard copy. As always, we welcome comments and suggestions from our readers - on these shifts as well as on other Forum-related matters. Physics is an "experimental science"; experiments are only experiments if they are continuously evaluated. We hope for no less with our physicists' journal.

Al Saperstein

Demarcation Between Science and Non-Science

I have to agree with William Butler and disagree with Derek Walton regarding the ability to distinguish between science and non-science. Despite what any philosophers of science might claim to the contrary, I think there is a clear and widely accepted distinction between scientific claims and non-scientific ones.
Simply put, any scientific statement must be testable, and thereby falsifiable. A good scientific theory can never actually be proven, but it can certainly be disproven. Therefore scientific theories must continuously evolve to conform to new evidence as it arises. It is a hallmark of non-scientific theories such as "scientific" creationism to maintain the theory intact, and develop contorted "explanations" to "invalidate" the evidence.

Consider the statement, "God created the universe 'as is' 6000 years ago." There is no evidence that can possibly disprove this statement. "As is" presumably means complete with fossils in the ground, light in transit from stars billions of light years away, and all the other physical evidence that points to a much older universe. One can make no predictions of what will be discovered (or, more importantly, what can be ruled out) as a result of this hypothesis. The statement is not scientific. On the other hand, if we take the statement, "Life on earth evolved over 4 billion years through a continuous process of random mutation and natural selection," there are certainly pieces of evidence that we can imagine that would render the statement false. In fact, while our understanding of the sequence and mechanisms of evolution is by no means complete, and has been continually improved and modified since the hypothesis was first proposed, all of the evidence gathered to date supports some variation of this evolutionary theory. The statement is scientific because it leads to testable predictions, and the fact that the available evidence does not violate any of those predictions means that it continues to be a viable theory.

Nothing prevents non-scientists, pseudo-scientists, charlatans and frauds from making scientific claims. However, if such claims are truly scientific they will be testable. In many cases such claims have already been tested and found lacking, in which case it becomes incumbent upon us to present the evidence and expose the errors in the claims. Homeopathic medicine practitioners have been claiming for years that there is therapeutic value in taking doses of poisonous substances that have been diluted to concentrations that are, in some cases, less than one molecule in the world. This clearly scientific claim is easily testable, though for some reason there seems to be little interest in carrying out such tests.

False scientific claims and theories provide no difficulty for science. They are easily dismissed if they have no usefulness, or maintained as convenient models if they do. Newton's mechanics, after all, has been known for a century to be false, but it remains such a useful approximation for so many problems that it continues to be more widely taught than the more accurate relativistic and quantum mechanical theories. The true test of a scientific theory may be not truth, but rather utility. No, the greatest difficulty comes when people make non-scientific claims and then insist (through the political process) that those claims be treated on an equal footing with scientific ones. The average person, who is unfortunately inclined to be at best marginally scientifically literate, is far too easily swayed by the argument that one cannot prove that evolution is right and creationism is wrong. In fact, we cannot prove such a claim, but that does not mean that the two theories are on an equal footing. As any freshman philosophy student can tell you, we can no more prove that the universe was not created fresh this morning when we got out of bed than we can prove that the universe exists at all. As scientists, though, we must begin by agreeing that measurements, even in quantum mechanics, provide us with information about reality.

It is certainly true that scientists do not agree on an exact statement of the scientific method. To some degree this is because the exact methods differ between disciplines. A field such as astronomy, which relies heavily on "found" evidence (observing what is there without the ability to manipulate the conditions), must necessarily proceed along different lines than a field such as solid state physics, where most conditions can be fully controlled within a laboratory environment. Nevertheless, even without a universally accepted definition of the term, I think all scientists share a common understanding of what constitutes science and what does not. Given a list of statements, I think the vast majority of scientists would agree on which were scientific and which were not, even though they might disagree on the merits of the statements. Much of the confusion arises because too often no attempt is made to distinguish between bad science and pseudo-science.

Bad science occurs when people make scientific claims that are contradicted by the facts. This can happen by mistake, or it can be the result of a deliberate attempt to mislead. In either case, it is relatively easy to uncover (though history indicates that uncovering the error does not necessarily stop people from continuing to make the claim). Pseudo-science occurs when people take non-scientific claims and try to pass them off as scientific statements. Such pseudo-scientific claims cannot be disproven, and the best that can be accomplished is to convince people that they are simply non-scientific statements of belief without supporting evidence.

Dr. Scott C. Smith
Mailstop 532-2 Lockheed Martin NE&SS 199 Borton Landing Road
Moorestown, NJ 08057-0927
Phone: 856-787-3656 Fax: 856-787-3344

The APS in an Age of Litigation

Harry Lustig

Read as an invited paper at the March 2000 meeting of the American Physical Society, in a session on "Physics and the Law" arranged by the Forum on Physics and Society. Most of the material is adapted from the author's article in the American Journal of Physics (see Reference 1).

For the first eighty-seven years of its existence, the APS had never been involved in a lawsuit. Since 1987, however, the Society has defended itself in the courts in four cases. All of the litigation was connected with publishing. Two plaintiffs sued the Society for not publishing their work, one sought to punish APS for publishing an author's work, and in one case a third party attempted to compel the Society to reveal the identity of a referee of a manuscript that was not accepted for publication.

In spite of the fact that the Physical Review almost certainly provides more opportunities to appeal the recommendations of referees and the decisions of editors than any other journal, there have been aggrieved authors before and since. A few have turned nasty--for example, editors and the Society were accused of being part of a Jewish conspiracy to prevent the dissemination of a refutation of Einstein’s relativity. Some have implicitly or explicitly threatened lawsuits, but before 1987, none of these threats ever materialized.

In 1987, the APS was notified that it was the defendant in a federal suit seeking damages of $44,500,000. The plaintiff, who had very limited training in physics, alleged that the Society had deprived her of seven Nobel prizes by rejecting a manuscript that she had submitted to the Physical Review. She claimed that her invention, the "Qaddafi Field", encompassed and replaced all the fundamental laws of physics. (She is also reported to have said the APS had tried to poison her coffee and had killed Physical Review D editor Dennis Nordstrom for having promised to accept her manuscript. Her estranged husband reported that she was carrying a gun.) The courts were able to deal swiftly with the case, although the plaintiff, who represented herself, did try to pursue her complaint all the way to review by the Supreme Court. This bizarre and depressing case had perhaps one redeeming aspect: it showed that every citizen who feels aggrieved can access our judicial system.

The other instance in which the APS was accused of improperly refusing to publish an article presented quite a different issue. Physical Review A had accepted for publication an article by a University of Maryland research associate, Thomas Kiess. When his colleagues found out about it through prepublication of the abstract, they complained that the work had been done collaboratively with them and that publishing the article without recognition of them would be improper. The editors of the Physical Review decided to suspend publication unless and until the matter could be resolved. Kiess thereupon sued his collaborators and the APS, claiming that the refusal by the Physical Review to proceed with the publication of his manuscript, after he had received notification of its acceptance, constituted a breach of contract. The APS responded that there was no contract, that the Physical Review's letter of acceptance contained conditions that were never satisfied, and that even if the parties had created a contract, there was no breach because the journal had not refused to publish the manuscript. The court agreed and dismissed the case against the APS.

Again in 1987, the APS was served with a subpoena by attorneys for Arco Solar, the defendant in a patent infringement case brought by Solarex Corporation and RCA Corporation. Arco Solar sought the name of a referee for a manuscript that described results that could improve amorphous silicon p-n junctions. The manuscript had been submitted to Physical Review Letters but was rejected for publication on the strength of one of the two referees' reports. That report stated that more experiments were required and that the article was better suited to the Physical Review. The other referee had recommended publication and had revealed his name and his action to the authors. As an element of its defense strategy, Arco Solar wanted to maintain that the submission constituted "prior art" which invalidated the patent at issue. The idea was to claim that the negative referee had disclosed the new process to colleagues and, perhaps, to other competitors. In order to pursue this theory, Arco Solar needed to know the identity of the referee.

While willing to turn over the referee’s opinion and all other documents connected with the case, the APS remained true to the common practice in scientific publishing and declined to reveal the name of the referee. APS’s attorney, Richard A. Meserve, argued that non-disclosure was essential to the preservation of the peer-review process. (Meserve is a physicist and was a lawyer until late 1999, when he became head of the Nuclear Regulatory Commission. He has successfully represented the Society in all the cases described here.) The US Magistrate who heard the case found unequivocally in favor of APS, observing that "...the Society had demonstrated a strong interest in preserving the confidentiality of its reviewer’s identity", and the District Court and the Court of Appeals affirmed the ruling. The decision, while not establishing an absolute right to keep the names of referees confidential, nevertheless provided a presumptive precedent for it. David Lazarus, APS’s quondam editor-in-chief, had the satisfaction of being able to report on the outcome in Physics Today.

The most vexing litigation, directed against APS, AIP, and an individual APS member by the Gordon & Breach publishing group and its several companies, has, at this writing, been under way for more than a decade. It has its origins in two articles by H. H. (Heinz) Barschall of the University of Wisconsin, one written with a collaborator, which reported the cost per printed character, the impact, and the cost-effectiveness of more than 200 physics journals from nearly sixty publishers. The cost per printed character (perhaps better, "price per printed character") is the amount paid by an American library for an annual subscription to a journal divided by the total number of characters published in that journal during the year. (In order to neutralize the effect of variations in typography and page size, Barschall used characters, rather than pages, in the denominator.) The impact is the average number of times that articles from a journal were cited during the two years following their publication, as determined by the Institute for Scientific Information. It was Barschall's innovation also to report the quotient of the price per character divided by the impact, yielding the cost-effectiveness.
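
As a concrete illustration of the arithmetic behind Barschall's three measures, here is a minimal Python sketch. The journal names and numbers are hypothetical, invented only to show the computation; lower cost-effectiveness values indicate a better buy.

```python
# Barschall's journal metrics (hypothetical numbers, not his data):
#   price per character = annual library subscription price / characters per year
#   impact              = average citations per article in the two years
#                         following publication
#   cost-effectiveness  = price per character / impact   (lower is better)

journals = {
    # name: (annual price in $, characters per year, impact)
    "Journal A": (12_000, 40_000_000, 2.5),
    "Journal B": (4_000, 25_000_000, 1.0),
}

for name, (price, chars, impact) in journals.items():
    price_per_char = price / chars
    cost_effectiveness = price_per_char / impact
    print(f"{name}: {price_per_char:.2e} $/char, "
          f"cost-effectiveness {cost_effectiveness:.2e}")
```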

Barschall’s study showed that the Gordon & Breach journals were by far the most expensive and least cost-effective in the survey - not only when compared to those of not-for-profit societies such as APS and AIP, but also to those of other commercial publishers. This result was not explicitly pointed out in the articles; however, readers could calculate that there was a factor of eighty difference between the journal with the highest price per character on the list, Gordon & Breach’s Physics and Chemistry of Liquids, and that with the lowest, the American Astronomical Society’s Astrophysical Journal. With the impact figured in, the cost-effectiveness difference between these two journals became a factor of 409.

The officers of the two societies and Barschall were surprised to receive complaints from lawyers for Gordon & Breach about the articles, together with a demand, under threat of litigation, for the publication of a prescribed retraction and an apology. Barschall and the societies checked the allegations of error and concluded that they were either without foundation or insignificant. Nevertheless, Physics Today offered Gordon & Breach space for a statement setting out its objections to the Barschall articles, subject only to allowing the author to respond to allegations of error. This offer was summarily rejected, though APS and AIP kept it open as a route to resolving the complaint during the many years of the ensuing litigation.

As to the threats of a lawsuit, the officers of the societies found it hard to believe that the accuracy of Barschall’s study and the action of the societies in publishing it would even be considered by the courts on their merits. Wasn’t there a constitutionally guaranteed right of free speech and publication? In 1989, when suits were launched in Germany, Switzerland, and France against Barschall, APS, and AIP, the defendants had to contend with the fact that free-speech protection was not as strong in those countries as in the United States. "Unfair competition" laws significantly restricted the right to compare the prices and quality of products, even truthfully and accurately. The suits demanded retractions and the publication of prescribed apologies in the Bulletin and in Physics Today, injunctions against the publication of any further studies by Barschall or by anyone applying similar methodologies, and damages.

Even in the face of these strict laws, though at great effort and expense, Barschall and the societies won their cases in Germany (as early as 1991, after unsuccessful appeals by Gordon & Breach all the way to the federal supreme court [Bundesgerichtshof]) and in Switzerland. In France, in the face of a law that Gordon & Breach would interpret to forbid any comparison of the prices of products that are not completely identical, the trial court found against the defendants. After several see-saw actions, the Cour d'Appel, on 21 June 2000, dismissed the G&B complaints against the societies and Barschall's heirs (Barschall had died on February 4, 1997) one by one, finding that the articles were neither denigrating, nor misleading, nor in any other way in violation of French law. At this writing it is not clear whether G&B will be allowed to file yet another appeal before the Cour de Cassation.

APS and AIP had another unwelcome surprise when, on September 23, 1993, Gordon & Breach filed suit in the United States under a statute called the Lanham Act, which regulates advertising and requires that "commercial speech" not be false or misleading. The pursuit of the litigation in the US exceeded, in effort and cost, even the burdensome activities in Europe. Dozens of briefs and counter-briefs were filed and motions were made; hundreds of documents were produced; and more than a score of witnesses were examined by deposition or at trial.
In August of 1994 the Federal District Court for the Southern District of New York issued its first substantive decision. The judge dismissed the Lanham Act claims against Physics Today and the Bulletin because, as scholarly publications, they enjoyed the protection of the First Amendment to the US Constitution. The matter did not rest there, because Gordon & Breach asserted that "secondary uses" had been made of the articles; these included the distribution of a draft to some librarians at a convention and the favorable mention of the Barschall findings, with the phrase "tell your librarian about it", in the 1988 annual presidential letter to the membership by APS President Val Fitch. The judge ruled that these uses of the survey should be subject to examination under the Lanham Act.

The ruling opened the way to "discovery" of documents from APS’s, AIP’s and Barschall’s files, which were used by the plaintiffs to try to make a case that the research and publication of Barschall’s findings resulted from a conspiracy between the authors and the societies to hide the alleged commercial intent of the undertaking. The allegation of a conspiracy was based largely on the fact that before the initiation of the survey there had been some consultation between Barschall and APS and AIP officials about whether a survey would be useful, and while it was in progress, some correspondence about Barschall’s work. During one of these exchanges the APS treasurer made the suggestion that Barschall might explicitly mention the favorable results for the societies’ journals. The suggestion was rejected.

The reason put forward by Gordon & Breach for the societies’ allegedly enlisting Barschall in a marketing effort was their need to maintain library subscriptions in the face of sharply increased prices. While it is true that prices had to be raised, in the face of declining numbers of subscriptions and of income from page charges, most of the increase was necessitated by the steep increases in the numbers of articles submitted and published. The societies clearly welcomed the news that, in spite of the increases, their journals were still extremely cost-effective. Some in AIP and APS did see an advantage in publicizing the results among librarians while others saw their value in justifying the painful price increases that were necessary to keep the journals solvent.

After a seven-day trial in June of 1997, the court deemed Barschall’s methodology sound and his results free from errors. The court also found irrelevant the Gordon & Breach complaint that Barschall did not examine or report on the costs of producing its journals. More significantly, the court gave credence to what the societies had discovered during the long years of litigation: "the evidence persuasively demonstrated that the present suit is but one battle in a ‘global campaign by G&B to suppress all adverse comments upon its Journals’". APS and AIP were able to document ten instances of (mostly successful) attempts at intimidation of librarians, professors, and research scientists in the US and abroad. In January of 1999 the U.S. Court of Appeals for the Second Circuit rejected Gordon & Breach’s appeal of this decision.

Except for one possible final appeal by G&B in France, this appears to be the end of the case. APS and AIP can celebrate not only their victory (tempered only by the fact that Barschall did not live to see it) but also their steadfastness and willingness to persevere in the face of a large drain of money. The credit for this persistence belongs not only, or even primarily, to the determined operating officers (in the case of AIP, first Kenneth Ford and then Marc Brodsky), but to the Council and Executive Board of APS and the Governing Board of AIP. On the many occasions on which the case came before these bodies for discussion and decision, in the face of some sentiment to settle it, even on restrictive and dangerous terms, the representatives of the membership stood firm. Among the APS presidents during the period, Burton Richter and Kumar Patel should be singled out for their steadfast leadership; in the AIP, Hans Frauenfelder led the Governing Board in refusing to abandon the case, except on honorable terms. Although the case had no broad economic implications for AIP and APS, the societies pursued it as a battle in defense of truth, free expression, and the competence and integrity of a valued member.

Harry Lustig
Professor of Physics and Provost Emeritus
City College of the City University of New York
Treasurer Emeritus of the American Physical Society
Adjunct Professor of Physics at the University of New Mexico.

References/Footnotes

  1. H. Lustig, "To Advance and Diffuse the Knowledge of Physics--an account of the one-hundred year history of the American Physical Society", Am. J. Phys. 68, 595 (2000).
  2. Kiess v. Rubin, Civ. No. 95-CV-01267 (Md. Cir. Ct. 1995).
  3. Solarex Corp. v. Arco Solar, Inc., 121 F.R.D. 163, 179 (E.D.N.Y. 1988); Arco Solar, Inc. v. American Physical Society, 870 F.2d 642 (Fed. Cir. 1989).
  4. David Lazarus, "In Defense of Confidentiality", Phys. Today 42, 57 (1989).
  5. Henry H. Barschall, "The cost-effectiveness of physics journals", Phys. Today 41, 56 (1988).
  6. H.H. Barschall and J.R. Arrington, "Cost of physics journals: a survey", Bull. Am. Phys. Soc. 33, 1437-1447 (1988).
  7. Gordon and Breach Science Publishers S.A. v. American Institute of Physics, 859 F. Supp. 1571 (S.D.N.Y. 1994).
  8. OPA (Overseas Publishing Ass'n) Amsterdam B.V. v. American Inst. of Physics, 973 F. Supp. 414 (S.D.N.Y. 1997).
  9. Irwin Goodwin, "Federal Court Rules for APS and AIP in Dispute with Gordon & Breach over Survey of Journals", Phys. Today 49, 10 (1997).
  10. Extensive documentation about this case is available on the Web at http://barschall.stanford.edu and at http://www.library.yale.edu/barschall

Experiments In International Benchmarking of U.S. Research Fields

Marye Anne Fox and Robert M. White
As a nation, we support a large research enterprise whose funding must be allocated consistently and fairly. At present, there is no reliable tool for evaluating the quality of federally funded research programs and providing a basis for the subsequent allocation of funds to those programs. The National Academies Committee on Science, Engineering, and Public Policy (COSEPUP) has been testing the validity of international benchmarking - comparing the quality and impact of research in one country or region with world standards of excellence - as a possible tool for funding allocation. COSEPUP has run a series of experiments to test the use of benchmarking as a tool for understanding both the relative world standing of US research in a field and the factors that are critical to US leadership in that field. This article describes the background behind benchmarking and the methodologies and results of COSEPUP’s experiments in internationally benchmarking three fields: mathematics, materials science and engineering, and immunology.

Background
In 1993, COSEPUP issued its report Science, Technology, and the Federal Government: National Goals for a New Era, which recommended that the federal government continue vigorous funding of basic research and seek to support basic research across the entire spectrum of scientific and technological investigation. Specifically, the report made two recommendations: First, that

"The United States should be among the world leaders in all major areas of science,"

and second, that

"The United States should maintain clear leadership in some major areas of science."

By following these recommendations, the U.S. would position itself among world leaders in all major fields of research and would be ready to apply and capitalize on research advances wherever they may occur.

Two years later, in 1995, a committee (on which I was a member) chaired by Frank Press, the former president of the National Academy of Sciences, stated that "to continue as a world leader, the United States should strive for clear leadership in the most promising areas of science and technology and those deemed most important to our national goals. In other major fields, the United States should perform on a par with other nations so that it is ‘poised to pounce’ if future discoveries increase the importance of one of these fields."

The committee also considered how the federal government could gauge the overall health of the research enterprise and determine whether national funding is adequate and supportive of national research objectives. The committee wrote that it is possible to monitor US performance with field-by-field peer assessments, which may be accomplished by:

"...the establishment of independent panels consisting of researchers who work in a field, individuals who work in closely related fields, and research ‘users’ who follow the field closely [can provide that kind of evaluation]. Some of these individuals should be outstanding foreign scientists in the field being examined."

The technique of comparative international assessments, or what we at COSEPUP came to call "international benchmarking," had been discussed at that time, but had not been practiced. Accordingly, COSEPUP decided to undertake a set of experiments to test the utility of international benchmarking in evaluating entire research fields.

The committee acknowledged that the quantitative indicators often used to assess research programs (for example, dollars spent, number of papers cited, and number of researchers supported) all provide valuable information but are not, by themselves, sufficient indicators of leadership. Such quantitative information is often difficult to obtain or to compare across national borders, and it illuminates only a portion of the research process.

COSEPUP decided that benchmarking should rely more prominently on the judgement of experts. The premise for this decision was that only leaders of a particular research field are in a position to judge leadership. COSEPUP charged each panel to provide answers to three primary questions:

  1. What is the position of US research in the field, relative to that in other regions or countries?
  2. On the basis of current trends, what will be our relative position in the near and longer-term future?
  3. What are the key factors influencing relative US performance in the field?

The committee deliberately chose three fields that span a range of scope and subject matter. Of the three, mathematics is the closest to being a traditional discipline, but it is broad in the sense that it provides a language and a tool used by many other research fields. Immunology is not a disciplinary field in the traditional sense; it embraces many disciplines, including biochemistry, genetics, and microbiology. Materials science and engineering spans an even broader range of disciplines.

Benchmarking Methodology
Determining how one country stands relative to another in science involves a great deal of subjective judgement. The first critical step in an international benchmarking process is the selection of panel members. Once selected, each panel divided its field into subfields. Each panel then used a variety of methods to assess its subfields, including:

  • "Virtual congress"
  • Citation analysis
  • Journal publication analysis
  • Quantitative data analysis (e.g., number of graduate students, funding)
  • Prize analysis
  • Invited speakers at international congresses

The first method used was the "virtual congress," in which each panel asked leading experts in the field to identify the "best of the best" researchers in particular subfields, anywhere in the world. For example, the members of the materials science and engineering panel asked colleagues to identify, for each of nine subfields such as ceramics and polymers, five or six current hot topics and eight to ten of the best researchers in the world. The information was used to construct tables that characterized the relative position of the United States in each of the subfields.

While the "virtual congress" may determine how the U.S. currently stands relative to other countries, additional factors must be considered to predict its future ranking. The materials science and engineering panel developed what it called "determinants of leadership." These are factors that indicate the level of research likely to occur in the future. They include:

  • National Imperatives
  • National Innovation Systems
  • Major Facilities
  • Human Resources
  • Funding

National imperatives are national objectives whose pursuit is likely to produce scientific leadership as a byproduct. The Cold War, for example, drove the development of materials for stealth aircraft.

Another method used by the panels was citation analysis, which is traditionally used to evaluate the international standing of a country’s research in a field. Each panel used an analysis by the United Kingdom Office of Science and Technology to evaluate US research quality. This analysis included both the numbers of citations and the "relative citation impact", which compares a country’s citation rate (the number of citations per year) for a particular field to the worldwide citation rate for that field. This latter measure takes into account the size of the US enterprise relative to that in other countries. (See the Web edition of this issue for tables containing some illustrative numerical results. [powerpoint format file])
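
The "relative citation impact" lends itself to a one-line computation. The sketch below follows the definition given above, with one added assumption: the rates are normalized per published paper, so that the measure accounts for the size of each country's enterprise, as the text indicates. All numbers are invented for illustration.

```python
# Relative citation impact, per the definition in the text: a country's
# citation rate in a field divided by the worldwide citation rate in that
# field. Rates are normalized per paper (an assumption made here) so that
# a value above 1.0 means the country's papers in the field are cited
# more than the world average. All numbers below are hypothetical.

def relative_citation_impact(country_citations, country_papers,
                             world_citations, world_papers):
    country_rate = country_citations / country_papers
    world_rate = world_citations / world_papers
    return country_rate / world_rate

# e.g. a country with 50,000 citations on 10,000 papers in a field where
# the world total is 200,000 citations on 60,000 papers:
print(relative_citation_impact(50_000, 10_000, 200_000, 60_000))  # 1.5
```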

The immunology panel used a method called "journal publication analysis." The panel identified four leading general journals and one of the top journals that focused specifically on immunology. Panel members analyzed the tables of contents of each of the journals. In the general journals, they identified immunology papers and the laboratory nationality of the principal investigator; and in all the journals, they identified subfields. That allowed a quantitative comparison between publications by US-based investigators and publications by investigators based elsewhere.

The panels found it difficult to obtain suitable, unbiased quantitative information that would allow comparisons of the major features of the scientific enterprise (e.g., education, funding, and government agencies) in different countries. Mathematics was an exception, and it illustrates the use of human-resource figures as a consistent source of quantitative data. Data from the American Mathematical Society (attachment III) show that the number and proportion of non-US PhD recipients in mathematics increased by 78% from 1985 to 1995. Furthermore, in every year since 1990, foreign students have received more than half the PhDs awarded in mathematics in the United States.

Each of the panels analyzed the key prizes given in its field. For example, in mathematics, the key international prizes are the Fields medal and the Wolf prize. The numbers of non-US and US recipients of these medals were analyzed on the basis of where they now conduct their research.
Another condition that can be quantified is representation among the plenary speakers at international conferences. Although conference organizers strive for geographic balance by inviting speakers from different countries, speakership can serve as a useful indicator of quality when the relative number of US invited speakers is compared with the relative number of US publications.

Benchmarking Results
Each panel concluded that the US was at least among the world leaders in its field. However, each panel identified subfields in which the United States lagged the world leaders. Each panel also identified key infrastructure concerns.

The mathematics panel found that although the United States was currently the world leader in mathematics, this leadership was dependent upon foreign talent that came to the United States particularly preceding and during World War II and following the collapse of the Soviet Union. The difficulty of attracting U.S. talent to mathematics today and the unpredictability of recruiting foreign talent led the panel to be concerned about the future leadership status of the United States in this field.

In materials science and engineering, the panel found the U.S. to be among a few world leaders. Other countries are pursuing research in this field more aggressively, as it is essential to economic growth. A particular concern is that major facilities abroad are newer than those in the United States; neutron sources in Europe and Japan, for example, are younger than U.S. sources by 20-30 years.

The immunology panel found that the U.S. was the world leader in immunology. Although U.S. financial investment in this field overshadows that of other countries, the U.S. was not the leader in all subfields. Of particular concern was clinical immunology, since the restrictions of managed-care systems in the U.S., compared with those of other countries, make it more difficult to attract the patients needed for clinical studies.

COSEPUP Analysis of Benchmarking Experiments
After the panels’ work was completed, COSEPUP independently evaluated the panel results, along with comments from participants in a workshop that included White House, congressional, and agency staff as well as representatives of disciplinary societies. The reviewers who took part in the National Academies’ normal report-review process were of particular importance, as they were chosen to represent diverse industrial and academic backgrounds.
Based on these deliberations, COSEPUP found international benchmarking to be rapid and inexpensive compared with procedures that rely entirely on the assembly of a huge volume of quantitative information. In the words of one panel chair, "We were able to get 80% of the value in 20% of the time, for a far lower cost."

COSEPUP also found good correlation between findings produced by different indicators. For example, the qualitative judgments of the virtual congress were similar to the results of quantitative indicators, such as publications cited or papers delivered at international congresses. Lending credence to the technique was a parallel benchmarking experiment in mathematics conducted by the National Science Foundation that produced similar results to COSEPUP's math study, despite differences in panel makeup and the mandates of each organization.

The Government Performance and Results Act of 1993 (GPRA) emphasized the need for a method to assess the fruits of scientific and technological research investments by the federal government. COSEPUP’s GPRA report in 1999 concluded that the most effective means of evaluating federal research is to provide expert review of the quality, relevance, and leadership of research programs. Some elements of benchmarking may indeed provide useful input to agency strategies to comply with GPRA.

In summary, COSEPUP concluded that this experiment should be regarded as an encouraging first step toward the design of an efficient and reasonably objective evaluation tool. Additional benchmarking exercises could lead to more effective assessment methods, better understanding of the factors that promote research excellence, and better decision-making by those who fund science and technological innovation.

Marye Anne Fox
Chancellor, North Carolina State University
Office of the Chancellor Box 7001/A Holladay Hall Raleigh, NC 27695-7001
(919) 515-2191 Fax: (919) 831-3545

Robert M. White
Laboratory for Advanced Materials Dept. of Materials Science and Engineering
Jack McCullough Bldg., Rm. 349, MC 4045 476 Lomita Mall
Stanford University, Stanford, CA 94305-4045
(650) 736-2152 Fax: (650) 736-1984

References

  1. Committee on Science, Engineering, and Public Policy (COSEPUP). 1997. International Benchmarking of US Mathematics Research. Washington, DC: National Academy Press.
  2. Committee on Science, Engineering, and Public Policy (COSEPUP). 1998. International Benchmarking of US Materials Science and Engineering Research. Washington, DC: National Academy Press.
  3. Committee on Science, Engineering, and Public Policy (COSEPUP). 1999. International Benchmarking of US Immunology Research. Washington, DC: National Academy Press.
  4. Committee on Science, Engineering, and Public Policy (COSEPUP). 2000. Experiments in International Benchmarking of US Research Fields. Washington, DC: National Academy Press.
  5. Committee on Science, Engineering, and Public Policy (COSEPUP). 2000. Evaluating Federal Research Programs: Research and the Government Performance and Results Act. Washington, DC: National Academy Press.
  6. National Research Council (NRC). 1995. Allocating Federal Funds for Science and Technology. Washington, DC: National Academy Press.
  7. United Kingdom Office of Science and Technology. 1997. The Quality of the UK Science Base. London, UK: Department of Trade and Industry. March.

Photovoltaics: Energy for the New Millennium

Thomas Surek

Introduction
Photovoltaics (PV) is a semiconductor-based technology that directly converts sunlight to electricity. The stimulus for terrestrial PV came more than 25 years ago from the oil crises of the 1970s, which resulted in major government programs in the United States, Japan, Europe, and elsewhere. Ongoing concerns about the global environment, as well as worldwide efforts to seek alternate, local sources of energy, continue to drive investment in PV research and deployment. Today, the manufacture, sale, and use of PV is a billion-dollar industry worldwide, with more than 200 megawatts (MW) of PV modules shipped in 1999.
Over the past 25 years, research and development has led to the discovery of new PV materials, devices, and fabrication approaches; continuing improvements in the efficiency and reliability of solar cells and modules; and lower PV module and system costs. This article reviews the rapid progress that has occurred in PV technology from the laboratory to the marketplace, including reviews of the leading technology options, status and issues, and key industry players. Major ongoing efforts involve a better understanding of the crystal and film growth processes and the resulting material properties. New processes for fabricating PV materials and devices, and innovative PV approaches with low-cost potential are elements of an ongoing research program aimed at future improvements in PV cost and performance.

Progress, Status, and Research Directions of Photovoltaic Technologies
Photovoltaic technologies can be divided into two main areas: flat-plates and concentrators. In the flat-plate technologies, semiconductor material is used to cover as much area as possible on a flat surface, while balancing tradeoffs between material cost and conversion efficiency of light into electrical power.

Flat-plate technologies include thick cells of crystalline silicon (from both ingot and sheet-growth techniques) and thin films. Thin films are typically less than 100 µm of material (e.g., amorphous silicon, copper indium diselenide, cadmium telluride, or polycrystalline silicon) deposited by vapor deposition, electrodeposition, or wet chemical processes. Present thin-film approaches generally do not achieve conversion efficiencies as high as those demonstrated by crystalline silicon modules. In spite of this, development of thin-film approaches is an active area of research, since thin-film cells require only 1/10th to 1/100th as much expensive semiconductor material as crystalline silicon cells of equal collection area.
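
The cost-versus-efficiency tradeoff can be made concrete with the standard cost-per-peak-watt figure of merit: at the standard rating irradiance of 1000 W/m², a module delivers (efficiency × 1000) watts per square meter of collection area. The module costs in the sketch below are hypothetical, chosen only to illustrate why a cheaper, less efficient thin film can still win on this metric.

```python
# Cost per peak watt under standard test conditions (1000 W/m^2 irradiance).
# Module costs per square meter are hypothetical, for illustration only.

STC_IRRADIANCE = 1000.0  # W/m^2, standard rating condition

def cost_per_peak_watt(module_cost_per_m2, efficiency):
    """Dollars per peak watt for a module of given area cost and efficiency."""
    watts_per_m2 = efficiency * STC_IRRADIANCE
    return module_cost_per_m2 / watts_per_m2

# A 14%-efficient crystalline-silicon module at a (hypothetical) $500/m^2
# versus a 7%-efficient thin-film module at a (hypothetical) $150/m^2:
print(f"c-Si:      ${cost_per_peak_watt(500.0, 0.14):.2f}/Wp")  # ~$3.57/Wp
print(f"thin film: ${cost_per_peak_watt(150.0, 0.07):.2f}/Wp")  # ~$2.14/Wp
```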

Table 1 presents cell and module conversion efficiencies (percentage of sunlight converted to electricity under standard conditions) for both ingot- and non-ingot-based crystalline silicon technologies. Although there are specific areas for improvement associated with each of the crystalline silicon sub-technologies, general research areas that apply to crystalline silicon include:

  1. manufacturing yield and throughput,
  2. impurity/defect gettering and passivation,
  3. low-cost, high-efficiency processes,
  4. environmentally benign processing and waste stream reduction,
  5. manufacturing automation and module packaging for 30-year life,
  6. thinner wafers or sheets and associated handling,
  7. wire-saw slurry recycling (ingots only), and
  8. new processes to produce "solar-grade" silicon.

Table 1. Crystalline Silicon PV Conversion Efficiencies (%)

Material Cell Module
Float-zone 24-25 22-23
Czochralski 21-23 13-15
Cast Polysilicon 18-20 10-13
EFG Ribbon 14-15 10-13
Dendritic Web 15-17 14
String Ribbon 14-15 10-12
Thick-Substrate Silicon 16 9-10

Table 2 presents conversion efficiencies for thin-film cells and modules. Similar to crystalline silicon, increased manufacturing throughput and yield and improved conversion efficiency are primary concerns for all thin films. Special attention is directed towards reducing the gap between laboratory cell efficiencies and production module efficiencies. Specific amorphous silicon research is directed to the following areas:

  1. novel growth techniques that allow higher growth rates and better materials and
  2. improved fundamental understanding with the goal of improved material stability and long-term field performance.

Current cadmium telluride R&D includes work addressing the issues of:

  1. improved film deposition,
  2. better contacting techniques for extracting electrical power from the cells, and
  3. low-cost module packaging for long-term reliability.

Current R&D areas for copper indium diselenide are:

  1. scalability of production processes,
  2. new deposition techniques and materials that lend themselves to lower temperature and non-vacuum approaches, and
  3. improved understanding of the device physics at the active semiconductor junction.

R&D for thin-film silicon includes developing techniques for high-rate deposition of large-grain-size films on foreign substrates.

Table 2. Thin-Film PV Conversion Efficiencies (%)

Material Cell Module
Amorphous Silicon 12-13 6-8
Cadmium Telluride (CdTe) 15-16 9-11
Copper Indium Diselenide (CIS) 18-19 10-12
Polycrystalline Silicon 8-12 N/A


Concentrator technologies are generally of two types: low concentration (typically 10 to 40 suns), which uses line or one-dimensional focus, and high concentration (typically 100 to 1000 suns), which uses point or two-dimensional focus. Concentrators, too, balance material cost against conversion efficiency, but they replace portions of the more expensive semiconductor material used in flat-plate systems with lenses or reflectors that can be made from less expensive materials. This replacement may, however, come at the expense of overall system efficiency, and thus one should evaluate each system as a whole in judging its benefits.

Table 3 presents solar conversion efficiencies for various materials that lend themselves well to the somewhat higher module operating temperatures often found in concentrator systems. Note that the efficiencies are reported at particular concentration ratios, since efficiency is a function of measurement conditions, including light intensity. Module efficiencies are in the range of 15% to 17% for silicon-based systems, with prototypes of more than 20%. Modules using GaAs cells have efficiencies of more than 24%. A prototype module with a three-junction GaInP2/GaAs/Ge cell has measured 28% efficiency (at 10 suns).

Table 3. Cell Efficiencies for Concentrator PV Systems (%)

Material Concentration (suns) Efficiency
Silicon Up to 400 27
Gallium Arsenide (GaAs) Up to 1000 28
GaInP2/GaAs 1 30.3
GaInP2/GaAs 180 30.2
GaInP2/GaAs/Ge 40 to 560 32.3

General issues for concentrator systems include structural characteristics that suit them to larger installations; this makes the highly visible, and currently more prevalent, small-application market less useful to concentrators for establishing market position. An additional concern is that concentrator systems use essentially only direct radiation; their best applications are therefore in regions with high-intensity sunlight, such as the southwestern United States. Areas of R&D that are important for concentrators include, as in flat-plate PV, manufacturing yield and throughput and higher conversion efficiency to reduce the ultimate energy cost. Higher efficiencies are expected from multijunction structures, including 3- and 4-junction devices. Novel concentrating techniques may also ultimately be incorporated into successful concentrator systems.

Markets and Applications
Photovoltaic systems may be used for almost any situation requiring electrical power, either tied to or independent of the utility grid. Systems may include energy storage and power conditioning to convert from DC to AC power. The size of the application may vary from milliwatts (as in calculators) to kilowatts (as in grid-tied, roof-mounted systems) to megawatts and larger (as envisioned with central-station generating systems). Consequently, the list of applications is long and includes developing country applications such as lighting, water pumping, health clinics, and village power; U.S. rural applications such as electric fences, water pumping for livestock, and irrigation; remote applications such as telecommunications and signaling; and grid-connected power generation on commercial and residential buildings.

Conclusions
While major market opportunities continue to exist in developing countries, where sizable populations are without any electricity, today's manufacturing expansions are fueled by market initiatives for grid-connected PV on residential and commercial buildings. The combination of increased production capacity with the attendant cost reductions from economies of scale is expected to lead to sustainable markets. A key to achieving the ultimate potential of PV is to continue to increase the sunlight-to-electricity conversion efficiencies and to translate the laboratory successes into cost-competitive products. Building a robust technology base is essential to overcoming the perceived high-risk transition into grid-connected PV. Such a base will make PV a globally significant contributor to our energy supply and environment.

Thomas Surek
National Renewable Energy Laboratory
1617 Cole Boulevard, Golden, Colorado 80401, U.S.A.

References:

  1. "Photovoltaics: Energy for the New Millenium, The National Photovoltaics Program Plan, 2000-2004," DOE/GO-10099-940 (January 2000).
  2. "Terrestrial Photovoltaic Technologies -- Recent Progress in manufacturing R&D", C. E. Witt, T. Surek, et al., Proceedings of the 34th Heat Transfer Conference (August 2000).

Has The Holy Grail Been Found?

For the past twenty-five years, the "holy grail" of particle physics has been the Higgs boson, the one remaining undiscovered particle of the standard model. Now there is evidence that it may have been detected at the LEP collider at CERN, but confirmation will have to wait several years, since the collider is being decommissioned to make way for the Large Hadron Collider, which is scheduled to begin operation in 2006.

The greatest mystery in particle physics has been the origin of mass. In electrodynamics, the photon must be massless because of a gauge symmetry (classically, this is the same symmetry that allows one to add a gradient to the vector potential without changing the physics; it is this symmetry that forces the photon to be massless). In the standard electroweak model, the weak interactions are mediated by gauge bosons, the W and Z. Just as in electrodynamics, the gauge symmetry forces the W and Z to be massless. Yet they have sizeable masses of roughly 80 and 91 GeV, respectively. Similarly, the gauge symmetry forces all of the quarks and leptons to be massless, in contradiction to experiment.
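
For reference, the classical statement of that symmetry in four-vector notation is the familiar gauge transformation; the field strength is unchanged by it, while a photon mass term would not survive it:

```latex
A_\mu \;\to\; A_\mu + \partial_\mu \chi ,
\qquad
F_{\mu\nu} = \partial_\mu A_\nu - \partial_\nu A_\mu \quad \text{(invariant)},
\qquad
\tfrac{1}{2} m_\gamma^2 A_\mu A^\mu \quad \text{(not invariant)}.
```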

The only way to give mass to the gauge bosons and fermions is the Higgs mechanism. This is very similar to the way in which the photon acquires an effective mass in a superconductor: one introduces a scalar field (the Cooper pair in a superconductor), and the interaction of the gauge bosons (the photon in a superconductor) and fermions with that scalar field gives them a mass. Thus, this mechanism requires the existence of a scalar field everywhere in space. No other mechanism for generating gauge boson masses is known. When the scalar field is excited, the excitation is called a Higgs boson. It is an unusual particle in that its interactions with all particles are completely determined, but its mass is unknown.

The Higgs boson can be produced at an electron-positron collider in association with a Z boson. The Higgs decays into two quarks, and the Z decays into either two quarks or two leptons. The mass range that can be probed is very sensitive to the beam energy: masses up to roughly the center-of-mass energy minus the Z mass can be probed. Over the past decade, the energy of LEP was gradually increased, and the lower bound on the Higgs mass rose with it, reaching 70 GeV five years ago and 110 GeV a year ago. This has caused increasing nervousness among theorists, since their favorite theory, supersymmetry, puts a rather firm upper bound of around 130 GeV on the Higgs mass.
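
To make the kinematic reach concrete: in the production process e+e- -> ZH, the Z and the Higgs must share the total center-of-mass energy, so, neglecting phase-space suppression near threshold,

```latex
m_H^{\mathrm{max}} \;\approx\; \sqrt{s} - m_Z .
```

With LEP running in 2000 at roughly sqrt(s) = 206 GeV and m_Z = 91 GeV, the reach is about 115 GeV, which is just where the candidate signal described below appeared.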

LEP ran at its highest energy ever this year. Plans called for LEP to shut down for good in October so that construction could begin on the Large Hadron Collider (LHC), which will be able to cover the range of Higgs masses up to several hundred GeV and is scheduled to begin operation in 2006. Suddenly, late in the summer, one of the detectors at LEP (ALEPH) found a couple of events that looked very much like a Higgs boson of mass 115 GeV. After much discussion, LEP was allowed to keep running for another month, until November, to accumulate more data. During that month, more events were found. For a Higgs mass of 115 GeV, the expected total signal for the entire LEP run is 3 events, over a background of 1.7 events; 4 events were detected, in two different channels. This gave a 2.9 standard deviation signal. Although the experimenters argued for another few months of running next year (which could give a 5 standard deviation signal), CERN decided to shut the accelerator down. Continued running would have delayed the LHC by a year and cost a lot of money.
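
As a rough illustration of how such event counts enter a significance estimate, the sketch below computes the one-sided Poisson probability that background alone fluctuates up to at least the observed number of events. This bare counting estimate deliberately ignores the per-event reconstructed-mass and channel information that the LEP collaborations folded into their likelihood fits, so it will not reproduce the quoted 2.9 standard deviations; it only shows the skeleton of the calculation.

```python
# Naive counting-experiment significance: probability that background
# alone (Poisson mean b) fluctuates up to at least n_obs events.
# The real LEP analysis weighted events by reconstructed mass and
# channel, so this simple count understates the quoted significance.
import math

def poisson_pvalue(n_obs, b):
    """One-sided p-value: P(N >= n_obs) for N ~ Poisson(b)."""
    return 1.0 - sum(math.exp(-b) * b**k / math.factorial(k)
                     for k in range(n_obs))

p = poisson_pvalue(4, 1.7)   # 4 observed, 1.7 expected from background
print(f"p-value = {p:.3f}")  # ~0.09 for this crude counting estimate
```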

This is, of course, very frustrating for the LEP experimenters. If the Higgs mass were 112 GeV, they would easily have a 5 sigma detection; if it were 116 GeV, there would be no hint of a signal. If the Higgs boson really has a mass of 115 GeV, then it is likely to be detected in approximately three years at the newly upgraded Fermilab Tevatron, before the LHC is commissioned. The politically interesting question of who would take credit for this discovery is left as an exercise for the reader.

Enhancing The Postdoctoral Experience - New NAS Report

In September, a new National Academy report was issued by the Committee on Science, Engineering and Public Policy. The title is "Enhancing the Postdoctoral Experience for Scientists and Engineers: A Guide for Postdoctoral Scholars, Advisers, Institutions, Funding Organizations and Disciplinary Societies". The report can be read at http://www.nap.edu/books/0309069963/html

The number of postdocs in science and engineering has doubled over the past twenty years to approximately 52,000. Three-quarters of these are in the life sciences.

The Committee noted that "One might expect that such a talented group of researchers, who represent some of the brightest lights of our nation’s human resources, would be offered the finest educational and training opportunities. One might also expect that these individuals would be helped to move efficiently and quickly into challenging employment where they can maximize their contributions. Indeed, the committee did find that many postdocs have stimulating, well-supervised and productive research experiences. However, we also heard from postdocs who are neglected, underpaid and even exploited…In many university settings, postdocs have uncertain status; they are neither faculty, staff, nor students. Consequently, there is often no clear administrative mechanism to assure their fair compensation, benefits or job security…This uncertain status contrasts poorly with that of graduate students, faculty and staff members, and with that of most postdocs who work in government or industry settings…. typical in academia of the life sciences and some of the physical sciences, pay and benefits are embarrassingly inadequate, especially for those with families. It is not comparable to that received by other professionals at analogous career stages. There is no standard health benefit package for postdocs. Many receive no health benefits for their families…".

The Committee gave numerous recommendations. The Chair, Maxine Singer (President of the Carnegie Institution of Washington) gave a brief summary in a public briefing. The ten "action points" are:

  1. Award institutional recognition and status commensurate with the contributions of postdocs to the research enterprise.
  2. Develop distinct policies and standards for postdocs in the institutions where they work---most especially in universities. These policies can be modeled on those already available to students and faculty.
  3. Develop mechanisms for frequent and regular communication between postdocs, their advisers, institutions and funding organizations. This communication should include clear initial expectations on the part of both postdoc and adviser.
  4. Submit formal evaluations, at least once a year, of each postdoc’s performance. Without evaluations, some postdocs may be uncertain about their standing or progress.
  5. Ensure that postdocs have access to health insurance and institutional services.
  6. Set limits for total time as a postdoc. This should be approximately five years, including time at all institutions, and exceptions should be clearly described.
  7. Invite the participation of postdocs when creating standards, definitions and conditions for appointments.
  8. Provide substantive career guidance to improve postdocs’ ability to prepare for regular employment.
  9. Improve the quality of data both on postdoctoral working conditions and on employment of postdocs.
  10. Take steps to improve the transition of postdocs to regular career positions.

In addition to the full report, a Web guide containing resources and examples of "best practices" is available at http://www.national-academies.org/postdocs.

The Presidential Appointment Process For S&T - New NAS Report

A new National Academy report entitled "Science and Technology in the Public Interest: The Presidential Appointment Process" was released in September by a panel of the Committee on Science, Engineering and Public Policy, chaired by Dr. Mary Good. The full report can be read at http://www.nap.edu/books/NI000314/html/. The report finds that there are numerous obstacles to government service today, and that these obstacles have reduced the pool of talented people willing to serve in S&T presidential appointments.
The briefing began with several myths and realities. The first myth is that "there is no hurry in appointing science and technology leaders because they are coming to Washington primarily to manage long-term, slow-changing research programs". The reality is that these appointees are needed as soon as possible during a new administration, especially the assistant to the president for S&T, who is needed to help set priorities, plan strategy, and find qualified candidates for other S&T appointments. Another myth is that S&T appointees are drawn from a pool of qualified candidates that is virtually unlimited in breadth and depth. In reality, the pool is not broad enough. For example, very few appointees are recruited from industry (only 12 percent in the Clinton years), and even among these there are too few with managerial experience in pharmaceuticals and chemicals, and fewer still in biotechnology and information technology. There are too many obstacles to government service. The panel notes that "a term in Washington for scientists and engineers often means two steps backward for every step forward along a career path. They may lose touch with the cutting edge of their field and find themselves in an irreversible career shift toward management… A move to Washington to undertake an appointment might require severing all ties with employers, forgoing pension benefits, selling stock, options, etc."

Another problem is the long appointment process---only 45% are completed within four months.

The Report makes three recommendations:

  1. Initiate the appointment process for key S&T leadership early. The first important step toward building scientific and technical competence in a new administration is to ensure that the transition team has expertise in science and technology. Soon after election, the president-elect, with the help of these advisers, should quickly identify a trusted and respected candidate for the position of assistant for science and technology. This individual is needed early to help identify S&T leaders for agencies and departments, set priorities for the new administration and work out budget strategy.
  2. Increase the breadth and depth of the pool of candidates by reducing the financial and vocational obstacles to government service. Because many restrictions are statutory, substantive change requires the participation of Congress. The Committee recommends that the president and Congress immediately establish a bipartisan framework---including representatives of the Executive branch, Congress and the Office of Government Ethics---to identify actions that can broaden and deepen the pool of candidates. The panel hesitated to recommend this step because of the time needed to design such a framework and implement reforms. However, given the complex legal nature of the issues, bipartisan discussion is the only practical avenue to long-term solutions.
  3. Accelerate the approval process. The White House should streamline its own approval procedures and work closely with the Senate to speed the entire process. The president-elect should, in collaboration with the Senate, adopt the goal of completing 80-90% of appointments within 4 months, which was the norm from 1964 to 1984. If additional staff are needed to meet that goal, special funding should be requested from Congress.

Good News For Science Budgets

Last spring, the President’s budget for science was extremely generous, with double-digit increases for most agencies (double-digit percentages, that is, not dollars!). The initial budgets from Congressional committees were far less generous, due to very severe ceilings on appropriations. Most appropriators at the time predicted that those ceilings would be lifted, and that the allocations would then increase. They were correct, and substantial increases were passed. The summaries below are taken directly from the American Institute of Physics FYI (http://www.aip.org/enews/fyi).

National Science Foundation: The NSF’s budget was increased by 13.6%, an amount which NSF Director Rita Colwell said "represents the largest dollar increase the Foundation has ever received, in real or constant dollars."
For perspective on this increase, note that a year ago Congress approved an 8.4% increase for NSF, and for FY 2001 the president requested a 17.1% increase. NSF's new budget is $4.426 billion, an increase of $529 million (more than half a billion dollars) over last year.

Within this total budget are several major categories of spending. The Research and Related Activities account increased by 13.2% to $3.350 billion. This is about 2/3 of the requested increase. In the conference report, specific amounts of money are provided for some programs (see below). NSF is instructed to distribute remaining funds "proportionately and equitably, consistent with the ratio of the budget request level." After allowing for the specified funding, this 2/3 figure can be very roughly applied to the original requests to suggest the final percentage increases for the subactivities tracked by FYI. With these caveats, the subactivity increases originally requested by NSF were: 18.0% for physics, 15.4% for materials research, 13.7% for astronomical sciences, 19.6% for engineering, and 19.5% for geosciences. The final subactivity budgets will be determined by NSF and approved by Congress in the FY 2001 operating plan.
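As a rough illustration of that scaling (a sketch only; the actual subactivity budgets are set in the operating plan, not by this arithmetic):

```python
# Very rough estimate of final FY 2001 NSF subactivity increases:
# scale each requested percentage increase by the ~2/3 fraction of the
# requested R&RA increase that Congress actually provided.
requested = {  # requested percentage increases, from the text above
    "physics": 18.0,
    "materials research": 15.4,
    "astronomical sciences": 13.7,
    "engineering": 19.6,
    "geosciences": 19.5,
}
fraction_provided = 2 / 3  # approximate, after specified funding

for subactivity, pct in requested.items():
    estimate = pct * fraction_provided
    print(f"{subactivity}: requested {pct:.1f}% -> roughly {estimate:.1f}%")
```

On this crude reckoning, for example, the physics subactivity would end up with an increase of roughly 12% rather than the 18% requested.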

For some programs, a calculation is not necessary because Congress specified the funding. These programs are: $65 million for plant genome research, $215 million for the information technology initiative, $75 million for the biocomplexity initiative, $75 million for major research instrumentation, $1 million toward a new research vessel, and $5 million for a Children's Research Initiative. Of particular interest to physicists and astronomers is $150 million designated for the new nanotechnology initiative and $94.9 million for facilities within the astronomical sciences activity. Regarding the latter, the conference report cites the Arecibo Observatory, the Green Bank Telescope, the Very Large Array, the Very Long Baseline Array, "and other facilities in need of such attention on a priority basis."

Besides Research and Related Activities, there is another budget for Major Research Equipment. There is both good news and bad news. First the good news: "The conference agreement provides the budget request level for all ongoing projects," including $45 million for the development and construction of a second teraflop computing facility and $12.5 million for the continued production of the High-Performance Instrumented Airborne Platform for Environmental Research. The bad news: "Budget constraints have forced the conferees to not approve funding for two new starts for fiscal year 2001 . . . the U.S. Array and San Andreas Fault Observatory at Depth, and the National Ecological Observatory Network. This decision was made without prejudice and does not reflect on the quality of research proposed to be developed through these two programs."

Another major budget is that for Education and Human Resources. Congress approved an increase of 13.9%, as compared to the 5.0% requested by NSF. The new budget is $787.4 million, up by $96.5 million. As was done with the Research and Related Activities budget, funding was specified for some programs, including $75 million for EPSCoR, $10 million for the Office of Innovative Partnerships, and specified amounts for a variety of other programs and initiatives. Remaining funding is to be distributed proportionately and equitably, consistent with the original budget request. Also specified is funding for Polar Programs. NSF had requested an increase of 12.8%; the final bill provides 8.9%. This increase of $22.6 million brings FY 2001 funding to $275.6 million.
NSF Director Colwell released a statement expressing her appreciation to all of those involved in supporting the new budget. She stated: "This increase also puts us on the path towards doubling the NSF budget in five years, a goal championed by Senate VA-HUD Chairman Kit Bond, Ranking Member Barbara Mikulski, Senate Majority Leader Trent Lott and more than 40 members of the Senate." Colwell continued, "This historic achievement validates the Administration's commitment to investing in fundamental research and education - and I thank President Clinton, Vice President Gore, Science Advisor Neal Lane, OMB Director Jack Lew and his staff for their leadership in helping to achieve such a great result. It was truly a team effort. While the VA-HUD agreement did not reach the President's request for NSF, the funding level provided is extraordinary and demonstrates how support for fundamental research and education is truly bipartisan.

"Along with Senators Bond and Mikulski, I personally thank House Subcommittee Chairman Jim Walsh and Ranking Member Allan Mollohan for their constant, steadfast support of NSF. I also thank all the VA-HUD subcommittee members in both the Senate and the House, House Full Committee Chairman Bill Young, Ranking Member David Obey along with Senate Full Committee Chairman Ted Stevens and Ranking Member Robert Byrd for their excellent leadership and consistent support of the Foundation's investments in research and education. They are true champions for these critical investments in the nation's future investments that will help improve the health, prosperity and well-being of all citizens in the 21st century.
"I also recognize the extraordinary efforts of leaders in the science and engineering community, as well as those in industry and academia on behalf of the Foundation's budget request. This result is due to the exceptional contributions of so many individuals, both at the National Science Foundation and in the broader community. I am grateful to all those individuals and organizations that have helped make this budget a reality."

NASA
The House and Senate have reached agreement on final FY 2001 funding for NASA, as part of the combined Energy and Water Development and VA/HUD appropriations conference report. Under the conference report, which President Clinton is expected to sign, NASA would receive a very healthy $14,285.3 million. This is more than Clinton, the House, or the Senate had previously proposed for the fiscal year that began on October 1. This funding level is 5.0% ($685.0 million) greater than NASA's FY 2000 appropriation, and 1.8% ($250.0 million) higher than the President's request.

Space Science, Earth Science, and Life and Microgravity Sciences would see increases over both their FY 2000 funding and their requests, while Human Space Flight would remain essentially flat. NASA Administrator Daniel Goldin's pleasure with the final conference numbers was evident in a press release issued on October 20. "This measure provides an excellent budget for NASA," Goldin said in the release. "The bill fully funds the President's program for NASA, including all high-priority initiatives - the Space Launch Initiative, Shuttle Upgrades, the International Space Station, and Living With a Star. The bill includes funding, as proposed by NASA, for two Mars rover missions in 2003."

NASA's plan for two Mars landers within its space science program is a recent proposal by the agency, developed after several reviews of last year's loss of the Mars Climate Orbiter and Mars Polar Lander missions. The agency submitted a revised FY 2001 budget proposal that would shift funds from other NASA programs to the Mars effort, demonstrating the importance NASA and the Administration place on a reinvigorated Mars program. The conferees support this proposal, which would provide $75.0 million for the Mars program as follows: $2.0 million from elsewhere within the space science account; $7.0 million from life and microgravity sciences; $20.0 million from aeronautics and space technology; $6.0 million from mission support; and $40.0 million from human space flight.

The conference report contains numerous earmarks and detailed language on specific reports and requirements that Congress expects, some in agreement with House or Senate report language, some superseding the previous language. (Language in the later Senate report takes precedence over the earlier House language, and language in the conference report takes precedence over both. In principle, if a later report makes no mention of a certain requirement or earmark, the earlier text remains in effect.) The complete text of the VA/HUD-Energy and Water Development conference report can be found on THOMAS, the Library of Congress web site, at http://thomas.loc.gov. Go to "Committee Reports" for the 106th Congress and choose report number 106-988; the NASA portion begins on page 151. Highlights of selected portions of the report can be found in FYI #129 at http://www.aip.org/enews/fyi00.129.cfm.

DOE
The final numbers for the DOE are still unclear because of budgetary adjustments that were included in the bill after various appropriations were made. The effect of these adjustments is to reduce the amounts shown; by exactly how much is uncertain. One adjustment, for about $38 million, is for "safeguards and security costs." This is to be funded out of some, but not all, of the accounts in the overall budget for DOE "Science." The other adjustment is what is called "a general reduction." This general reduction, of about $34 million, is to be taken out of the Science budget in a way yet to be determined. These two reductions amount to approximately $72 million, or 2.2% of the total Science budget. In addition, the administration amended its original budget request. The saying "the devil is in the details" applies to the following numbers, and the details are not yet known. While these numbers are fairly firm, minor changes are possible, depending upon how that general reduction is made. Finally, these are aggregate numbers, and do not reflect directed spending specified by Congress or changes in program content.

High Energy Physics:
Last year's budget was $703.8 million. For fiscal year 2001 the administration originally requested $714.7 million. The FY 2001 appropriation is shown as $726.1 million, an increase of 3.2% over last year. This FY 2001 appropriation, when subjected to the above reductions, could be $712.7 million.

Nuclear Physics:
Last year's budget was $355.8 million. For fiscal year 2001 the administration originally requested $369.9 million. The FY 2001 appropriation is shown as $369.9 million, an increase of 4.0% over last year. This FY 2001 appropriation, when subjected to the above reductions, could be $360.1 million.

Basic Energy Sciences:
Last year's budget was $779.4 million. For fiscal year 2001 the administration originally requested $1,015.8 million. The FY 2001 appropriation is shown as $1,013.4 million, an increase of 30.0% over last year. This FY 2001 appropriation, when subjected to the above reductions, could be $992.4 million.
The Basic Energy Sciences budget contains funding for the Spallation Neutron Source, which accounts for the large increase. The final bill states: "The recommendation includes $278,600,000, including $259,500,000 for construction and $19,100,000 for related research and development, the same as the amended budget request, for the Spallation Neutron Source."

Fusion Energy Sciences:
Last year's budget was $247.8 million. For fiscal year 2001 the administration originally requested $247.3 million. The FY 2001 appropriation is shown as $255.0 million, an increase of 2.9% over last year. This FY 2001 appropriation, when subjected to the above reductions, could be $248.5 million.
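Since both the appropriated figures and the possible post-reduction figures are quoted above, the implied percentage reduction for each program can be read off directly; the sketch below just does that arithmetic. The actual cuts will depend on how the "general reduction" is allocated.

```python
# Implied reductions for the DOE Science programs, computed from the
# appropriated and possible post-reduction figures quoted above.
programs = {  # (appropriation, possible after reductions), $ millions
    "High Energy Physics": (726.1, 712.7),
    "Nuclear Physics": (369.9, 360.1),
    "Basic Energy Sciences": (1013.4, 992.4),
    "Fusion Energy Sciences": (255.0, 248.5),
}

for name, (before, after) in programs.items():
    cut = 100 * (1 - after / before)
    print(f"{name}: {before:.1f} -> {after:.1f} ({cut:.1f}% reduction)")
```

The implied cuts range from roughly 1.9% to 2.7%, consistent with the statement that the reductions fall unevenly across accounts.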

Dynamical Modeling Of The Onset Of War

By Alvin M. Saperstein. World Scientific, Singapore, New Jersey, London, Hong Kong, 1999, 148 pages, $38.

This book was written to serve two purposes. First, it reviews past efforts to model arms races in general and the onset of war in particular. Second, it seeks to explain the role of chaos in arms races. Saperstein argues that the onset of war is generally caused by crisis instabilities in the international system; since the transition from predictability to chaos is such an instability, a system with a tendency toward chaos is likely to break out in war.
Saperstein distinguishes between static and dynamic models of arms races. He summarizes and discusses the static models of Kaye, Legault and Lindsey--with their regions of stability and instability. However, he feels that static models are inadequate because they capture nothing of the interactions between the participants in international relations. He is somewhat more satisfied with existing dynamic models, such as those involving the Richardson equations and their extension by Lee, Zinnes and Muncaster.

However, the existing models tend to be linear coupled equations, which he says cannot represent the complex real world. He believes that non-linear equations coupling the participants in international relations are required, and he shows that such non-linear equations can lead to chaotic behavior. One strength of this book is the clear distinction drawn between predictable and unpredictable futures. To Saperstein, chaos represents situations in which a small change in an input parameter can lead to a very different final outcome. In such situations, knowing the initial parameters with high precision still does not allow the future outcome to be predicted, even to within large uncertainties. He argues that such chaotic futures are likely to lead to the outbreak of war.
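Saperstein's notion of sensitivity can be illustrated with the simplest textbook example, the logistic map (a generic illustration of chaos, not one of the models in the book). Two trajectories whose starting points differ by one part in a million soon bear no resemblance to each other:

```python
# Sensitivity to initial conditions in the logistic map x -> r*x*(1-x).
# A generic illustration of chaos, not a model from Saperstein's book.
r = 3.9                      # well inside the chaotic regime
x, y = 0.400000, 0.400001    # initial conditions differing by 1e-6

for step in range(1, 41):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: x = {x:.6f}, y = {y:.6f}, "
              f"difference = {abs(x - y):.6f}")
```

After a few dozen iterations the difference is of order one: precise knowledge of the starting point has bought essentially no predictive power.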
Dynamical Modeling is aimed at graduate students in the social sciences who want to understand the role of chaos in the modeling of international systems. These students can indeed benefit not only from the reviews, but also from the explanations of "crisis instability" in mathematical terms. Unfortunately the mathematics is dense, but no denser than in the equivalent social-science literature. The book may be even more useful for the physical scientist who is interested in understanding war in a more mathematical way. Its reviews and critiques of static models are good, and its reviews of dynamic models explore well the mathematical complexities of coupled non-linear equations with many parameters. Saperstein cites the best literature of the field.

The new modeling approach of this book, namely the incorporation of chaos, is aimed at both social and physical scientists. The book contains reproductions and summaries of essays on chaos previously published by Saperstein. The mathematics is formidable because of the many parameters and coupled equations. The mathematical inclinations of physical scientists are reasonably satisfied, as the equations are solved either exactly, with no input of specific real-world parameters, or by iterative steps on a desktop computer in very generic terms. The social scientist is likely to be reasonably satisfied because the mathematics confirms the intuitive understanding previously derived from non-quantitative analyses. Some sample conclusions from the chaos analysis: (1) Three-nation systems seem more likely to lead to war than two-nation systems. [Should we mourn the end of the Cold War?] (2) If nations fear and loathe each other less, the chance of war is reduced. (3) Democracies are less prone to war.

Perhaps the greatest contribution of this book is the way Saperstein relates the social-science concept of "crisis instability" to the mathematical concept of "chaos". Each is characterized by the fact that a small variation in the input parameters (e.g. an assassination) can lead to enormously different outcomes (e.g. a world war).
What is missing, for me, are specific calculations for actual situations. I found myself wanting to know what values a "loathing" parameter might take, and what the value might be of the coupling coefficient between the building of an NMD by the United States and the resulting building of more ICBMs by China. This neglect of numbers is deliberate. Saperstein says: "... the goal is to illustrate method, and to develop insight and intuition, rather than to achieve realistic results." (p. 115) But I personally understand complex relationships best when they are illustrated by specific examples. When the book mentions specific real-world examples, such as the instability created by deploying an NMD, it does not give actual calculations; instead, it refers to articles containing the details. Since a book is usually an extension of shorter articles, I had hoped to find detailed calculations and applications to justify the intuitions. I found myself agreeing with Lord Kelvin, who said that he only understood something he could put into numbers.

I recommend that you read the book, then go to some earlier Saperstein articles where he works out detailed calculations, and then play around with the Richardson model and with the two- versus three-nation coupled non-linear equations, as in the sketch below.
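For readers who take that suggestion, here is a minimal sketch of the classic two-nation Richardson model, the linear starting point that such work builds on (the parameter values are illustrative, not taken from the book): each nation's arms level grows in response to the other's arms, decays through fatigue, and is driven by a constant grievance term.

```python
# Classic linear two-nation Richardson arms-race model:
#   dx/dt = k*y - a*x + g,   dy/dt = l*x - b*y + h
# Parameter values are illustrative, not taken from the book.
k, l = 0.9, 0.8   # reaction coefficients (response to the other's arms)
a, b = 0.5, 0.5   # fatigue/restraint coefficients
g, h = 0.1, 0.1   # grievance terms

x, y = 1.0, 1.0   # initial arms levels
dt = 0.01         # Euler time step
for step in range(2001):
    if step % 500 == 0:
        print(f"t = {step * dt:5.1f}: x = {x:10.3f}, y = {y:10.3f}")
    dx = k * y - a * x + g
    dy = l * x - b * y + h
    x, y = x + dt * dx, y + dt * dy
```

With these values k*l > a*b, so the race is unstable and both arsenals run away; choosing k*l < a*b instead drives the system to a stable fixed point. Replacing the linear couplings with non-linear ones is precisely where the chaotic behavior Saperstein studies can enter.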

There are some mistakes in the text due to insufficient proofreading. Two notable ones are on p. 102, where the parameter epsilon should equal zero (not one) to reduce the three-nation system to the two-nation system, and on p. 118, where in Eq. (1) the parameters x(j) should be coupled not to x(i) but to the time derivative of x(i).

Dietrich Schroeer
Physics Department
University of North Carolina at Chapel Hill.

The Woman Who Knew Too Much: Alice Stewart (1906-) and the Secrets of Radiation

by Gayle Greene, University of Michigan Press, 2000, ISBN 0-472-11107-8, £19.95 <www.alicestewart.org>.

[This review first appeared in the Times Higher Education Supplement (United Kingdom), and is reprinted here by permission.]

The devastation of Hiroshima and Nagasaki in 1945 was plain to see. But the true horrors of radiation damage to thousands of survivors were concealed by the nuclear establishment and self-deluding politicians; worse, their gross underestimates became benchmarks. One of the courageous few to challenge this, from the 1970s, was the epidemiologist Alice Stewart. She had long been a thorn in the side of the medical establishment, for she had uncovered the true hazards of X-raying pregnant women, namely childhood leukaemia or cancer.

Like her mother, she was a pioneer doctor, qualifying in a female medical school. This followed a dazzling intellectual life at Cambridge in the 1920s: her circle included Cartier-Bresson, Redgrave, Lowry, Bronowski, Trevelyan, Alistair Cooke, Anthony Blunt, Kathleen Raine. She heard Virginia Woolf’s talk which became ‘A room of one’s own’. William Empson was her first and last love. Cambridge expelled him (after he had written ‘Seven Types of Ambiguity’, in two weeks) when his landlord reported finding ‘various birth control mechanisms’ in his room. They married other people--Empson went to Japan and China, she married Ludovick Stewart--coming together again after 30 years.

Ludovick’s appointment to Harrow school helped her career. She became a registrar at the Royal Free (where her parents met in 1901) and sharpened her diagnostic skills. She became a consultant at the Garrett Anderson Hospital in 1939, evacuating to St Albans with the war; Ludovick went to Bletchley Park, decoding, and their careers diverged. She moved into social medicine at the Nuffield Hospital in Oxford: with two children, she avoided call-up, and received help with child care. "The war enabled me to leap over barriers that would otherwise have blocked my way as a woman", she says, "It tells you what women could do if society would change its attitude."

She studied the health risks of industrial chemicals in factories doing war work. With 40 undergraduate volunteers filling shells with TNT, she showed that the risks of anaemia and liver disease were dose-related. The students’ blood counts recovered! As an incentive, she set them prize essays, published as Impressions of Factory Life. In 1946 she co-founded the British Journal of Industrial Medicine. She studied pneumoconiosis among coal miners in Wales, working with their Communist leader, Arthur Horner. As ever, risks were downplayed. Miners dying at 55 were recorded as having "succumbed to old age"; a miner who died of drink at 82, she reckoned, was an excellent advertisement for drink.

In 1943 John Ryle founded an Institute of Social Medicine in Oxford and appointed Stewart as his assistant. They were appalled by disparities in infant mortality at top and bottom of the social scale. After he died in 1950 Alice was made head of a Social Medicine Unit, as a reader (and fellow of Lady Margaret Hall), with no resources. She raised money to hire an assistant and a statistician, systematized the records, and began to study leukaemia.
Her Oxford Survey work was rich in detail which, with her insight, uncovered vital correlations. Exposure of either parent to ionising radiation correlated with childhood leukaemia, often recorded as "cot death." She fought to establish that there is no "safe" or threshold dose. Nowadays we recognise the hazards to cells in division, and so to the fetus--a single hit can cause mutation. X-raying of pregnant women stopped, and X-ray viewers disappeared from the shoe-shops.

Richard Doll was appointed Regius Professor in 1969 (Doll and Hill had linked smoking with lung cancer). Doll had little time for Alice Stewart, and research funding was difficult. When she retired in 1974 she and George Kneale, her genius statistician, moved with their Oxford Survey to Birmingham University, which appointed her Professor when she was nearly 90.

Invited to the US in the 1970s, she could see workers at Hanford or Oak Ridge, producing nuclear weapons, dying of radiation-induced cancers. Safety standards were low, high exposures were concealed, injury was disputed, compensation was mean, and whistle-blowers were blacklisted. Gradually, her conclusions were confirmed by other scientists. When she was 80, she was awarded $2 million to study nuclear workers’ records from the US weapons complex.
Her evidence of the hazards of parental exposure to radiation around nuclear installations is still disputed. She showed supposedly safe levels to be too high; lowering them would open the floodgates to claims. A 1982 study by the National Cancer Institute, commissioned by Congress, estimated that Western Americans received doses of radioiodine in atom bomb fallout 100 times greater than those estimated in 1959, 10 times greater than at Chernobyl, leading perhaps to 10-75 thousand thyroid cancers, most as yet undiagnosed.

Her Lancet article in 1970 on "Gene-Selection Theory of Cancer Causation" quotes from an Empson poem: "How small a chink lets in so dire a foe." Slowly but surely, international researchers are demolishing what she calls "the gold standard," the long-standing, meretricious interpretation of the A-bomb data. "Truth is the daughter of time," she says.
Gayle Greene’s book is well referenced, while betraying its American origin. It’s a good story, much of it told in Alice Stewart’s own spirited words. Photographs show her family and friends, and her progress from a charming young woman to a lively, indomitable ninety-plus-year-old.

Nuclear weaponry has done more damage than Hiroshima, Nagasaki and nuclear testing. The killing of Lumumba, the support of Mobutu and Kabila, and much of the subsequent tragedy of sub-Saharan Africa can be traced to "safeguarding" Congolese uranium. What will Sellafield do with its 60 tonnes of plutonium, half-life 24 millennia? The Barents Sea is described as "Chernobyl in slow motion"--what are those nuclear submarines doing? We need more whistle-blowers like Alice Stewart.

Dr Joan Mason,
12 Hills Avenue
Cambridge CB1 7XA, U.K.

Forum Members "To-Do Box"

The APS Washington Office promotes the interests of the physics community by working with federal policymakers to advance APS policy positions. Critical to this effort is the active involvement of the APS membership. Activities planned for 2001 include letter writing and phone-in campaigns, a Congressional Visits Day scheduled for May 1, and congressional receptions. If you are interested in participating in APS advocacy activities contact Francis Slakey (Associate Director of Public Affairs) or Christina Hood (Public Affairs Fellow) at: opa@aps.org or 202/662-8700. The APS has adopted, through a majority vote of the Council, the following policy positions:

  • National Missile Defense: The United States should delay deployment of the planned NMD system until it is shown - through analysis and intercept tests - to be effective against the types of countermeasures that an attacker could be expected to deploy.
  • The Comprehensive Test Ban Treaty: The United States should ratify the CTBT. Fully informed technical studies have concluded that continued nuclear testing is not required to retain confidence in the safety and reliability of the nation's nuclear deterrent.
  • Sustainable Energy: The United States should support investments and policies that ensure a broad range of energy options. Low-cost oil resources outside the Persian Gulf are being depleted. Energy-related urban air pollution is a world-wide threat to health. Atmospheric concentrations of carbon dioxide are climbing. Innovations and environmental policies are necessary for the sake of our national security, environmental well-being, and standard of living.
  • The Federal Support Of Research: The United States should double the research budgets of the federal science agencies and maintain a balanced research portfolio. The ability of science to contribute to the nation's economic growth and security depends critically on agencies such as NSF and DOE. In particular, the DOE Office of Science constructs and operates most of the nation's major science facilities.
  • K-12 Science And Math Education: The United States should increase targeted funding for science and mathematics education, particularly programs that provide teachers with quality preparation, resources, and professional development. Students must have a strong science education to compete in a high-tech economy.

Physics and Society is the non-peer-reviewed quarterly newsletter of the Forum on Physics and Society, a division of the American Physical Society. It presents letters, commentary, book reviews and articles on the relations of physics and the physics community to government and society. It also carries news of the Forum and provides a medium for Forum members to exchange ideas. Opinions expressed are those of the authors alone and do not necessarily reflect the views of the APS or of the Forum. Contributed articles (up to 2500 words; technical content is encouraged), letters (500 words), commentary (1000 words), reviews (1000 words) and brief news articles are welcome. Send them to the relevant editor by e-mail (preferred) or regular mail.

Editor: Al Saperstein, Physics Department, Wayne State University, Detroit, MI 48202, (313) 577-2733/fax (313) 577-3932. Articles Editor: Betsy Pugel, Loomis Laboratory, University of Illinois at Urbana-Champaign, Urbana, IL 61801. News Editor: Marc Sher, Physics Department, College of William and Mary, Williamsburg, VA 23187, (757) 221-3538, and Phil Goldstone, Los Alamos National Lab. Reviews Editor: Art Hobson, Physics Department, University of Arkansas, Fayetteville, AR 72701, (501) 575-5918/fax (501) 575-4580. Electronic Media Editor: Marc Sher. Layout (for paper issues): Alicia Chang. Web Manager for APS: Joanne Fincham.