Archived Newsletters

Cuban Exchange

The International Union of Pure and Applied Physics [IUPAP] and other scientific organizations are up in arms because some Cuban nationals were denied US visas to participate in a scientific meeting here.[1] Freedom of scientists from totalitarian countries to travel to the United States is considered essential to the conduct of science. Presumably this applies only to those scientists that totalitarian governments allow to travel abroad while, at least in some cases, leaving members of their families behind as hostages. The fact that only persons whom the communist regime finds acceptable can work as scientists at Cuban universities and research institutions and that others, who are not communists or fellow travellers, have virtually no civil rights, cannot be appointed to university faculties, and cannot even leave the "people's paradise", appears to be of no concern to IUPAP or APS for that matter.

The Department of State publishes annual reports on how various countries observe human rights. As a rule, the authors of these reports bend over backward to find anything positive they can concerning the countries in question, but in the case of Cuba the most recent human rights report is devastating.[2] I believe it is appropriate to send a message to oppressive governments by, at the very least, denying visas to their officials and servants whenever practicable. The message is that they do not belong on the soil of a free country and in company of free men.

[1] Physics Today, July 1997, p.55

[2] Report on Human Rights Practices -- Cuba, United States Department of State, January 1997. Also available on <gopher dosfan.lib.uic.gov>. The report opens with the observation: "Cuba is a totalitarian state. President Castro exercises control over all aspects of Cuban life through the Communist Party and its affiliated mass organizations, the government bureaucracy, and the state security apparatus. The party is the only legal political entity, and President Castro personally chooses the membership of the select group that heads the party. The party controls all government positions, including judicial offices."

Vladislav Bevc

Synergy Research Institute

P.O.Box 561, San Ramon, CA 94583

(510) 837-7612

The letter above, protesting the defense, by American physicists, of the scientific rights of individual scientists coming to professional meetings from totalitarian countries, is quite understandable. Its sentiments are not uncommon among people who have lived and suffered under such regimes. However, it puts those who don't share the State Department's characterization of the Cuban government as a totalitarian state in the same boat as those of us who believe that scientific exchanges should be pursued, even with totalitarian regimes. We believe that the APS, and other representatives of American (and world) science, should make it clear to all that: (1) We are fully aware of the violation of human rights by the Castro regime and are against all such violations. (2) The international freedom of scientists should be above politics. We defend that freedom, independent of the politics of the scientist. Otherwise, we commit the same violations as do the dictators. (3) If we were to behave like dictators, we would weaken our own position in our fight against all violations of human rights. For example, if we refuse the visa of a party hack, then it would be extremely hard for us to fight for the dissident whom we wish to invite. (4) The way to help the deserving and yet oppressed is to keep the door open to all while at the same time inviting the deserving and yet oppressed. (5) In the shrinking global village to which totalitarian regimes must open their doors, they will be weakened if we behave consistently with our belief in the sanctity of human rights. Self-consistency is a non-negotiable cornerstone of science!

Ke Chiang Hsieh

Editor's Comment

Should Iraqi physics students be invited to study in our universities, and older Iraqi physicists to our scientific conferences and laboratories, where they may learn or brush up on the newest paradigms of, e.g., nuclear physics? But that may help that "outlaw" state in its presumed quest for nuclear weapons to use against its "peaceful neighbors". (The same question applies to other "renegade" nations and to scientists of other disciplines who may help further the development of other weapons of mass destruction - or even conventional weapons which we don't wish them to acquire.) Raymond A. Zilmakas, in a 30 January 1998 Science editorial (vol. 279, p. 635), advocates just such invitations. He would open up our universities and international forums to "science students from nations suspected of pursuing ... weapons", presuming that such invitations would open these students to "discussions of ethics in science." "Scientists imbued with a strong sense of ethics will be more inclined to slow the progress ... weapon-related research or alert outsiders..." Here we have a typical policy conundrum - both good and bad may follow from the same policy, and we have no validated theory enabling us to choose so as to further desired ends and hinder undesired outcomes. All we can do is search the historical record for presumed similar dilemmas and examine their outcomes.

It is certainly true that American scientists, presumably inculcated with American ethics, had no trouble (on the whole) in contributing to our development of biological, chemical, and nuclear weapons. The same can be said for scientists in Hitler's Germany, Stalin's Soviet Union, and other countries of varying socio-political outlook. Could it be that these past scientists were not "imbued with a strong sense of ethics" and that future formal "discussions of ethics in science" would make a difference for our visitors from potentially hostile countries? I have my doubts. But would barring such outsiders from our forums and institutions contribute to the lessening of such hostility? Again, I have my doubts. It seems to me that this problem is a manifestation of the question of which comes first in the development of a human scientist: science or society? Our Forum should be a catalyst and home for examining this fundamental problem. I eagerly await further contributions on this topic from our readers.

The AAAS Committee on Scientific Freedom and Responsibility (co-chaired by Irving Lerch of APS) is organizing a seminar in April in Washington, DC, concerning these issues--with a focus on Cuban scientists. They intend to have representation from the State Department, other government agencies, various organizations, etc. A.M.S.

Physicists Haven't Done Much?

In his letter in the January 1998 issue of Physics and Society, Bernard Cohen makes the astonishing assertion that nuclear and high-energy physicists "have contributed virtually nothing to technology for several decades, nor have we done much else of a practical nature for Society".

One of the most important "spin-offs" from all of science during the last decade is the World-Wide-Web. Although the value of the Web is certainly subject to hyperbole (no, it probably isn't the most important breakthrough in publishing since Gutenberg), it is clear that the Web will contribute much more to the Gross Domestic Product than all federal support for basic research combined. The Web began in a high-energy physics experiment at CERN in the late 1980's.

In fact, high energy physicists have been at the vanguard of the information revolution for decades (the preprint archive at SLAC will soon celebrate its 25th anniversary), and have been crucial in shaping its development. Many feel that the information revolution will have a bigger impact on Society than any "revolution" since the Industrial revolution, and thus Cohen's claim that we haven't "done much of a practical nature for Society" seems inaccurate, and detracts from his other, important points about nuclear technology.

Marc Sher
Physics Dept.
College of William and Mary
Williamsburg VA 23187

Excerpts from: Some Non-scientific Influences On Radiation Protection Standards And Practice

by Lauriston S. Taylor

[The original article appeared in Health Physics 1980; 39, pp 851-874. The excerpts were chosen by John Cameron with verbal approval from the author.]

I. Introduction:
Today, we know all we need to know to adequately protect ourselves from ionizing radiation. What is the problem and why is there one? [The problem] is not a scientific one. Rather, it is a philosophical problem...Or perhaps it may be a political problem ... or perhaps the problem may not be as much protecting ourselves against radiation as protecting us against ourselves. I shall mention, at least briefly, several non-scientific factors which may influence protection practices .... and thus, in turn, influence the setting of our numerical protection standards.

II. Biological Effects of Radiation:
Collectively there exists a vast array of facts and general knowledge about ionizing radiation effects on animal and man. ... the depth and extent of this knowledge are unmatched by any of the myriads of other toxic agents known to man. .... the public has come to expect sharp, clear, definitive, and undisputed answers to any questions involving radiation. This leads to the difficulty that when there is ... disagreement among scientists the public feels ... let down .. by the scientific community. A good example is the current so-called "controversy" ... centering around the effects of radiation delivered in low doses at low dose rates. Radiation effects are generally proportional to dose when delivered acutely in moderate amounts, say 100 rads and upwards. Precise proportionality is difficult to establish. The problem becomes especially critical in the low dose region say below 25 or 50 rads, delivered acutely, for which the latent period may be three or even four or more decades. During that long a period any individual would be subjected to hundreds of other insults, any number of which might produce the same effect as the radiation. There is uncertainty about the existence of threshold effects for ionizing radiation. There are very few threshold effects, although there are clearly some.

If one is concerned about the degree of hazard in the region where effects cannot be found or identified, to what extent should an attempt be made to further "reduce the hazard" to some fraction of what could not be found in the first place? The question is "how large is half of something that cannot be measured?"

Today we know enough about dose-effect relationships to state unequivocally that at least for low-LET (Linear Energy Transfer = dE/dx) radiations the relationships cannot be strictly linear over the whole dose range and that even for high doses they are probably not linear.

The difficulty is that since we do not know the precise relationship - it is assumed as a matter of cautious procedure, that the dose-effect relationships are linear throughout the entire dose range. This assumption -- taken too literally -- may lead to unnecessary and unjustifiable restrictions on the use of ionizing radiation. From the mere fact that radiation may cause some identifiable effect, it does not follow that the effects are necessarily detrimental.

III. Non-Scientific Aspects Of Radiation Protection:
In the late 1940's it was clear to the NCRP (National Council on Radiation Protection and Measurements), and probably to other bodies, that non-scientific factors would be involved in permissible dose standards. Why are people willing to accept any risk at all? This argument applies to practically everything we do in life, with radiation being perhaps one of the smallest risks that we normally have to contend with.
The past supply of wisdom has come mostly from the scientists themselves, who consciously, or unconsciously, recognizing the limits of their knowledge, have made strong and important judgment actions regarding their knowledge and the amount of radiation considered to be acceptable for radiation workers or the public or the patient.

No one has been identifiably injured by radiation while working within the first numerical standards set by the NCRP and the ICRP (International Commission on Radiological Protection) in 1934. The theories about people being injured have still not led to the demonstration of injury and, if considered as facts by some, must only be looked upon as figments of the imagination.

a. Politics:
From about 1946 to 1977, practically all federal matters in the United States relating to ionizing radiation were handled through the Joint Committee on Atomic Energy. The joint committee, with a stable membership from both the House and the Senate, was dedicated to developing facts and an understanding of atomic energy, rather than looking for newspaper headlines and votes.

Now, in its place there are some two dozen congressional committees, lacking in stability and without an overview power. Rarely does the chairman or staff of these committees have any knowledge in depth of the broad subject of ionizing radiation.

In spite of technical shortcomings in the political arena, both federal and state legislatures exert strong influences on the development of numerical radiation protection standards. Because of the likely influence on governmental committees by vocal but prejudiced witnesses or witnesses having some personal case to plead, we are today faced with the possibility of unreasonably restrictive limitations being placed on legitimate uses of ionizing radiation.

b. The Media:
One of the first political needs we must always recognize in dealing with groups of people is education. The prime agents of education (outside of formal schools) in these times are the radio-television, newspapers, comic books, books generally and books written by scientists. Of these, the "news media" clearly dominate, and here lies one of our most critical problems and the most fruitful area in which the radiation protectionist must assist in the education of the public. First, however, we have to persuade the media (and I use the term rather broadly now) that they have a national obligation to assist the country in educating its public about radiation matters.

Attacks on the news media for one reason or another are common as is their own defense under the First Amendment. However, in my opinion, the First Amendment .. is an essential bulwark of freedom....[but] the First Amendment also carries with it an obligation on the part of the press to completely and properly report the news.
In the case of ionizing radiation .... there are constant and continuous violations of this principle... The fact remains that we need greater responsibility on the part of the news media in the objective presentation of uneditorialized news.

c. Laws and Regulations:
There are at least fourteen [federal] agencies [involved in radiation matters], of which six have regulatory responsibilities. Six have research and development responsibilities and three have advisory roles. In the legislative branch of the government, there may be some twenty-four House or Senate committees playing some role in radiation matters (the exact identification of these is not easy).

d. Economics:
There is constant pressure to lower protective standards by some radiation protectionists as well as "consumer advocates" and generally concerned members of the public. Too often their arguments are based mainly on theoretical arguments of effects that have never been observed .... So this is a case of reducing by some factor something that you did not know in the first place. If someone were today to decide on a reasonable de minimis level for radiation exposure, it would probably be found that most of our radiation installations are already well below it.

e. Education:
We need two things: (1) better communication within and between scientific and technical groups on the one hand, and the general public on the other; and (2) much broader education of information to the public. These communication and educational projects should be carried out basically by non-governmental organizations, aided and assisted, however, by some limited government support. In the matter of communication, the radiation protectionist profession must play a stronger role ...

It is my belief that much of the blame for the public's fears and apprehensions with respect to radiation matters is due to our media. There is another criticism that must be directed to the media, namely, their constant use of a small number of individuals who are clearly out of step with the radiation protection community. In the U.S. alone there are some 3500 health physicists and 1800 radiological physicists. Yet the media will, for some newly breaking news story, seek out some of a half a dozen individuals who are willing to make willfully deceptive statements regarding radiation.

f. Scare Books and Articles:
Of a collection of "popular" books published over the last decade or so dealing with radiation matters there is not a single one which is not riddled with half-truths, untruths, and evidence of basic lack of knowledge of nuclear energy or radiation. ... another insidious practice designed to keep the public alarmed about radiation matters .... the constant linkage made between the atomic bomb and any discussions about radiation, including medical and industrial applications.

I plead that we cease the seemingly endless procession of studies, congressional committees, and hearings on the problem of "low level ionizing radiation" .. About this we know what we know and we know what we do not know; there is reasonable and rational agreement as to the degree of disagreement. Either we forget the whole "problem" or we theorize or postulate a dose-effect relationship.

However, .... these technical concepts have been grasped by the press, by the congress, by some government agencies, and hence by the public as established facts, rather than as the scientific ruminations, which they are.
Somehow, we as radiation protectionists must develop an unassailable counter force against such misguided actions as outlined above. This counter force should act with such strength and integrity and persistence as to compel public attention and respect.

John Cameron is an emeritus professor at the University of Wisconsin

jrcamero@facstaff.wisc.edu

Can Low Level Radiation Protect Against Cancer?

Ludwig E. Feinendegen, Victor P. Bond, Charles A. Sondhaus.

Abstract
Effects of ionizing radiation in living tissues are generally viewed in terms of damage. However, low doses of ionizing radiation also cause changes in normal intra- and extracellular signaling, with potentially beneficial consequences.

Cellular signaling is triggered in part by reactive oxygen compounds and may result in the protection of cells against molecular damage, especially that to DNA. As shown in various rodent and human cell systems, radiation induced protective mechanisms serve the cell to defend against, repair, and remove damage to cells. Most of this damage comes from oxidative byproducts of normal metabolism. Only a relatively small fraction of the damage in tissues is caused by ionizing radiation, even at many times the dose from normal background radiation.

Protective responses in various cell systems increase initially with dose up to about 0.1 - 0.2 Gy and then decrease and disappear. These responses are analyzed here in microdosimetric terms. The result contradicts in principle the linear no-threshold (LNT) hypothesis for radiation carcinogenesis. In fact, the incidence of spontaneous cancer may decrease at radiation doses below about 0.1 - 0.2 Gy. This is compatible with epidemiological data.

Introduction
The risk R_E of detrimental effects such as cancer induction from exposure of multicellular tissues to low doses of ionizing radiation is conventionally assumed to be proportional to the absorbed dose D [21]:

R_E = alpha . D (1)

The expression in equation (1) formalizes the "linear no-threshold" (LNT) hypothesis. It states that even the smallest dose of radiation carries a risk of causing cancer. The proportionality constant alpha has the dimension of inverse dose, Gy^-1, and is assumed to be of the order of 10^-4 Gy^-1 for cancer in human tissues exposed to low LET radiation. The average dose from background radiation at sea level is about 2 mGy per year.
This paper analyses various effects of ionizing radiation in living tissues [15], and uses published data on low dose responses of tissues and cells in rodents and humans. The conclusions of this analysis contradict the LNT hypothesis. Living tissue is a nonlinear dynamic system, for which it would indeed be strange if such a simple expression of linearity were true over several orders of magnitude of dose.

Absorbed Dose to Tissue (Macrodose) and to Cells (Microdose)
The absorbed dose D expresses energy/mass. In a macroscopic sense, the SI unit of dose is the gray, Gy, where 1 Gy = 1 J/kg. It has the convenience of being easily measured. However, at low values this macroscopic dose is inappropriate for estimating risk at the cellular level where cancer is induced. Thus, absorbed doses to individual cells, or microdoses, need to be considered.

Ionizing radiation is absorbed in matter by the deposition of discrete amounts of energy along the tracks of charged particles that arise throughout the exposed matter [22]. The amount of energy absorbed from such tracks in a defined microvolume of tissue is here called a microdose. The microvolume of tissue is here taken to have the mass of a typical mammalian cell, 1 ng [3, 8, 12, 16, 28]. The mean microdose from a single interaction is calculated from the frequency of track events, or hits, in the defined micromass. Its value, z1, also called mean specific energy [22], hit-size or cell-dose, is a constant for a given type of radiation. For example, the z1 from 100 kV x-rays is about 1 mGy. According to microdosimetry theory [22], the tissue-absorbed dose D is equal to the product of z1 and N_H/N_e with N_H being the number of hits of any size, and N_e the number of exposed micromasses:

D = z1 (N_H/N_e) (2)

For the same macrodose from any type of radiation, N_H/N_e and z1 are reciprocally related. That is, for a given dose D of high LET radiation the average microdose z1 is proportionately greater, and there are fewer hits for the same D than from low LET radiation. Substituting equation (2) into equation (1) gives

R_E = alpha . z1 . (N_H/N_e),

and with R_E taken as the measured number of cancerous tumors N_q induced in the exposed tissue per N_e cell-equivalent micromasses [4]:

N_q/N_e = alpha . z1 . (N_H/N_e)

This is known as the hit-number-effectiveness function [4]. Multiplying each side of this equation by N_e yields:

N_q = alpha . z1 . N_H. (3)

Note that if the LNT hypothesis is correct, the proportionality constant alpha has the same value from equation 1 to 3.
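The bookkeeping in equations (1) - (3) is easy to check numerically. The short Python sketch below uses illustrative values quoted in the text (z1 of about 1 mGy per hit for 100 kV x-rays, alpha of order 10^-4 Gy^-1); the choices of D = 0.1 Gy and N_e = 10^9 cells are assumptions made only for this example.

```python
# Microdose bookkeeping from equations (2) and (3).
z1 = 1e-3      # mean specific energy per hit, Gy (100 kV x-rays, from the text)
alpha = 1e-4   # assumed LNT risk coefficient, Gy^-1 (order of magnitude from the text)
D = 0.1        # macroscopic absorbed dose, Gy (assumed for illustration)
N_e = 1e9      # number of exposed cell-equivalent micromasses (assumed)

hits_per_cell = D / z1        # N_H / N_e, from equation (2)
N_H = hits_per_cell * N_e     # total number of hits in the tissue
N_q = alpha * z1 * N_H        # expected induced cancers, equation (3)

print(hits_per_cell)  # 100 hits per cell at 0.1 Gy
print(N_q)            # same as alpha * D * N_e
```

Note that alpha . z1 . N_H reduces to alpha . D . N_e, so the per-hit form of equation (3) and the macroscopic LNT expression of equation (1) agree, as required if alpha is the same constant in both.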

The Meaning of Dose Rate
It has long been known that effects in irradiated living tissues depend on the rate at which the dose is absorbed. In microdose terms, the macrodose rate, Gy/unit time, allows one to determine how often on the average an individual cell is hit by a single average track event. The calculation [17] uses equation (2). Thus, D per unit time t is:

D/t = z1 (N_H/N_e) 1/t

or D/t = z1 /(t N_e/N_H)

The denominator (t N_e/N_H) defines the average time interval between two consecutive hits per individual cell. This time interval also gives the average time available for repair of the cell prior to a second hit. This paper only considers short term exposures. A dose rate of 2 mGy per year from 100 kV x-rays would bring a hit to each individual cell in a 70 kg human on average once every 6 months: since z1 from 100 kV x-rays is about 1 mGy, 2 mGy/y corresponds to one 1 mGy hit per cell every 0.5 y. This may be viewed as two short term exposures of individual cells, 6 months apart.
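The dose-rate arithmetic above can be sketched in a few lines of Python; the numbers (z1 of about 1 mGy for 100 kV x-rays, a background dose rate of 2 mGy per year) are those quoted in the text.

```python
# Average interval between hits per cell, from D/t = z1 / (t N_e / N_H):
# rearranging gives interval = z1 / dose_rate.
z1 = 1e-3          # mean specific energy per hit, Gy (100 kV x-rays)
dose_rate = 2e-3   # background dose rate, Gy per year

hits_per_cell_per_year = dose_rate / z1   # 2 hits per cell per year
interval_years = z1 / dose_rate           # 0.5 y between consecutive hits

print(interval_years * 12)  # about 6 months between hits, as stated in the text
```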

Cellular Responses to Microdoses
The response of living tissues to ionizing radiation in the low dose region involves various cellular responses [15] that are triggered mainly by hits in cells. Hit cells may also cause responses in non-hit cells by releasing signal substances and toxic, so-called clastogenic factors [9]. Moreover, hits in the extracellular matrix of the tissue may form products that interact with the cells [6]. Indeed, the extracellular matrix may even reduce the natural incidence of cancer [2]. The response of tissues to low level radiation is, naturally, affected by tissue complexity.

The quantification of cellular responses to hits in tissue is important for evaluating tissue risk at low doses. Cellular responses in tissue can be observed even after single hits per cell from low LET radiation [33]. These responses may have the following consequences [15]:

a) Induction of damage:
Radiation may change the structure of DNA and cause cancer. This is a major concern in radioprotection. High doses of ionizing radiation are known to increase the probability of cancer in the exposed tissues. The greater the dose, the larger is the risk. At low doses, less than 0.2 Gy of low LET radiation, no statistically significant increase of cancer is seen in mammalian tissues [35, 36]. 0.2 Gy corresponds to about 100 times the annual dose from background radiation at sea level and still causes about 100 times less damage to the DNA than does normal cell metabolism [31].

The probability per average hit of inducing a cell in the exposed tissue to cause cancer is assigned the term p_ind. High-dose and high dose rate exposure exceeding about 0.3 Gy induces human leukemia proportionally to dose; at this high dose, p_ind is estimated to be approximately 10^-14 per average hit from low LET radiation in human blood forming stem cells [15]. Interestingly, p_ind per average hit in tissue culture cells [19] is about 10^-5. Thus, the two values of p_ind differ by a factor of a billion, a difference not readily explained only by the difference between the cellular mechanisms of malignant transformation in tissue and in culture.

In this paper, p_ind is assumed to be constant over a wide range of doses from low LET radiation. If an enhancement of p_ind were to result from an increase in the number of hits, the enhanced probability of cancer per hit would become p_ind (1 + p_enh). For low LET radiation, p_enh per average hit is assumed to be zero.

b) Activation of damage control:
Low doses below about 0.2 Gy of low LET radiation affect physiological signaling within and between cells. This effect is due to chemical substances involving oxygen-containing radical compounds that are produced in the irradiated tissue. Altered signaling may change cellular metabolism in a variety of ways and trigger what can be viewed either as physiological responses of complex systems to toxic agents [10], or as adaptive responses [30]. They protect cells against damage to DNA and other biologically important molecules in the exposed tissue, irrespective of the cause of such damage. These protective responses may be grouped into four categories:

  1. Damage prevention by temporarily stimulated detoxification of molecular radical species [11, 13, 14, 20]. This temporary protection of cellular constituents against toxic oxygen-containing radicals was found in mouse bone marrow at a maximum 4 hours after short-term, i.e., acute, exposure to low doses of gamma-radiation; the degree of protection increased with dose up to about 0.1 Gy and then disappeared as the dose exceeded about 0.1 - 0.2 Gy [17, 18].
  2. Damage repair by temporary stimulation of repair mechanisms. Low dose x-irradiation stimulated, i.e., conditioned, the reduction of chromosomal aberrations that occur in cultures of human lymphocytes following large, so-called challenging doses [5, 30, 31, 33, 34, 38]. The degree of protection varied from zero to a maximum with individual donors of these cells. Protection was seen when the challenging dose of 2 Gy was given between about 4 and 70 hours after the conditioning dose of 0.005 - 0.01 Gy. It was not seen when the conditioning dose of 0.01 Gy was given at the very low dose rate of 0.0064 Gy/minute, or when the conditioning dose exceeded 0.1 Gy [32, 33], or when the challenging dose was 4 Gy instead of 2 Gy [34].
  3. Damage removal by induction of apoptosis [23, 29]. Apoptosis is cell death in response mainly to DNA alterations; it is triggered by intracellular signals and eliminates damaged cells from tissue. At low doses of x-radiation, removal of damaged cells outweighed the induction of tissue damage from lost cells [23]. In one study, the incidence of this protective cell death in cultures of human lymphocytes rose up to day 4 after exposure to low LET radiation; it appeared linear with dose between about 0.1 and 2 Gy with a slope of 0.08 per 0.1 Gy [27].
  4. Damage removal by stimulated immune response [1, 26]. Cells of the immune system in rodents responded by stimulated production of T cells during fractionated gamma-irradiation with 0.01 - 0.04 Gy per day for a total of 20 days [26]. The maximum response to acute whole-body gamma-irradiation was at doses between 0.1 - 0.3 Gy. This response improved surveillance of damaged cells over periods of weeks, and eliminated cancerous cells [1, 26]. An improved immune response may result in an increased resistance to common infections and prolong life [37].

The four protective responses all activate cell damage control and are here summed up by a cumulative probability p_prot per average hit. Damage to cells results mainly from oxidative byproducts of normal metabolism and not from low dose irradiation[31]. Thus, radiation-induced protection acts mainly by preventing such damage. Also, existing damage may be eliminated by protective responses. Because protection may be mediated by way of intercellular signaling and by the extracellular matrix, nonirradiated cells may also benefit.

Since DNA damage may induce cancer [7, 37], the radiation-induced protective responses may also protect against spontaneously occurring cancer [37]. The probability of a spontaneous malignant transformation per cell is denoted here by p_spo. In human blood forming stem cells, of which there are about 10^9 in the adult, p_spo per cell causing leukemia is approximately 10^-11 throughout life [17].

As stated above for the human blood forming stem cell, p_ind per average hit is about 10^-14. Hence, the ratio p_spo/p_ind is about 10^3. This is a factor of ten smaller than the ratio of metabolically produced DNA damage and that due to background radiation per cell per day [31].

Regardless of the mechanisms involved, the values of p_spo, p_ind and p_prot are likely to vary with the organism, cell type, and z1. This emphasizes the principal difficulty of predicting the risk of cancer for an individual from a given dose of radiation. Nevertheless, in contrast to p_ind, in different cell systems p_prot decreased when D exceeded about 0.1 to 0.2 Gy of low LET radiation, as was discussed above [17, 32, 33, 34].

The observed detrimental and protective responses in the exposed tissue operate transiently over different time spans. Thus, in order to compare the various p-values numerically, their averages need to be established and normalized to their time of duration. Moreover, a radiation-induced malignant transformation in a single cell may eventually cause a tumor to develop only over a period of perhaps several decades and following a series of subsequent DNA mutations. Therefore, in order to offset one randomly occurring spontaneous transformation leading to later tumor development, either the respective cell must experience a temporary protective response often or an accordingly large number of respective cells must each be temporarily protected once.

Tissue Response as the Sum of Microdose Effects
The preceding section indicates that low doses of low LET radiation that result in single or few hits per cell may initiate cancer and also produce various protective effects against both spontaneous and radiation induced cancer. The following probabilities per cell are here defined:

p_spo = spontaneous malignant transformation throughout life,

p_ind = radiation induced malignant transformation per average hit,

p_enh = enhancement of p_ind per average hit,

p_prot = protection that will prevent cancer from developing in the exposed tissue, per average hit.

The total probability per average hit of causing a cancer in the exposed tissue can then be expressed as [18]:

N_q/N_H = [p_ind + p_ind p_enh - p_prot p_spo - p_prot p_ind - p_prot p_ind p_enh] (4)

With N_q/N_H = alpha . z1 (see equation 3)

and rearranging equation (4):

alpha = [p_ind (1 + p_enh) - p_prot (p_spo + p_ind + p_ind p_enh)] (1/z1) (5)

In Figure 1, equation (5) is abbreviated to

alpha = [{P_ind} - {P_prot}] (1/z1) (6)

with { P_ind} = p_ind (1 + p_enh) ; and {P_prot} = p_prot(p_spo+p_ind+p_ind p_enh).
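
For readers who want to experiment with these terms, equations (5) and (6) can be written as a small function. This is only a sketch of the algebra above; the variable names mirror the p-values defined in the text, and any numbers passed in are the reader's assumptions, not measurements:

```python
def alpha(p_ind, p_enh, p_prot, p_spo, z1):
    """Slope of the dose-effect relation, equations (5)/(6):
    alpha = [{P_ind} - {P_prot}] * (1/z1), where
      {P_ind}  = p_ind * (1 + p_enh)
      {P_prot} = p_prot * (p_spo + p_ind + p_ind * p_enh)
    """
    P_ind = p_ind * (1.0 + p_enh)                      # detrimental term
    P_prot = p_prot * (p_spo + p_ind + p_ind * p_enh)  # protective term
    return (P_ind - P_prot) / z1
```

With p_prot set to zero the function reduces to p_ind (1 + p_enh)/z1, the constant slope assumed by the LNT hypothesis; a dose-dependent p_prot makes alpha itself dose dependent, which is the point of equation (6).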

For low LET radiation, p_enh is taken to be zero. {P_prot} increases up to a dose of about 0.1 - 0.2 Gy. At higher doses it decreases gradually to zero. Since the protective term goes through a maximum in the low dose range, it is impossible for alpha to be a constant as required for the LNT hypothesis [15, 17, 1].

The various p-values estimated above may be used for a crude approximation of the probability of radiation induced leukemia. Thus, the value of p_ind for blood forming stem cells is about 10^-14 per average hit of low LET radiation [15]; p_enh is considered zero for low-LET radiation; p_spo for the same cell type has been estimated to be about 10^-11 throughout life [17]. Thus, the value of the negative term in alpha, p_prot (p_spo + p_ind + p_ind p_enh), would become equal to the value of the positive term p_ind (1 + p_enh) if p_prot were 10^-3 at some dose of low-LET radiation. At this dose, there would be no increase in leukemia and a threshold would exist. With higher values of p_prot the leukemia incidence would be lower than in the control population; conversely, RE in equation (1) would only increase with D, as p_prot decreases following a maximum (see Fig. 1).
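
As a rough check of this threshold estimate, one can set the bracket of equation (5) to zero and solve for p_prot. The inputs are the order-of-magnitude values quoted in the text for blood forming stem cells, not precise measurements:

```python
p_ind = 1e-14   # malignant transformation per average hit (low LET) [15]
p_spo = 1e-11   # spontaneous malignant transformation throughout life [17]
p_enh = 0.0     # taken as zero for low LET radiation

# alpha = 0 when p_ind*(1 + p_enh) = p_prot*(p_spo + p_ind + p_ind*p_enh)
p_prot_threshold = p_ind * (1 + p_enh) / (p_spo + p_ind + p_ind * p_enh)
print(f"{p_prot_threshold:.1e}")  # on the order of 1e-3, as stated
```

Because p_spo dominates the denominator, the result is essentially the ratio p_ind/p_spo, which is why the text's ratio of 10^3 reappears here as a required p_prot of about 10^-3.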

Thus, with regard to radiation-induced cancer, at least two different dose effect curves must be considered. According to equation (6) and as seen in Figure 1, the sum of these dose effect curves generates the net dose effect curve. In Figure 1, D is the product of N_H and the constant z1/N_e making N_H = c' . D. On the vertical axis is the increase (+M) or decrease (-M) of cancers in the exposed tissue compared to the background cancer incidence. Most of these background cancers are due to DNA damage resulting from normal cellular metabolism.

Components of p_prot are easily measured at low doses in various cell systems whereas p_ind is not [37]. If cancer is induced at low doses, i.e., below about 0.2 Gy, it is lost in the statistical noise of the spontaneous cancer incidence. Indeed, a lack of statistically significant changes in N_q after exposure of mammalian tissues to low LET radiation below 0.2 Gy makes it impossible to determine whether detrimental, beneficial, or no effects occur [36, 37]. However, epidemiological and many experimental data support the existence of a threshold or even beneficial (hormetic) effects at low D of low LET radiation [24,25].

Regarding the term alpha for high LET radiation, the relatively high value of z1 may cause p_prot for the exposed cells to be so small as to be ineffective. However, p_ind, p_ind p_enh and p_spo might be offset by a larger p_prot if it is initiated in non-hit cells by intercellular signal substances and specific toxic factors stemming from irradiated cells. Stimuli that are induced by high LET radiation in the extracellular matrix might also affect nonhit cells.

Conclusion
For assessing the probability of radiation induced cancer at low doses, the absorbed dose of importance is that to individual micromasses, the microdose. The macrodose, D, has limited applicability in estimating cancer risk at low doses. The macrodose from a given type of ionizing radiation in tissue is equal to the product of the number of microdose events, or hits, N_H, per number of exposed micromasses, N_e, and the average energy absorbed per hit micromass, z1, i.e., the mean microdose for that radiation. For the calculation of z1 for any type of radiation, the micromass in tissue is taken to be 1 ng, which corresponds to the mass of a typical cell in mammalian tissue. For the same macrodose from any type of radiation, N_H/N_e and z1 are reciprocally related. That is, for a dose D of high LET radiation, the average microdose z1 is proportionately greater, and the hits proportionately fewer, than for the same D of low LET radiation.
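
The reciprocal relation between hit number and mean microdose is easy to illustrate numerically. The z1 values below are illustrative assumptions for a 1 ng micromass (high LET deposits far more energy per hit than low LET), not measured values from the references:

```python
D = 0.01  # macrodose in Gy (10 mGy), same for both radiation qualities

z1_low = 0.001   # assumed mean microdose per hit, low LET (Gy)
z1_high = 0.3    # assumed mean microdose per hit, high LET (Gy)

# From D = (N_H / N_e) * z1, the average number of hits per
# exposed micromass is N_H / N_e = D / z1.
hits_low = D / z1_low
hits_high = D / z1_high

# For the same D, the low LET field produces z1_high/z1_low
# times as many hits as the high LET field.
print(hits_low, hits_high)
```

Under these assumed values, at 10 mGy the low LET field averages about ten hits per micromass, while the high LET field hits only a few percent of micromasses, each with a much larger microdose. This is why a single macrodose figure conceals very different microscopic exposure patterns.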

The structure and function of tissues are determined by cells, the elemental units of life. Cells respond to hits. Adjoining cells may be affected by signal substances from hit cells and irradiated extracellular matrix.

One type of response to low LET irradiation initiates cellular damage, especially to DNA. Other responses protect against or remove DNA damage by stimulating existing cellular mechanisms for DNA damage control. The mis- or unrepaired DNA damage that is constantly caused by oxidative by-products of normal metabolism is estimated to be about 10,000 times greater than DNA damage from background radiation [31]. Mis- or unrepaired DNA damage may cause cancer. It follows that the protection produced by hit cells mainly operates against spontaneous cancer from normal metabolism.

Low-dose, low LET irradiation of different mammalian cell systems shows that the protective mechanisms increase initially with dose up to about 0.1 Gy. As the dose increases above 0.1-0.2 Gy, the protective action decreases and eventually disappears. This biphasic response of the protective action of radiation is in marked contrast to the linear increase of cancer with low doses predicted by the LNT hypothesis. The incidence of radiation induced cancer may be zero at low doses, and the spontaneous cancer incidence may even fall in the dose region around about 0.1 Gy. Many epidemiological and experimental data support this conclusion.

References

  1. Anderson R.E. Effects of low-dose radiation on the immune response in: Biological effects of low level exposures to chemicals and radiation. Ed. E.J. Calabrese. Lewis Pub. Inc. Chelsea, Michigan. 1992, p.95-112
  2. Bissel M.J. personal communication, 1996
  3. Bond V.P., Feinendegen L.E. Intranuclear 3-H thymidine: Dosimetric, radiobiological and radioprotection aspects. Health Phys. 12: 1007-1020 (1966)
  4. Bond V.P., Benary V., Sondhaus C.A., Feinendegen L.E. The meaning of linear dose-response relations, made evident by use of absorbed dose to the cell. Health Phys. 68: 786-792 (1995)
  5. Bosi A., Olivieri G. Variability of the adaptive response to ionizing radiations in humans. Mutation Res. 211: 13-17 (1989).
  6. Boudreau, N., Werb Z., Bissel M.J. Suppression of apoptosis by basement membrane requires three-dimensional tissue organization and withdrawal from the cell cycle. Proc.Natl.Acad.Sci.USA 93: 3509-3513 (1996)
  7. Cleaver J. Defective repair replication of DNA in Xeroderma Pigmentosum. Nature (London) 218: 652-656 (1968).
  8. Cronkite E.P., Robertson J.S., Feinendegen L.E. Somatic and teratogenic effects of tritium. in: Tritium. Moghissi A.A., Carter M.W. eds. Messenger Graphics, Phoenix AZ, 198-209 (1971)
  9. Emerit I., Oganesian N., Sarkisian T., Arutyunyan R., Pogosian A., Asrian K., Levy A., Cernjavski L. Clastogenic factors in the plasma of Chernobyl accident recovery workers: Anticlastogenic effect of Ginkgo biloba extract. Radiation Res. 144: 198-205 (1995)
  10. Feinendegen L.E., Muehlensiepen H., Lindberg C., Marx J., Porschen W., Booz J. Acute effect of very low dose in mouse bone marrow cells: a physiological response to background radiation? in: Biological Effects of Low Level Radiation, IAEA Vienna 459- 471 (1983).
  11. Feinendegen L.E., Muehlensiepen H., Lindberg C., Marx J., Porschen W., Booz J. Acute and temporary inhibition of thymidine kinase in mouse bone marrow cells after low-dose exposure. Int.J.Radiat.Biol. 45: 205-215 (1984).
  12. Feinendegen L.E., Booz J., Bond V.P., Sondhaus C.A. Microdosimetric approach to the analysis of cell responses at low dose and low dose rate. Radiat.Protect.Dosimetry 13: 299-306 (1985).
  13. Feinendegen L.E., Muehlensiepen H., Bond V.P., Sondhaus C.A. Intracellular stimulation of biochemical control mechanisms by low-dose low-LET irradiation. Health Phys. 52: 663-669 (1987).
  14. Feinendegen L.E., Bond V.P., Booz J., Muehlensiepen H. Biochemical and cellular mechanisms of low-dose effects. Int.J.Radiat.Biol. 53: 23-37 (1988).
  15. Feinendegen L.E. Radiation risk of tissue late effect, a net consequence of probabilities of various cellular responses. Europ.J.Nucl.Med. 18: 740-751 (1991)
  16. Feinendegen L.E., Bond V.P., Booz J. The quantification of physical events within tissue at low levels of exposure to ionizing radiation. ICRU-News 2: 9-13 (1994)
  17. Feinendegen L.E., Loken M., Booz J., Muehlensiepen H., Sondhaus C.A., Bond V.P. Cellular mechanisms of protection and repair induced by radiation exposure and their consequences for cell system responses. Stem Cells 13(1): 7-20 (1995)
  18. Feinendegen L.E., Bond, V.P., Sondhaus C.A., Muehlensiepen H. Radiation effects induced by low doses in complex tissue and their relation to cellular adaptive responses. Mutation Res. 358: 199-205 (1996)
  19. Hall E.J., Hei T.K. Oncogenic transformation in vitro by radiation of varying LET. Radiat.Protect.Dosimetry 13: 149-151 (1985).
  20. Hohn-El-Karim K., Muehlensiepen H., Altman K.I., Feinendegen L.E. Modification of effects of radiation on thymidine kinase. Intern.J.Radiat.Biol. 58: 97-110 (1990)
  21. ICRU (International Commission on Radiation Units and Measurements) Radiation Quantities and Units. ICRU, Bethesda, MD, Report 33 (1980).
  22. ICRU (International Commission on Radiation Units and Measurements) Microdosimetry. ICRU, Bethesda, MD, Report 36 (1983).
  23. Kondo S. Health Effects of Low Level Radiation. Kinki Univ.Press, Osaka, Japan; Medical Physics Publishing, Madison, WI; 1993.
  24. Loken M.K., Feinendegen, L.E. Radiation hormesis, its emerging significance in medical practice. Invest.Radiol. 28: 446-450 (1993)
  25. Luckey T.D. Physiological benefits from low level of radiation exposure. Health Phys. 43: 771-785 (1982).
  26. Makinodan T. Cellular and subcellular alteration in immune cells induced by chronic, intermittent exposure in vivo to very low dose of ionizing radiation (ldr) and its ameliorating effects on progression of autoimmune disease and mammary tumor growth. In: Low-Dose Irradiation and Biological Defense Mechanisms; Amsterdam; London; New York; Tokyo. Exerpta Medica 1992, 233-237
  27. Menz R.C. An assay for biological dosimetry based on induction of apoptosis in human T-lymphocytes. Thesis, St. Bartholomew's and the Royal London School of Medicine and Dentistry, University of London, 1996.
  28. NCRP (National Council on Radiation Protection) . Tritium and Other Radionuclide Labeled Organic Compounds Incorporated in Genetic Materials. NCRP, Bethesda MD, Publication 63 (1979)
  29. Norimura T., Nomoto S., Katsuki M., Gondo Y., Kondo S. p53-dependent apoptosis suppresses radiation-induced teratogenesis. Nat.Med. 2: 577-580 (1996)
  30. Olivieri G., Bodycote J., Wolff S. Adaptive response of human lymphocytes to low concentrations of radioactive thymidine. Science 223: 594-597 (1984)
  31. Pollycove M., Feinendegen, L.E. Quantification of human total and mis/unrepaired DNA alterations: intrinsic and radiation induced. To be published
  32. Shadley J.D., Wolff S. Very low doses of X-rays can induce human lymphocytes to become less susceptible to ionizing radiation. Mutagenesis 2: 95-96 (1987).
  33. Shadley J.D., Afzal V., Wolff S. Characterization of the adaptive response to ionizing radiation induced by low doses of X-rays to human lymphocytes. Radiation Res. 111: 511-517 (1987)
  34. Shadley J.D., Wienke, J.K. Induction of the adaptive response by X-rays is dependent on radiation intensity. Int.J.Radiat.Biol. 56: 107-118 (1989).
  35. Shadley J.D., Dai G. Cytogenetic and survival adaptive responses in G phase human lymphocytes. Mutation Res. 265: 273-281 (1992)
  36. UNSCEAR. Ionizing Radiation: Sources and Biological Effects. United Nations, New York, N.Y.: 1988.
  37. UNSCEAR. Sources and Effect of Ionizing Radiation. United Nations, New York, N.Y.: 1994
  38. Wei Q., Matanoski G.M., Farmer E.R., Hedayati M.A., Grossman L. DNA repair and aging in basal cell carcinoma: A molecular epidemiologic study. Proc.Natl.Acad.Sci.USA 90: 1614-1618 (1993)
  39. Wolff S., Afzal V., Wienke J.K., Olivieri G., Michaeli A. Human lymphocytes exposed to low doses of ionizing radiations become refractory to high doses of radiation as well as to chemical mutagens that induce double-strand breaks in DNA. Int.J.Radiat.Biol. 53(1): 39-49 (1988).
  40. Zamboglou N., Porschen W., Muehlensiepen H., Booz J., Feinendegen L.E. Low dose effect of ionizing radiation on incorporation of iododeoxyuridine into bone marrow cells. Int.J.Radiat.Biol. 39: 83-93 (1981).

The Rise and Fall of the Linear No-Threshold (LNT) Theory of Radiation Carcinogenesis

Myron Pollycove, M.D.

Physics, together with its sister Chemistry and daughter Biology, furnishes knowledge of the laws of Nature. The welfare of society depends upon a harmonious interaction between the natural laws governing our environment and physical body and human actions of conscience and integrity. I fully believe in the Hippocratic Oath of the physician to act "for the benefit of my patients, and abstain from whatever is deleterious." Growing together with Nuclear Medicine since 1953, I was concerned with radiation's health effects on our patients and staff. We held to the conservative threshold limits of the Atomic Energy Commission. Later, we adhered strictly to further reductions of exposures to "as low as reasonably achievable," ALARA. The latter reductions were associated with the Linear No-Threshold (LNT) theory that all radiation doses, even those close to zero, are harmful. In other words, low doses are held to have the same effects as high doses, but with lower incidence.

Fully involved with clinical research, teaching, and the diagnosis and treatment of patients in both Nuclear Medicine and the Clinical Laboratory, it never occurred to us to question radiation regulations. These regulations are based upon recommendations of International and National Radiation Protection Committees composed of eminent radiation science specialists. Nevertheless, after 35 years of complete trustful acceptance of radiation protection policy, in the late 80's and 90's peer reviewed publications and conferences began to present data that were incompatible with LNT theory.

Upon retirement, I accepted the position of Visiting Medical Fellow with the US Nuclear Regulatory Commission. I began a careful examination of some published epidemiologic low-dose radiation studies. No statistically significant low-dose (<20 cGy) radiation study was found to support the LNT theory of carcinogenesis and mortality risk. This was confirmed by the National Council on Radiation Protection and Measurements (NCRP) Report 121 (11/30/95), which summarizes the current status of LNT theory:1

"...essentially no human data, can be said to prove or even to provide direct support for the concept of collective dose with its implicit uncertainties of nonthreshold, linearity and dose-rate independence with respect to risk. The best that can be said is that most studies do not provide quantitative data that, with statistical significance, contradict the concept of collective dose...

Ultimately, confidence in the linear no threshold dose-response relationship at low doses is based on our understanding of the basic mechanisms involved. ...[Cancer] could result from the passage of a single charged particle, causing damage to DNA that could be expressed as a mutation or small deletion. It is a result of this type of reasoning that a linear no-threshold dose-response relationship cannot be excluded. It is this presumption, based on biophysical concepts, which provides a basis for the use of collective dose in radiation protection activities".

Dr. Feinendegen has presented microdosimetric evidence of cell and tissue low-dose stimulation of the DNA damage control biosystem. This stimulation is confirmed at the level of the organism as well as the cell by the 1994 report of UNSCEAR. Why, then, aren't we aware of corresponding beneficial effects in humans who have been exposed to low-dose radiation? Regrettably, presentation of this data has been suppressed, deleted, discounted as unreasonable, and unscientifically criticized as implausible or invalid. Concurrently, efforts to present low-dose data that support the LNT theory have led to misrepresentation of their data by authors of four studies:

  • The 1989 Canadian Fluoroscopy Study2 discards the most statistically significant data demonstrating large decreases of breast cancer mortality at 0.15Gy and 0.25Gy cumulative exposures. The study retained insignificant higher dose data so that "The best fit for the data was provided by the linear model..."
  • The 1996 revision of the Canadian Fluoroscopy Study3 states that since low-dose data is uninformative, it is necessary to extrapolate from high-dose data. The authors then lumped together 5 dose categories to form a single 0.01-0.49 Gy dose category.
  • The 1995 Cardis, et al. Study of Nuclear Industry Workers in three Countries4 reports that non-chronic lymphocytic leukemia was significantly associated with chronic low-dose occupational exposure. The authors apply one-sided methodology to their 7 dose categories with a total of 119 deaths in order to discard 86 deaths in the 4 dose categories with fewer observed leukemia deaths than expected. A computer simulation of 5000 deaths was then used to simulate statistical significance for the remaining 33 deaths in the 3 dose categories selected.
  • The 1996 RERF Life Span Study Report 12.5 This report was used in November 1996 to mobilize support for the LNT theory. The International Commission on Radiological Protection (ICRP) under Chairman Roger Clarke and the French Society for Radioprotection reviewed this Life Span Study of Atomic Bomb Survivors which includes the 1985-1990 mortality data.5.6 The ICRP claimed that analysis of this new data shows a statistically significant increased solid cancer mortality at doses as low as 5 cSv. According to Warren Sinclair, President Emeritus of the NCRP and Chairman of the ICRP Committee 1, which analyzes results of health-effects studies, the new results "vindicate" previous recommendations to lower permissible dose limits. "The combination of more data points and a more precise analysis," Sinclair said, "allowed the RERF researchers to state with confidence that excess cancer risk due to radiation was observed at doses as low as 50 mSv."6 The "more precise analysis" does not use the observed excess solid cancer deaths but substitutes estimated excess deaths derived from a model fit that assumes linearity.
  • The report omits statistical analysis of the observed excess solid cancer deaths following exposures of 5 rem (P=0.25) and 15 rem (P=0.56) that demonstrates they are not statistically significant; the lowest significant dose for increased solid cancer mortality is 35 rem (P=0.03). The correct dose for this increased mortality is considerably greater than 35 rem. The revised DS86 dosimetry used gives estimates for neutron radiation from the Hiroshima atomic bomb that are lower by an order of magnitude than both the original T65D dosimetry and the experimental values obtained from neutron activation measurements at the distances from the hypocenter that correspond to low-dose exposures.7

While no statistically significant data support the assumption of monotonic increased risk of cancer with increased low-dose radiation, in recent decades a considerable body of contradictory scientific epidemiologic data has accumulated.

Increased longevity and decreased cancer death rates have been observed in populations of the U.S., China, India, Austria, and the United Kingdom exposed to high natural background radiation. Several recent epidemiologic studies with high statistical significance have reported that exposure to low or intermediate levels of radiation are associated with decreased mortality and/or decreased incidence of cancer:

  • Cancer Mortality in an Irradiated Eastern Urals Population (1994)8 This study reports statistically significant 28% and 39% decreases of cancer mortality in the 50cSv and 12cSv dose groups.
  • Atomic Bomb Survivor Mortality from All Causes (1993)7 Longevity is significantly greater in the exposed survivors than in the unexposed.

  • University of Pittsburgh Residential Radon Study (1995)9 This comprehensive survey, which includes the effect of smoking and more than 60 other confounding factors and analyzes 89% of the U.S. population, many exposed to high residential radon concentrations, shows with very high statistical significance a strong tendency for lung cancer mortality to decrease as radon exposure increases, in sharp contrast to the increasing mortality expected from the LNT theory.
  • U.S. Nuclear Shipyard Worker Study (1991) The UN Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) 199411 reports, "The statistically significant decrease in standardized mortality ratio for deaths from all causes [0.76±0.015] cannot be due to the healthy worker effect alone, since the non-nuclear workers and the nuclear workers were similarly selected for employment and were afforded the same health care thereafter." "The type of work carried out by the three groups was identical, except that the nuclear workers were exposed additionally to 60Co gamma-radiation."11
  • The Canadian Fluoroscopy Study (1989)2 Breast cancer mortality is statistically significantly decreased to 0.66 in women exposed to cumulative doses of 10-20 cGy and to 0.84 in women exposed in the 20-30 cGy dose range.

Despite almost 40 years of intensive search, the LNT theory is not supported by any statistically significant quantitative low-dose (e.g., <20 cGy) data. On the other hand, this "presumption, based on biophysical concepts," is contradicted by the emergence during the past two decades of significant data demonstrating risk decrements in response to low-dose radiation exposures. Risk increments in response to high doses (e.g., >1 Gy) are well documented. The matter is clearly more complex than a simplistic biophysical presumption of linearity. These observations require careful, realistic scientific and public policy discussion based upon current epidemiology and molecular biology.

The complex cell circuitry signaling for growth, division, and death includes both extracellular factors and transcription factors. "...the extraordinary detail and duplicate functions of these circuits are designed so that single disruptions here and there do not create malignant growth. A cell divides without restraint only when its circuitry has been disrupted at a number of key points: multiple mutations are required."12 Intrinsic metabolic mutations occur with very high frequency. "...by fundamental limitations on the accuracy of DNA replication and repair, ...in a lifetime, every single gene is likely to have undergone mutation on about 10^10 separate occasions in any individual human being..."13 The additional relentless continual damage of DNA by reactive oxygen metabolites (O2^-, ·OH, H2O2), comprising 2-3 percent of all oxygen consumed, and by thermal instability, increases this number to more than 10^11 mutations per gene.14.15

"From this point of view, the problem of cancer seems to be not why it occurs, but why it occurs so infrequently. Evidently, ...if a single mutation in some particular gene were enough to convert a typical healthy cell into a cancer cell, we would not be viable organisms."13

Our survival depends on effective defense mechanisms that prevent (anti-oxidants, cell cycle control) and repair (DNA repair enzymes) DNA damage, and that remove about 10^2 mis/unrepaired DNA alterations/cell daily by cell differentiation, programmed cell death (apoptosis), necrosis, and the immune system (Figure 3).11.14.15 Low dose radiation stimulates and increases the effectiveness of this DNA damage control biosystem.

The progressive accumulation of metabolic mutations and an age-related decline of biosystem effectiveness are associated with an increase in the incidence of cancer with the third to fifth power of age.13.16-20 The low incidence of cancer under the age of 50 is usually associated with genetic defects of the biosystem controlling DNA damage.

A whole body radiation background of 1 mGy/year would produce about 10^-7 mutations/stem cell/day.15.21 Exposure to 20 cGy/year would produce 2x10^-5 mutations/stem cell/day. Though this is insignificant compared to the intrinsic metabolic background of about 1 mutation/stem cell/day,15 a very small linear incremental risk of cancer would result theoretically, assuming that the effectiveness of the biosystem controlling DNA damage remains constant. During the past 15 years studies have shown that biosystem control of DNA damage does not remain constant, but adaptively responds with beneficial increased activity to low-dose (e.g., <20 cGy) radiation as well as to low-dose toxic chemical agents11.16.22. As the dose is increased to high dose (e.g., >1 Gy) radiation levels, the DNA damage control biosystem is progressively suppressed and fails.
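
The arithmetic behind these figures is a simple linear scaling from the quoted background rate. A minimal sketch (the anchor rate is the one cited in the text; everything else follows by proportion):

```python
# Quoted anchor: 1 mGy/year of background radiation produces roughly
# 1e-7 radiation-induced mutations per stem cell per day.
RATE_PER_MGY_YEAR = 1e-7

def mutations_per_day(annual_dose_mGy):
    """Radiation-induced mutations/stem cell/day, scaled linearly."""
    return annual_dose_mGy * RATE_PER_MGY_YEAR

background = mutations_per_day(1.0)      # 1 mGy/year
occupational = mutations_per_day(200.0)  # 20 cGy/year = 200 mGy/year

# Compare with the ~1 intrinsic metabolic mutation/stem cell/day
# quoted in the text:
print(background, occupational)
```

Even at 20 cGy/year the radiation contribution (2x10^-5/day) sits more than four orders of magnitude below the metabolic background, which is the comparison this paragraph turns on.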

LNT theory applied to the risk of cancer is based on two assumptions: 1) the biological response of cancer to radiation dose monotonically increases, and 2) all mutations, whether induced by ionizing radiation or other agents, produce a corresponding increase in the risk of cancer, assuming the fraction of DNA damage repaired is constant with dose. These assumptions are not valid. They are contradicted by all statistically significant low-dose epidemiologic data, and they ignore the operative effect of ionizing radiation on the DNA damage control biosystem. Emphasis is placed on the relative difficulty of repairing infrequent double strand breaks (0.4/cell/cSv low-LET radiation),21 while ignoring the daily removal and control of the unrepaired breaks, together with trillions of other metabolic misrepaired or unrepaired DNA alterations, by the adaptive responses of differentiation, programmed cell death (apoptosis), necrosis, and the immune system. Disregarded are the extremely high background of spontaneous metabolic mutations and the adaptive responses to radiation that, until they diminish with age, very effectively prevent, repair, and remove both the spontaneous and the relatively few low-dose, low-dose-rate environmental mutations.

Contrary to the increased risks associated with injury to the DNA damage-control biosystem by high-dose radiation, this biosystem is stimulated by low-dose radiation to function even more effectively and decrease the risks of mortality and cancer. These observations of fundamental biologic cellular functions and corresponding epidemiologic studies contradict the theoretical assumptions based on biophysical concepts and exclude a LNT dose-response relationship.

Nevertheless, since 1959 the LNT theory has remained the basic principle of all radiation protection policy. This theory is used to generate collective dose calculations of the number of deaths produced by background radiation. The increase of public fear through repeated statements of deaths caused by "deadly" radiation has engendered an enormous increase in expenditures now required to protect the public from all applications of nuclear technology: medical, research, energy, disposal, and cleanup remediation. These funds are allocated to appointed committees, the research they support, and to multiple environmental and regulatory agencies. The LNT theory and multibillion dollar radiation activities have now become a symbiotic self-sustaining powerful political and economic force.

Scientific understanding of the positive health effects produced by adaptive responses to low-level radiation would result in a realistic assessment of the environmental risk of radiation. Instead of adhering to non-scientific influences on radiation protection standards and practice23 that impair health care, research, and other benefits of nuclear technology, and waste many billions of dollars annually for protection against theoretical risks, these resources could be used productively for effective health measures and many other benefits to society.

Myron Pollycove is a Visiting Medical Fellow at the U.S. Nuclear Regulatory Commission and Professor Emeritus of Laboratory Medicine and Radiology at the University of California, San Francisco.

A Partial Solution to LLW Siting?

John F. Ahearne

"...a radioactive waste dump...will endanger the Colorado River and the people, animals and vegetation which depend upon it -- all the way to Los Angeles."

"What's at stake? Nothing less than the drinking water supply of 12 million Southern Californians...."

"...it is highly unlikely that significant amounts of radioactive material...will reach the ground water...and an even smaller chance of reaching the Colorado River."

These widely different statements do refer to the same site, and do not refer to high level radioactive waste. They refer to the proposed low-level radioactive waste (LLW) site in Ward Valley, in the Mojave Desert. The first two statements are from the Executive Director of Greenpeace and Sen. Barbara Boxer. The last is from the Chair of the National Research Council Ward Valley Committee.

While some of the debate about radioactive waste has focused on geologic repositories proposed for New Mexico (the Waste Isolation Pilot Plant, WIPP, for transuranic waste, TRU) or Nevada (the Yucca Mountain repository for high level waste, HLW), more prevalent are arguments in several states about siting low level waste sites. Low level waste differs from that destined for WIPP and Yucca Mountain in several respects. First, the responsibility for disposal of commercial LLW rests with the states, not with the federal government. The Department of Energy (DOE) does have a substantial amount of LLW generated by defense programs and DOE research, but it has been, and is planned to continue to be, disposed of on DOE sites. Second, unlike HLW and TRU, LLW was thought to be relatively easy to handle. This has turned out to be a monumental misjudgment.

Six commercial sites were developed to receive LLW. By the end of the 1970's, three had been closed because of environmental problems, primarily concerns about the possibility of waste leaking off site. The governors of the states with the remaining three sites became concerned that, while no new sites were being developed, their states had to accept waste from every other state. This led to an initiative in Congress in 1978 for the federal government to take over responsibility for siting new disposal facilities. The National Governors' Association opposed federal control, and the Low-Level Radioactive Waste Policy Act of 1980 retained state control. This Act established a process by which groups of states could band together in "compacts" which could restrict waste disposal to members of the compact.

The compacts were to be in place by 1986, when states could begin barring waste generated outside the compact regions. In 1985, with no new sites being developed, Congress slipped the deadline but added that each compact's facility was expected to be operational by 1993. When one of the three original sites closed in 1993, no new sites had been developed, leaving only two for the entire United States, one at Richland, Washington, and one at Barnwell, South Carolina. The Washington site was closed in 1993 to any state outside of 11 states in two compacts and the South Carolina site was closed in 1994 to all states but those in the 8 state southeastern compact.

Thus, by 1995, although 47 states were members of compacts and 11 states had plans for disposal sites, only Barnwell and Richland were operating. This led to considerable turmoil as waste generators in the states barred from the two sites tried to find alternative disposal methods. The medical community was particularly upset. An NIH official said "Biomedical investigators will have to substitute new and perhaps less effective research procedures which do not rely on radioisotopes. Some may be forced to abandon research projects." Another result, he said, would be "soaring costs" as hospitals try to store wastes on site.

There is now a temporary reprieve. The South Carolina site has been reopened because of a dispute within the compact. The compact agreement was that South Carolina would close its site when the next state opened a LLW site. North Carolina was to be that state. However, North Carolina has been unable (or unwilling) to develop a site, which led South Carolina to request that North Carolina be removed from the compact. When the compact states declined to eject North Carolina, South Carolina reopened its site to all states except North Carolina, while increasing its disposal cost. (In the 1970's, disposal cost was less than $5/ft3; it is now $350/ft3 at Barnwell, of which $235 goes to the state.) Also, Envirocare, a commercial facility in Utah unaffiliated with any state compact, accepts a limited set of LLW.

In October, the compact agreed to give North Carolina another $6.5 million. The local press reported: "The funds would give new life to a project that already has cost $90 million over eight years and has produced only an undeveloped site...." The governor of North Carolina recommended that the other states in the compact share the costs of further studies. The compact rejected this proposal and gave the state until 1 December to accept the existing proposal, which included NC utilities putting up $7 million, or to offer another proposal. The state did neither and, in December, the compact cut off funding to North Carolina. The state then began "the orderly shutdown of the project." Through December, $106 million had been spent on the project. This cost and time, as well as the result, are not unusual. In 1992, Illinois rejected a site that had been studied for 8 years at a cost of $85 million. New York halted its search for a LLW site after 8 years and a cost of over $55 million. The contentious Ward Valley site received a license from the State of California (though one that cannot be exercised until the land is transferred from the federal government) after a licensing process that lasted eight years and cost $45 million.

What goes into these LLW sites? A 1995 General Accounting Office (GAO) report on the status of LLW described the commercial uses of radioactive materials that lead to LLW:

  • operations at nuclear power plants.
  • more than 100 million annual medical procedures.
  • testing and development of about 80% of new drugs.
  • sterilization of consumer products.
  • production of commercial products, such as smoke detectors.

Data on actual disposal is poor. The best is that compiled by the DOE, which divides commercial LLW into five categories:

  • Academic, which includes university-related hospitals and medical research as well as other research facilities.
  • Government, which includes state and non-DOE federal agencies, including NIH.
  • Industrial.
  • Medical, which includes non-university-related hospitals and research facilities.
  • Utilities.

Thus, it is not possible to separate out medical wastes in the DOE compilation. But utility waste has been separated. In 1994, utility waste represented 44% of the volume of radioactive material sent to the two operating LLW sites, but 91% of the radioactivity as measured in curies.

For some states, utilities represented an even larger fraction of the radioactive materials: of the waste shipped in 1995 by New York state generators, 5.3% of the volume and 99.3% of the curies were shipped by power plants.

"...two activities leading to major concern and dread are radioactivity and nuclear power. LLW sites...with waste from nuclear power plants, combine both factors...[they] need not...[if] LLW material from power plants [is stored] on site."

While many states have plans to site and construct a disposal facility, in almost all cases these plans have been halted or delayed. The GAO noted that between 1991 and 1995 all but one compact plan slipped by several years, in all cases to beyond 1996. (The one exception is the compact for a site in Texas, a compact which involves the interesting combination of Texas, Maine, and Vermont -- there is no Congressional requirement that the states in a compact be contiguous.) The reason, of course, is the opposition to siting a radioactive "dump" in a local area. A study by the National Research Council of the aborted effort in New York state concluded that public acceptance is key to success, but also noted that "...some opponents clearly and vocally stated they would not accept any site, regardless of the technical justification...." A Nuclear Regulatory Commission study in 1993 concluded that seven factors negatively affected the siting of LLW facilities, including perceptions that the regulations are inadequate and that long-term storage is more desirable than disposal. However, public and political concern appeared to be the major factor and was linked to many of the others.

I believe a partial solution can be found for a substantial part of the LLW dilemma. Public opposition to LLW sites has been studied by sociologists such as Paul Slovic, who attribute public opposition to activities people dread. These studies indicate that two activities leading to major concern and dread are radioactivity and nuclear power. LLW sites, being predominantly filled with waste from nuclear power plants, combine both factors. They need not, however, if LLW material from power plants is stored on site.

Announcing A Celebration

At The Columbus APS/AAPT Meeting

Leo Szilard: A Symposium In Celebration Of The 100th Anniversary Of His Birth.
Leo Szilard--physicist, inventor, biologist, writer, and sometime diplomat--was the first to conceive of the nuclear chain reaction, the essential features of the carbon-uranium reactor (in collaboration with Enrico Fermi), the cyclotron, and the linear accelerator. In 1939 he proposed and drafted Einstein's letter to President Franklin Roosevelt, which led to the Manhattan Project; and, during a private meeting in 1960, he gained Nikita Khrushchev's assent to a Moscow-Washington hotline.

His interests ranged from statistical physics to information theory to biological evolution, from atomic physics to nuclear strategy and deterrence. His novel, The Voice of the Dolphins, is a parable of the technical prowess and moral limitations of our times. This "man behind the bomb" (as biographer William Lanouette puts it), was also a man who attempted many times during 1945 to prevent the atomic bombing of Japan. Szilard devoted his life primarily to increasing the likelihood that the ever more powerful fruits of science would be used for humanity's benefit.

Program:

  • Greetings from the Embassy of the Republic of Hungary, Istvan Szemenyei, Counselor for Science and Technology
  • "Szilard As An Inventor: Accelerators and More," V.L. Telegdi, University of California at San Diego, and CERN
  • "Leo Szilard: Physics, Politics, and the Narrow Margin of Hope," William Lanouette, Author of Genius in the Shadows: A Biography of Leo Szilard, The Man Behind the Bomb
  • "The Roots of Leo Szilard and his Interdisciplinarity," George Marx, Eotvoes University, Budapest
  • "Toward A Livable World," Edward Gerjuoy, University of Pittsburgh

Laszlo Baksay, University of Alabama and Kossuth University, Debrecen, Hungary, presiding.

Sponsored by the Forum on Physics and Society, the Forum on International Physics, and the Forum on the History of Physics.

At the APS/AAPT meeting in Columbus, on Saturday at 14:30 (2:30pm).

David Schramm

It is with deep sadness that we report the death of David Schramm in a plane crash last December. Dr. Schramm, 52, an astrophysicist at the University of Chicago, member of the National Academy of Sciences, and Chair of the Aspen Center for Physics, was a candidate for vice-chair of the APS last year. A private pilot, he was en route from Chicago to Aspen when his plane crashed in eastern Colorado. He was a leading expert in cosmology, especially in applying nuclear and elementary particle physics to Big Bang nucleosynthesis and supernovae, and many of the leading astrophysicists in the world were his students. In addition to his academic accomplishments, Schramm was an influential figure in governmental science policy and was a superb popularizer of astrophysics. His death is a great loss to physics, and we will miss him.

U.S.-CERN LHC Agreement

On December 8th, representatives of the US and CERN signed an agreement outlining American participation in the Large Hadron Collider (LHC). The Collider is a 7 TeV on 7 TeV proton-proton collider, which is expected to begin operations in 2005. The United States agrees to provide $531 million in services and goods: $450 million from the DOE and the balance from the NSF. Brookhaven, LBL and Fermilab will use $110 million to design and produce advanced systems for the accelerator's interaction regions, and $90 million will be used for procurements from U.S. industrial firms. The U.S. will also provide an in-kind contribution of components valued at $331 million for the massive detectors. The American contribution will cover roughly 10% of the total cost. Energy Secretary Pena said that the agreement marked "the first time the U.S. government has agreed to contribute significantly to the construction, through domestically-produced hardware and technical resources, of an accelerator outside of our borders". NSF Director Lane characterized the LHC as "a quantum leap forward for international cooperation in science and technology, and it also represents a technological challenge of grand scale and proportion". More details can be found in FYI #148 and FYI #152.

Quantum Teleportation

In a recent issue of Nature, a group of physicists announced that they have "teleported" the polarization state of a photon. Several years ago, Charles Bennett of IBM pointed out that one could use an EPR-like setup to transfer the polarization state of one photon to another photon a long distance (even light years) away. Since the distant photon's state remains random until the sender's measurement results arrive over an ordinary classical channel, it is impossible to transmit information superluminally, as Bennett correctly pointed out. The experiment has now been successfully carried out, and there could be important implications for the development of quantum computers. Alas, the authors referred to the phenomenon as "quantum teleportation", and, in the words of Bob Park, "the word 'teleportation' acts like a sex pheromone on trekkies, causing them to swarm in a high state of excitement". Most news reports on the subject discussed applications to a Star-Trek style transporter, conveniently ignoring the fact that superluminal transport of information did not take place (as well as the difficulties in scaling the process up by 23 orders of magnitude....).
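For readers who want to see the mechanics, the protocol itself is easy to simulate on a classical computer. The sketch below is our own construction, not the experimenters' setup: it follows a single abstract qubit rather than photon polarization, and all names in it are ours. Qubit 0 carries the unknown state, qubits 1 and 2 form the entangled pair, and the two "classical bits" are Alice's measurement results.

```python
# A minimal, purely illustrative simulation of the teleportation protocol
# using a 3-qubit state vector (basis |q0 q1 q2>, index = q0*4 + q1*2 + q2).
import math

def teleport(a, b):
    """Teleport the state a|0> + b|1>; return Bob's corrected qubit for
    each of Alice's four possible measurement outcomes."""
    s = 1 / math.sqrt(2)
    # Initial state: (a|0> + b|1>) on q0, Bell pair (|00> + |11>)/sqrt(2) on q1,q2.
    psi = [0.0] * 8
    psi[0b000], psi[0b011] = a * s, a * s
    psi[0b100], psi[0b111] = b * s, b * s
    # CNOT with control q0, target q1: flip q1 wherever q0 = 1.
    psi[0b100], psi[0b110] = psi[0b110], psi[0b100]
    psi[0b101], psi[0b111] = psi[0b111], psi[0b101]
    # Hadamard on q0.
    new = psi[:]
    for i in range(4):                 # i enumerates (q1, q2)
        new[i]     = s * (psi[i] + psi[4 + i])
        new[4 + i] = s * (psi[i] - psi[4 + i])
    psi = new
    results = {}
    for m0 in (0, 1):                  # Alice measures q0 and q1 and sends
        for m1 in (0, 1):              # the two results as classical bits.
            c0 = psi[m0 * 4 + m1 * 2 + 0]   # Bob's unnormalized amplitudes
            c1 = psi[m0 * 4 + m1 * 2 + 1]
            if m1:                     # bit m1 = 1 -> Bob applies Pauli X
                c0, c1 = c1, c0
            if m0:                     # bit m0 = 1 -> Bob applies Pauli Z
                c1 = -c1
            norm = math.hypot(abs(c0), abs(c1))
            results[(m0, m1)] = (c0 / norm, c1 / norm)
    return results

out = teleport(0.6, 0.8)
```

For all four outcomes, Bob's corrected qubit comes out as a|0> + b|1> -- and this is exactly why nothing travels faster than light: until the two classical bits arrive, Bob holds a completely random state.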

End of the Westinghouse Search for Young American Scientists

The Westinghouse search may be ending. This highly esteemed award has been the highest honor that high school science students can achieve. Over the years, Westinghouse scholars have gone on to win 5 Nobel prizes and scores of other honors in science and mathematics. Recently, Science Service, the non-profit organization that administers the awards, was told to look for a new sponsor. Since it acquired CBS two years ago, Westinghouse had been morphing from a technology company to a media giant. Last month, it changed its name to CBS Corp.

Chinese Save American Science

Just when many American scientists thought that the U.S. had reached rock bottom levels of massive scientific illiteracy in the form of UFO-obsessions, psychic fortune tellers, cold fusion, etc., and just when we had reached the highest peaks of embarrassment in talking about our science public-education fiasco, who should come to our rescue but...China!

A front page article of the November 7, 1997 issue of the Wall Street Journal concerns Professor Sun Shili and his Chinese UFO Research Association's use of government grants to harness UFO technology for the benefit of Chinese society. As Professor Sun says, "UFOs are faster than any airplane or car. We hope to use the UFO phenomenon to resolve China's energy and efficiency problems."

Some of the members of Sun's group are deeply involved in research to exploit UFO technology. For example, Liu Zhongkai, an official at the Beijing Meteorological Bureau, has invented something that he describes as a magnetic field that produces as much as a third more energy than it requires to run. He claims that his magnetic field can alter time. "If you live to be 100 on Earth, in my UFO you will be able to live at least 100,000 years."

As to how these people imported UFO technology without ever having been inside a UFO, Ma Ruian explains, "I've studied many photographs of UFOs. In physics, you can work backward to figure out the theory."

Professor Sun, who used to do translation work for Mao Zedong, explains that doing UFO work isn't easy: "Working with UFOs is more complicated than translating for Mao."

Good News From Washington!

During the past few years, federal support for basic scientific research has been declining. Now, in the words of APS President Allan Bromley, "the activities of scientists throughout the country have produced a remarkable turnaround in the way science is treated in Washington". In particular, the National Research Investment Act (S1305), sponsored by Senators Gramm, Lieberman, Domenici and Bingaman, will "invest in the future of the United States by doubling the amount authorized for civilian basic scientific, medical and pre-competitive engineering research during the next ten years". In a letter to the President, the co-sponsors said that "it is imperative that we reverse the recent trend toward decreased federal investment in basic and pre-competitive research if we are to sustain robust growth over the next several decades....We believe that this winter is a critical time for establishing R&D investment as a national priority. Both parties have largely cleared the decks with respect to the agendas they have been pursuing for the last two years and both are seeking new political initiatives to advance. Moreover, recent changes in the projected five-year revenue outlook give both parties more room to maneuver within the confines of the balanced budget agreement." The bill was unanimously endorsed by the APS Executive Board in November, and a similar statement has been endorsed by the presidents of 106 professional societies representing 3 million scientists. As a result of this shift in Washington, research budgets generally rose by 5 to 7 percent for FY 1998. The role of the APS and its members is described in detail in the January issue of APS News. It appears as if a lot of hard work is finally paying off.

There remains much to be done, of course. The bill does not explicitly appropriate funds, and no companion measure is in the House. In December, Dr. Bromley asked all APS members to contact the White House, which was preparing the President's FY 1999 budget. This seems to have worked. The White House's FY 1999 budget request is unprecedented, calling for increases of 11% in the NSF and DOE budgets, with an 8% (5%) increase in overall basic (applied) research. Vice President Gore said that research was one of "the top priorities in this budget". Congress seems to agree. White House sources claim that the letter-writing campaign was a factor in the turnaround. The Administration claims that the costs can be paid from the tobacco settlement.

Representative Ehlers on Science Policy

"Literary intellectuals at one pole--at the other, scientists, and as the most representative, the physical scientists. Between the two a gulf of mutual incomprehension - sometimes (particularly among the young) hostility and dislike, but most of all lack of understanding"--C.P. Snow, 1959.

Congressman Vernon Ehlers (R-Mich.), the only physicist in Congress and vice chair of the House Science Committee, quoted C.P. Snow in an address at the University of Maryland in November. His talk, (see FYI #140), discussed bridging the gap between scientists and non-scientists, and he called on physicists to participate in policy formation for "the good of your country and for your science". He said it has "become fashionable to be ignorant about science" and noted that most Members of Congress are "woefully lacking in scientific knowledge". He criticized Congress for not differentiating between basic and applied research, as exemplified by the move some years ago which called for the NSF to engage in "strategic research". House Speaker Gingrich and House Science Committee Chairman Jim Sensenbrenner have asked Ehlers to conduct a study leading to the writing of a new federal science policy, which is to be forward-looking and is to outline the roles of the federal, state and international governments, industry and universities.

In remarks last fall, Ehlers laid out the rationale for this study: "The basis for our economic engine in this nation is science and technology. The discoveries that we make today are going to fuel the economy 30 to 50 years hence, just as today our economy is fueled by the discoveries of three to five decades ago. Thus it is very important for us to have a national science policy that reflects that change in atmosphere between the U.S. and the rest of the world, that reflects the change in science, that reflects the change in foreign relations, and that, in particular, reflects the change in economic structures in the world today."

This effort started with a meeting of 35 prominent scientists and policy makers, and a similar meeting with "early career" scientists. In each meeting, Ehlers posed a series of policy questions to formulate an overall "vision." These same questions are posted on a science committee web site where Ehlers invites citizen participation.
The questions are: "1. On what broad national goals should federal science policy be based? 2. (a) What is the government's role in supporting basic and applied research? (b) How can the government best encourage an effective level of industry investment in pre-competitive research? 3. How can the nation enhance and make the most effective use of government/university/industry research partnerships? 4. What is the most effective role for the states in supporting university research, and how can the federal government best support that role? 5. (a) Given the increasingly international nature of science, how can the nation best benefit from and contribute to international cooperation in research? (b) What types of multilateral science agreements are needed to facilitate international collaboration? 6. How can the federal government best help meet national needs for science and math education at all levels? 7. How can the nation most effectively leverage federally funded R&D in the face of increasingly constrained resources?"

Ehlers will hold six hearings relating to this study between March 3rd and April 22nd on topics including "defining a science and technology vision," science education, partnerships/collaboration, international science, and "funding sources for research." Dates and a brief statement describing each of these hearings can be found on the study's web site at http://www.house.gov/science/science_policy_study.cfm. This site contains the questions listed above, background materials, and other items of interest. In a section on this web site entitled "We Need Your Help," several issue areas are listed. The science community is also asked to provide information on the hearing topics. Given the magnitude of what Rep. Ehlers and his staff want to accomplish, individuals responding to Ehlers' invitation should bear in mind an observation made on the study's web site: "Remember, the more we have to read, the greater the risk we'll miss your key conclusions!"

Ehlers, who wants this to be an accessible document for his colleagues, describes it as being "concise, coherent, and comprehensive." The target date for the study's completion is the middle of this year. A longer term goal is possible legislation.

Jack Gibbons Retires, Neal Lane to OSTP, Rita Colwell to NSF

(From FYI #29) Three prominent scientists are changing their positions here in Washington. Dr. John Gibbons, Assistant to the President for Science and Technology, and Director of the Office of Science and Technology Policy, has announced his retirement. President Clinton announced his intention to nominate NSF Director Dr. Neal Lane as Gibbons' replacement. Replacing Lane, the President said, will be Dr. Rita Colwell, currently the President of the University of Maryland Biotechnology Institute.

Gibbons tendered his resignation to President Clinton. Gibbons wrote: "It has been an extraordinary honor and privilege to be your science advisor for over five years. I am grateful for the remarkable opportunity to cap my four decades of public service by serving you and our country. While I remain committed to your success, I believe that now is an appropriate time to submit my resignation, to be effective March 15, 1998. I look forward to continuing my efforts to build bridges between people, disciplines, and institutions."

His letter continues, "I take my leave with a sense of deep humility and immense pride -- humility in being associated with great American scientists who have gone before me, pride in this nation's unmatched scientific establishment. The tools of science and technology have provided greater strength, greater resources, and a greater quality of life for all Americans. In private life, I will work as hard as I have in the White House to keep us on the path to scientific preeminence, as well as to ensure that science and technology nurture the values and ideals that gave us birth as a Nation."

President Clinton accepted the resignation with regret during a speech at the annual meeting of the American Association for the Advancement of Science. Clinton thanked Gibbons for his years of service in Washington. Before coming to the White House, Gibbons was the Director of the congressional Office of Technology Assessment.

Commenting on his resignation, Rep. George Brown, Jr. (D-CA) said, "I don't think any science advisor ever served in more trying times for science than did Dr. Gibbons. Crowded by efforts to shrink the deficit, shouted at by ideologically driven voices of irrationality, and sometimes prodded by friends who thought he should do more, Jack's term was not all sweetness and light. But Jack spoke forcefully for reasoned policy and legislation, and he will be remembered as a principled advocate for science in a time when irrational forces might have capsized the enterprise. Jack also worked persistently within the White House to defend science budgets from the competing claims of other worthy needs. The result of those efforts is the superb set of proposals to support science and technology in the President's 1999 budget. In short, Jack is leaving at the top of his game and reaping the applause he so richly deserves for a job well done. We are very old friends and I believe that Washington will be a less interesting place without him."

Gibbons worked with the President and Vice President for more than a year to select his replacement, Neal Lane. Of Lane, Rep. Brown said, "Neal Lane has done a terrific job at NSF. Stepping into the post at OSTP will require that he shift his focus from what is best for the Nation's academic research system to a broader conception of what is in the National interest in all aspects of science and technology. I am sure that he is up to the task and will be an able and talented advisor who will make his mark in White House inner circles." Lane's nomination will go to the Senate Commerce, Science, and Transportation Committee, chaired by Senator John McCain (R-AZ).

Dr. Rita Colwell, nominated as the new director of the National Science Foundation, has a Ph.D. in marine microbiology from the University of Washington. She has served on the National Science Board and has been the president of the American Society for Microbiology, the International Union of Microbiological Societies, and the American Association for the Advancement of Science. Brown said of Colwell: "Rita Colwell has a terrific track record at AAAS. Her experiences both in Washington and academia make her a great choice to head NSF. I look forward to working with both Dr. Colwell and Dr. Lane for many years to come." The Senate Labor and Human Resources Committee will hold a confirmation hearing on Colwell's nomination. Senator James M. Jeffords (R-VT) chairs this committee.

The "Dual-Career Couple Problem"

As the number of couples in which both members are trained in science increases, more and more people are facing issues in finding two science-based jobs in the same geographic location. Many scientists are forced to give up or drastically scale back their careers; others are forced into long-distance commuter marriages; nearly all "science couples" must make major compromises. Given that there are more male scientists than female, a much higher proportion of women scientists must deal with this problem (which thus perpetuates the high male/female ratio in science). In an effort to document the scope of the situation for physicists, and to find examples of solutions to the problems which arise, Marc Sher (News and Electronics Editor of P&S) and Laurie McNeil (past chair of the Committee on the Status of Women in Physics) have put together a questionnaire. Respondents are asked to describe the positive and negative responses of employers to the problem, the effects on their career plans, children, etc. They hope to put the results together this summer and write an article for either Physics Today or APS News this fall. The questionnaire is available at http://physics.wm.edu/survey/

Part-Time/Adjunct Faculty

Two reports have recently appeared which address a growing problem in academia--the extensive use of part-time and adjunct faculty. Although a more pressing issue in the humanities (where up to 50% of faculty are part-time/adjunct), it is a serious problem in the sciences, where 25% of faculty are part-time. This problem is related, of course, to the dual-career couple problem noted in the above news item. One report is by the American Federation of Teachers and can be found at (http://www.aft.org//higheduc/partime.cfm). The other, by eight professional organizations, including the AAUP and the American Historical Association, can be found at (http://www.chnm.gmu.edu/aha/). Among the recommendations of both reports are limits on the use of part-time faculty; salaries for part-time faculty linked to those of full-time faculty, rather than wages per credit hour; rank and eligibility for part-time faculty similar to those of full-time faculty; contracts issued as far in advance of classes as possible; and a much greater degree of job security than is currently available.

Radon and Lung Cancer

In What's New (20 Feb 98), Bob Park reports that the latest National Research Council report on the Biological Effects of Ionizing Radiation deals solely with radon. Its purpose was to consider new evidence on residential lung-cancer risk obtained since the 1989 report. The report blames radon for about 18,000 lung cancer deaths per year, mostly (90%) among smokers. The new figures are still based on a linear-no-threshold extrapolation from data on uranium miners. The panelists insisted that repair mechanisms that may produce a threshold for penetrating radiation (WN 30 Jan 98) are not relevant to alpha particle damage. The report acknowledged that a threshold could exist and not be identified from the data. (cf. articles in this issue) A day earlier, a group called Radiation, Science & Health held its own press conference to argue that, even at the highest residential exposures, radon is not only harmless, but beneficial. Studies by physicist Bernard Cohen (cf. Physics and Society, Jan. 1997) found lung cancer rates are consistently lowest in areas where radon levels are highest. Such "ecological" evidence was dismissed by the new report, which relied entirely on case control studies. As Park says, "this is reminiscent of the debate between physicists and epidemiologists in the EMF wars. The physicists were right."
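The policy stakes of the linear-no-threshold assumption are easy to see in a back-of-the-envelope calculation. The sketch below is purely illustrative: the risk slope, population, and exposure distribution are invented for the example (they are not the committee's numbers), though the parameters were chosen so the linear case reproduces a total of 18,000 deaths per year.

```python
# Illustrative only: invented risk slope and exposure histogram, not the
# NRC committee's figures. Under linear-no-threshold (LNT), projected
# deaths scale with total population dose; with a threshold, the many
# low-level exposures contribute nothing.

RISK_PER_PCI_L = 1e-4   # hypothetical annual lung-cancer risk per pCi/L
POPULATION = 100e6      # hypothetical exposed population

# Hypothetical exposure distribution: (radon level in pCi/L, fraction).
exposure = [(0.5, 0.50), (1.5, 0.30), (4.0, 0.15), (10.0, 0.05)]

def deaths(threshold=0.0):
    """Projected annual deaths; exposures below `threshold` count as zero."""
    return sum(POPULATION * frac * RISK_PER_PCI_L * level
               for level, frac in exposure if level >= threshold)

lnt = deaths()                   # LNT: every exposure counts -> 18,000
thresh = deaths(threshold=4.0)   # repair below 4 pCi/L -> 11,000
```

The point of the sketch is that the two models diverge most where the data are weakest: the low-dose homes that make up the bulk of the population, which is exactly where the miner extrapolation and Cohen's ecological studies disagree.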

Science and Religion: An Evolutionary Perspective

by Walter Isard.

Avebury Publishing Company, 407 pages, $99.95. ISBN 1 85972 4752

Physicists live in the broader civil, cultural, and religious society as well as that of their profession. Hence, many have long been interested in the possibilities of the application of the concepts and methodology of physics to broader social questions, the more so since physics has usually been perceived as "successful" whereas the "sciences of society and culture" are much less so. Many of us feel that we are "needed" out there, that we bring distinctive means for helping to understand the human endeavor and hence further the possibility of its successful practice.

This book, written by an eminent economist, seeks to find commonalities of description and analysis in art, in the physical, biological and social sciences, and in religion. Among these concepts are hierarchy, symmetry and symmetry breaking, determinism and indeterminism, chaos and order, dynamic and linguistic modes, entropy and self-organization, evolution and catalysis, genes and memes, etc. Isard ranges from the "Omega" of Teilhard de Chardin to the "implicate order" of David Bohm. Here is a book wherein the physicist can see unusual applications of his/her usual concepts. Hamiltonians--to macroeconomics; phase transitions--to welfare; force field potentials--to leadership influence; Master and Fokker-Planck equations--to population distributions; etc. Very few readers will be familiar with all of the subjects and sources covered and yet, for the most part, Isard manages to keep the sophisticated reader's interest throughout this wide-ranging discourse. In his search for common elements, he hopes that the diversity of applications will lead to deeper understanding of these elements and hence, as a feedback loop (another common element), to still deeper and more useful applications.

Books with such diverse yet deep coverage, and addressed to general audiences, are becoming more common. For example, I recently came across: Thinking in Complexity: The Complex Dynamics of Matter, Mind, and Mankind by Klaus Mainzer, and Chaos in Discrete Dynamical Systems: A Visual Introduction in 2 Dimensions by Ralph Abraham, Laura Gardini, and Christian Mira. These books raise a number of questions for those of us interested in the mutual interactions and well being of science and society. They cover more diverse physics than is in the working armory of most physicists, let alone non-physicists. Who are these books aimed at? What are their goals, and how realistic are they? Do they succeed? The two books just mentioned are addressed to scientists and technologists, whereas Isard is addressing intellectuals more generally.

Most readers will be familiar with only a subset of Isard's problems and concepts. Isard attempts to explain some of the concepts, but just quotes descriptive phrases for some of the others. I have grave doubts as to whether the explanations are sufficient for novices in these fields. He covers an enormous vocabulary. But is it useful (except perhaps for cocktail party conversation) to acquire vocabulary without real understandings of the concepts and their usage? And can a single book, even of 400 well-written pages, initiate novices into so many mysteries?

For example, what gain in understanding is made by describing the sudden flowering of classical art in 5th century BC Greece as a "Prigogine jump to another far-from-equilibrium state" (p. 217)? Or by describing the short-lived Athenian development of perspective painting as "a non-amplified perturbation" (p. 218)? Even granting a familiarity with the arts in question (there are no illustrations!), one would have to have some notion (which I still don't have) of what an "equilibrium art state" meant before perturbation--large or small--of that state communicated any real insight. Equilibria, and their disturbances, imply forces to the physicist. What are the artistic analogues of these interactions? And what is the point of talking about a "Prigogine jump" from community to individualism (p. 290), when the relevant equilibrium state is not defined? Why talk about symmetry breaking in religion (p. 293) when the underlying symmetry is not evident? In general, what is the gain in transferring the vocabulary of one field to a description of the problems of another field unless an appropriate basic theoretical foundation is first constructed? And can many such structures be erected in a finite practical book? In other words, why should the application of physics ideas to the areas of art, religion, society, etc., be any easier or quicker than their application to the area of physics?

There are also cases where presumed commonalities are pushed so far as to be erroneously or misleadingly used. For example (p. 346), in the High Middle Ages, changes in agricultural technology furthered the growth of monasteries in Western Europe, which in turn contributed to further advances in agriculture. Isard says "The advances were cross-catalytic." But, by the usual definition, the catalyst is not changed by the process of catalysis, whereas the elements in Isard's cross-catalysis were certainly changing each other. If it is desirable to use a physico-mathematical concept, rather than the usual historical term of mutual influence, the appropriate phrase would be "feedback loop."

This is not to say that such books are of no use. Isard demonstrates the power of cross-fertilization of fields by comparing the development of the idea of the gene as the self-perpetuating unit in biological evolution, with that of the "meme" as the self-replicating unit of cultural evolution. His discussion of the co-evolution of genes and memes is clear and interesting and certainly displays the commonalities of the biological and cultural sciences. And there are many historical-political insights which are quite valuable, whether or not they demonstrate commonalities with the natural sciences. For example: "...as a hierarchical structure becomes composed of more and more levels, impersonal relations more and more replace personal ones, and more and more possibilities become available for distorting the flow of information and for allowing corruption to take place and incompetents to hold office" (p. 344).

In addition to the massive melange of ideas, relationships, and historical-social-physical-biological "facts," the book contains ample references and notes, sufficient to allow an interested reader to probe further into any of the concepts introduced by Isard, and perhaps to transcend the line between vocabulary and knowledge. And the sketchy discussions in the book are usually sufficient to raise interest. Perhaps that's all that can be expected in a reasonably-sized book, and this one fits the expectations better than most.

Alvin M. Saperstein

Wayne State University

Particles in our Air: Concentrations and Health Effects

Richard Wilson and John Spengler, editors

Harvard University Press, Boston, MA, 1996, ISBN 0-674-24077-4

This book is an excellent summary of the epidemiological case for the increased stringency of the U.S. Environmental Protection Agency's (USEPA) new particulate standard for fine particles less than 2.5 microns in aerodynamic diameter, called PM2.5. While appropriately referenced to the original literature, the book is written for a general but scientifically literate audience, to present the case for the more stringent PM standard. The case was undoubtedly successful: in 1997 EPA promulgated a very stringent standard in response to the type of data included in this volume. However, the book is clearly biased, as it excludes all of the authors and much of the literature that question the epidemiological point of view, and it draws its conclusions almost exclusively from the epidemiological evidence, which by its nature can only establish associations and cannot define a cause-and-effect relationship. Likewise, this book goes outside the epidemiological evidence to recommend cost-benefit control strategies, which are clearly beyond the expertise of the authors.

When reading this book, consider whether damaging health effects are more likely to be caused by specific chemical effects involving chemicals such as sulfate, or whether all particles of a certain size act in the same fashion, regardless of their chemical composition, to produce mortality.

Wilson's introduction presents the concept of harvesting and notes that several authors have found significant negative correlations consistent with the harvesting hypothesis, but those authors' viewpoints are not included in this book. Wilson notes that when several air pollution variables are collinear, epidemiological studies by themselves cannot distinguish which is the causative agent. The book adds animal data in Chapter 5 and human experimental studies in Chapter 8 to establish the biological plausibility which has been lacking in previous treatments of the subject.

Chapter 2 is a qualitative description of air pollution monitoring which is largely irrelevant to the theme of the book. It does not provide information regarding the extent of monitoring or the efficacy of central stations in measuring personal exposure.

Chapter 3 presents an emissions inventory table with no units, rendering the information almost useless. Contributions of different PM sources are discussed and documented, and differences in chemical composition of fine particles in selected sections of the United States are presented, but hard data on PM2.5 concentrations are lacking. The Gaussian plume equations included are not used, and imply more sophistication than is actually present in this text.

Chapter 4 addresses the crucial question of whether central outdoor ambient air quality monitors are an adequate surrogate measure of individual personal exposure to particulate matter. Since the USEPA has determined that on average, people spend over 80 percent of their time indoors, and since highly individual activities greatly influence indoor personal exposure, can a central monitor correlate with the true cause of health effects, if those health effects are caused by personal exposure? While the authors, Ozkaynak and Spengler, note that small particles have a large potential to penetrate from outdoors to indoors, central station outdoor measurements at current ambient levels are not strongly associated with personal exposures to particulates. Nonetheless, the authors conclude that epidemiology demonstrates a significant positive effect of centrally measured particulate concentrations on public health. It would seem that we must instead conclude that something correlated with centrally measured particulate concentrations causes the health effects.

Chapter 5 should be a crucial link between the associative inferences of epidemiology and the establishment of causation of public health effects. The right studies would establish biological plausibility. Unfortunately, this chapter turns to issues of specific chemical toxicity, focusing primarily on sulfate particles, rather than finding studies that deal solely with particulate matter at or near ambient concentrations, while making the emphatic point that extrapolation from results obtained at high concentrations is not appropriate.

Chapters 6 and 7 discuss the epidemiological evidence that definitely demonstrates an association between particulate matter, normally measured as PM10, and various categories of mortality and morbidity. These studies do not support the finding of a threshold, beneath which health effects do not occur, a concept that underlies the fundamental approach of the Clean Air Act. The CAA requires the setting of standards for public health with a safety margin below the hypothetical threshold, and hence this observation, if correct and if causative, presents a serious problem for our control strategies. Time series studies, which use acute health indicators, avoid some of the confounders, such as smoking and urban lifestyle, to which chronic geographical cross section studies may be subject.

Utell and Samet conclude in Chapter 8 that "neither clinical experience nor review of the literature identify a direct pathophysiological mechanism that can explain the relationship between inhaled particulate and mortality."

Chapter 9 presents a probabilistic model of loss of life expectancy, which is interesting, but does not add to the causality argument.

Chapter 10 is an outstanding summary both of the book and of the national debate, which preceded EPA's action and which continues today. There is little doubt that there is an association of mortality and morbidity rates with particulate matter, but the scarcity of data on PM2.5 and closely associated sulfate does not allow a causative relation to be established, despite the authors' claims. Existing data on animals appear to suggest sulfate, or some other agent in the complex combustion products that make up fine particles, as the cause, but the authors note that there are insufficient data to investigate this obvious confounding factor. The authors also note that the acute time series studies, which measure almost immediate harvesting, and the chronic studies of reduction of life expectancy in years, give the same coefficients, raising the question of whether the acute and chronic studies are actually measuring an association with the same phenomenon. One hopes that statements such as this can open the door to a learned discussion of other possible causes, overcoming the epidemiological fixation on PM2.5. In attempting to move from association to causality, the authors review Hill's attributes. They admit that risk ratios only slightly above unity do not normally constitute a satisfactory strength of association. They conclude that the association is consistent, but their arguments hold only if sulfates and PM2.5 are identical. The temporality criterion seems to be met, since all studies show that disease follows exposure, and certainly the dose-response criterion is met, since the effect increases with increasing dose.

Hill's biological plausibility attribute is the most difficult attribute, and this book does not add evidence to suggest that this attribute has been met. The specificity of the association is fairly focused, but the reliance on cardiovascular elements is one of the continuing lines that divide experts in the air pollution field. Because animal data does not contribute to our understanding as yet, we cannot accept the coherence attribute. In sum, we are no nearer to unanimous fulfillment of Hill's criteria as a result of this book.

Finally, the authors go beyond their title and expertise to attempt a cost-benefit and technology assessment in two pages, using the full cost of a life rather than a marginal benefit consistent with harvesting. The distorted results suggest abundant resources for abatement, which the authors advocate despite a high probability that the wrong control target might be chosen.

In summary, this book should not be read alone. To balance its perspective, the reader might also read the critical review, "Ambient Particles and Health: Lines that Divide," by S. Vedal, in the Journal of the Air and Waste Management Association 47, 551-581 (1997).

Ralph H. Kummler
Dir., Hazardous Waste Mgt. Prog.
Assoc. Dean for Rsch
Coll. of Engineering, Wayne State U.
Detroit, MI 48202

Are We Seeing Global Warming?

K. Hasselmann, Science, 9 May 1997, 914-915

The summary graph and references in this "Science Perspective" tell the story: a visible upward trend in global temperature of about 0.5 C over the past century can now be discerned. The article mentions some of the data analysis justifying this conclusion, which reverses a 1990 Intergovernmental Panel on Climate Change report. Although Hasselmann leans toward attributing the trend to civilization, he points out that the trend alone cannot justify attribution of a cause. The article does not describe long-term data, such as the "Little Ice Age" of the late Middle Ages--which might confound the issue of human vs nonhuman influence.

Maximum and Minimum Temperature Trends for the Globe

D. R. Easterling, et al., Science, 18 July 1997, 364-366

These authors summarize the increasingly accurate air-temperature data available since about 1950 at about 1500 monitoring stations worldwide. They present results showing a rate of increase in nighttime minima averaging about 1.8 C per 100 years, and a corresponding increase in daytime maxima of about 0.9 C. So, these measures of global warming show a decreasing difference in temperature between night and day, with most of the change at night. The authors suggest that increasing cloudiness may help explain this phenomenon.

Millennial Climate Oscillations

Delia Oppo, Science, 14 November 1997, 1244-1246

This perspective on paleoclimatology opens with the statement that it is the Milankovitch cycle of the Earth's orbit that causes long-term climatic changes including ice ages. The author then raises the question of how to explain the too-rapid, too-frequent millennial changes seen in the fossil record.

The author claims that each millennial warming trend is preceded by a 5 to 8 C warming over Greenland, which is followed by some millennia of relative global warmth and then a sudden drop back to global cold. After discussing the frequency of rock fragments from icebergs, which is said to represent the rate of iceberg-ice transport, the author concludes that millennial global temperature changes, including the Little Ice Age ending around 1900, depend mainly on deep-water oceanic convective currents. The suggestion is that the atmosphere-ocean system entails temperature changes amplified by ice sheet extent.

In this article, the author does not state whether the global warming heat flow is supposed to be driven by the Greenland prewarming, or whether the Greenland temperatures merely represent a lower average thermal inertia in that region. Clearly, the heat capacity of the system would be of greatest importance in deciding what the effect might be of the additional atmospheric heat from an increase in greenhouse gasses caused by civilization.

Arctic Environmental Change of the Last Four Centuries

J. Overpeck, et al., Science, 14 November 1997, 1251-1256

This is a must-read paper for anyone interested in the science of the recent global warming. The 18 authors review data from all available sources to conclude that major factors in the end of the Little Ice Age around 1900 were decreased volcanic activity and increased insolation--the latter more because of increased irradiance than greenhouse trapping. This reviewer also would have hoped for an astrophysical observation of a solar cycle--perhaps one correlated with the purportedly "missing" solar neutrino flux of the current day.

The authors make the point that the reflectivity of surface areas covered by snow or ice provides an amplification mechanism for millennial-scale global temperature changes. Regardless of the weight of human activity in the input, the current warming trend will continue at a rate difficult to anticipate.

Thermohaline circulation, the Achilles Heel of Our Climate System: Will Man-Made CO2 Upset the Current Balance?

Wallace S. Broecker, Science, 28 November 1997, 1582-1588.

The precession of Earth's axis of rotation, along with secular changes in its orbit about the Sun, define the Milankovitch frequency components, the shortest having a period of something over 20,000 years. But recent ice cores and other data seem to show drastic global climatic changes over much shorter, millennial-scale, periods.
Water vapor is described by the author as the dominant greenhouse gas; liquid water increases in density with decreasing temperature or with increasing salinity.

The author postulates a deep-sea circulatory pattern with two dense-descent regions, one along the Antarctic shelf, and the other in the North Atlantic. Using the duo as a mechanism of instability, the author presents evidence that past interludes of global warming have stalled this thermohaline circulation merely because of superficial temperature increases. The pattern includes the Gulf Stream which, if stalled, would allow extreme weather patterns jeopardizing current food production in or near Europe.

The author presents the deep-water factor as a fundamental weakness of current climate models, several of which he describes. He points out that a lack of accurate, very long-term data makes it difficult to separate civilization from other causes. Although he cannot offer a prediction, the author advocates reduction in civilization's atmospheric CO2 emissions as a precaution. He suggests delivery of energy in noncarbonic forms, as in hydrogen fuel cells, whose consumption by the end user would produce only water vapor as a waste product.

Sensitivity of Boreal Forest Carbon Balance to Soil Thaw

M. L. Goulden, et al., Science, 9 January 1998, 214-217.

The twelve coauthors of this report used a variety of instruments to measure the carbon flux (as organic material and carbon dioxide) from a black spruce forest in Canada. The usually-frozen soil contained an average of about 150 metric tons of carbon per hectare, as compared with about 50 tons in wood. Extrapolated world-wide, such boreal soils contain around 3 x 10^11 tons. If oxidized entirely, the total could increase atmospheric CO2 by some 50%.
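The extrapolation above can be checked on the back of an envelope; the boreal soil area and the atmospheric carbon stock used below are assumed round numbers for illustration, not figures from the report:

```python
# Rough consistency check of the boreal soil carbon extrapolation.
# Assumed inputs (not from the paper): boreal soil area ~2e9 ha,
# atmospheric carbon stock ~750 Gt C (roughly the late-1990s value).
soil_carbon_per_ha = 150.0    # metric tons C per hectare (from the report)
boreal_area_ha = 2.0e9        # hectares of boreal soils (assumed)
soil_carbon_total = soil_carbon_per_ha * boreal_area_ha  # ~3e11 t, as in the text

atmospheric_carbon = 7.5e11   # metric tons C in the atmosphere (assumed)
fraction = soil_carbon_total / atmospheric_carbon

print(f"total boreal soil carbon ~ {soil_carbon_total:.1e} t")
print(f"potential atmospheric CO2 increase ~ {fraction:.0%}")
```

With these round inputs the ratio comes out near 40%, consistent with the article's "some 50%" given the large uncertainties in both the soil area and the atmospheric stock.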

Monitored from 1994 to 1997, the soil was found to have lost an average of about 0.3 +/- 0.5 ton of carbon per hectare per year. This reviewer is reminded of peat bogs which may support underground spontaneous combustion when drained. The authors attribute the average soil carbon loss to ongoing global warming.

The authors do not discuss the mechanism of carbon oxidation in the soil during the brief summer thaws; presumably, it was because of fungal activity. It is unclear to this reviewer how one might predict the shift in equilibrium of carbon balance, should increasingly warmer summers cause an adaptive change in the oxidative mechanism. In any case, as the authors conclude, global warming would be expected to increase the rate of soil carbon oxidation and release into the atmosphere.

John Michael Williams

P. O. Box 2697

Redwood City, CA 94064
