The CPSR Newsletter

Volume 8, No. 3 COMPUTER PROFESSIONALS FOR SOCIAL RESPONSIBILITY Summer 1990

Chris Morris, 1990

Computer Tools for Environmental Policy Analysis: Lessons from Acid Rain
Max Henrion

Computer models have long been a key tool for scientists to understand and predict the emission,
movement, and effects of environmental pollutants. Computer models are also increasingly being built
to evaluate public policies on environmental issues. What role do or should such models play in debates
on environmental policy? Is there a danger that inappropriate or biased models can obfuscate debate or
justify bad policies, accidentally or deliberately? And how can the modelling tools affect this? Acid rain
provides a provocative case history as a focus of computer modelling for environmental policy analysis
over the last decade.

The U.S. Government's National Acid Precipitation Assessment Program (NAPAP), now at the end of its
planned nine-year lifetime, is preparing its final assessment of the situation. This is intended to
summarize a decade of research on acid rain, on which U.S. expenditures over the 1980s have totalled
about half a billion dollars. Among the many products of this research are between twenty and thirty
separate computer models addressing such questions as: how are future emissions of sulfur dioxide,
nitrogen oxides, hydrocarbons and other atmospheric pollutants from power plants, automobiles, and
other sources likely to change in response to proposed legislation? How are these pollutants
transported by the wind, chemically transformed, and deposited as dust or in rain? What impacts do
they have on forests, lakes, fish, crops, human health, and the erosion of ancient monuments? How will
these evolve over time, and how will they be distributed spatially in this country and its neighbors?

These models range from small spreadsheets to predict the effects of smokestack height on local
pollution concentrations, to vast programs that may take hours to run on a Cray, such as models of
photochemical smog formation in the Los Angeles basin. They have been developed by a variety of
research groups, at government labs, universities, and consulting firms. Even after a decade of work,
there are serious problems in trying to link them together to provide an integrated analysis. The
programs run on different machines, with different languages, data formats, assumptions, and levels of
spatial and temporal detail. Many of these models are large programs, developed over a period of time
by a changing cast of programmers, most of whom are primarily scientists rather than specialist
software engineers. Accordingly many are subject to the familiar woes of software development:
incomplete specifications, limited modularity, inconsistent documentation, and incomplete testing.
Effective empirical validation is often difficult or impossible for models designed to predict impacts
beyond past experience.

Policy Models for Acid Rain

A small number of models have been designed from the start to analyze multiple impacts of proposed
policies in a single integrated framework. These impacts may include benefits of reduced damage to
forests, lakes, monuments, and human health, as well as the economic costs of cleanup technology (e.g.,
power plant flue-gas desulfurization, catalytic converters, or conversion to cleaner fuels). Figure 1
shows some key elements of a policy model in the form of an influence diagram.

[Figure 1. Influence diagram of an integrated policy model: pollutant emissions, atmospheric transport,
atmospheric deposition, and impacts on forests, lakes, and health, feeding into a net evaluation.]

Such integrated models employ broad-brush approaches, and cannot contain the level of detail possible
in specialized scientific models. But they may use the results of detailed models in abstracted form, for
example representing a complex atmospheric transport model as a simple transfer matrix between
pollutant sources and destination regions.
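To make the abstraction concrete, here is a minimal sketch, in Python, of how such a source-receptor
transfer matrix can stand in for a full transport model. The regions, coefficients, and emission figures
are hypothetical and are not taken from any of the models discussed here.

    # Illustrative only: a transport model abstracted as a source-receptor matrix.
    # transfer[i, j] = kilotons deposited in receptor region i per kiloton emitted
    # by source region j. All numbers are hypothetical.
    import numpy as np

    transfer = np.array([
        [0.20, 0.05, 0.01],    # receptor region A
        [0.10, 0.25, 0.04],    # receptor region B
        [0.02, 0.08, 0.30],    # receptor region C
    ])

    emissions = np.array([1200.0, 800.0, 400.0])    # kilotons of SO2 from three source regions
    deposition = transfer @ emissions               # kilotons deposited in each receptor region
    print(dict(zip("ABC", deposition)))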

Some of these policy models employ techniques from cost-benefit analysis or decision analysis to
address uncertainties and value preferences in explicit quantitative form. Most model parameters, such
as the actual efficiency of new clean-up technologies or the sensitivity of fish to acidity, are subject to
considerable uncertainties. These can be modelled by subjective probability distributions to represent
the range of current engineering and scientific opinions. The uncertainties can then be propagated
through the model to assess the resulting uncertainty in the outputs, such as the number of lakes
without fish, or acreage of forests affected.
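As a minimal sketch of this propagation step (the distributions, constants, and the toy damage relation
below are all hypothetical, not drawn from ADAM or any other model named here):

    # Propagate subjective parameter uncertainty through a toy damage model by
    # Monte Carlo sampling. Every distribution and constant is hypothetical.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000

    scrubber_efficiency = rng.triangular(0.70, 0.85, 0.95, n)        # fraction of SO2 removed
    fish_sensitivity = rng.lognormal(mean=0.0, sigma=0.4, size=n)    # relative dose-response factor

    emissions = 1000.0 * (1.0 - scrubber_efficiency)                 # kilotons after control
    lakes_without_fish = 50.0 * fish_sensitivity * emissions / 300.0 # toy impact relation

    print("median lakes without fish:", round(np.median(lakes_without_fish)))
    print("90% interval:", np.percentile(lakes_without_fish, [5, 95]).round())

The point of the exercise is not the numbers, which are invented, but that the output arrives as a
distribution rather than as a single figure of spurious precision.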

A few policy models also allow explicit specification of multi-attribute utility functions. These
represent value judgments about the appropriate tradeoffs between clean-up costs and environmental
impacts. How many acres of undamaged forest are worth an extra cent per kilowatt-hour on utility
bills? They may also model public attitudes to risk and uncertainty.
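A multi-attribute value function can be sketched in a few lines; the weights, scales, and "worst case"
anchors below are hypothetical and exist only to show how the forest-versus-rates tradeoff becomes an
explicit, inspectable assumption.

    # Sketch of an additive multi-attribute value function. Weights, scales, and
    # worst-case anchors are hypothetical assumptions, stated so they can be argued with.
    def net_value(cost_cents_per_kwh, forest_acres_damaged,
                  w_cost=0.6, w_forest=0.4,
                  worst_cost=2.0, worst_acres=1_000_000):
        # Score each attribute from 0 (worst case) to 1 (best case), then combine linearly.
        cost_score = 1.0 - min(cost_cents_per_kwh / worst_cost, 1.0)
        forest_score = 1.0 - min(forest_acres_damaged / worst_acres, 1.0)
        return w_cost * cost_score + w_forest * forest_score

    print(net_value(0.5, 400_000))   # modest rate increase, moderate forest damage
    print(net_value(1.5, 100_000))   # costly controls, little forest damage

Changing the weights and rerunning is precisely the kind of value-judgment experiment such models are
meant to expose.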

Examples of integrated policy models, which exhibit at least some of these characteristics, include:

• ADEPT, a decision analytic model developed by Decision Focus Inc., a consulting firm, for the Electric
Power Research Institute (EPRI), funded by a consortium of U.S. power companies.

• ADAM (Atmospheric Deposition Analysis Model), a probabilistic policy-directed model to assess the
costs and effects of policy in the U.S. and Canada, developed by a team (including myself) at Carnegie
Mellon for the U.S. EPA.

• RAINS, covering impacts over Europe, West and East, developed by the International Institute for
Applied Systems Analysis (IIASA), an internationally staffed and funded think-tank in Austria.

• ACIDRAIN, modelling effects in Northern Europe, developed by Stephen Watson and colleagues at the
University of Cambridge, supported by the U.K. Department of Energy.

The avowed goal of these models is not to provide definitive evaluations of specific policies, but rather
frameworks for exploring the implications of alternative assumptions and conditions. Versions of these
models are (or were) available on personal computers to interested parties. To varying extents and
with varying ease, the parameters and assumptions of these models can be modified to reflect
alternative knowledge, uncertainties, and preferences, and analyzed to explore their implications.

What Are Policy Models For?

Such policy models have been criticized on a number of counts. Some argue that the use of subjective
probabilities and preference values fatally undermines their objectivity, and facilitates their use for
spuriously justifying flawed policies. Others take issue with the fundamental assumption of cost-
benefit and decision analysis that all relevant impacts are quantifiable and commensurable. Even those
who accept that assumption in principle may question the propriety of applying it when the
complexities and uncertainties are so large. Modellers reply that it is precisely in the presence of such
complexities and uncertainties that human intuition is likely to fail and require the support of formal
analysis. The representation of uncertainties allows explicit examination of the precision and
reliability of results. Analysis of the sensitivities of conclusions to uncertain parameters helps
identify which assumptions are critical and which uncertainties, though perhaps intrinsically large, are
unimportant in the specific situation. Explicit representation of value judgments should make it clear
that "objectivity" is not the goal of such analysis. Rather the point is to help identify which
disagreements and subjective judgments are most significant.

These arguments reflect alternative views of the role of modelling. Traditionally the goal is to obtain
the "correct" answer, or at least more accurate predictions. Increasingly policy modellersÑfollowing
Richard Hamming's famous dictum, "The purpose of computing is insight, not numbers"Ñare coming to
recognize that the ultimate goal is improving the insight of policymakers. Most policymakers are
understandably unwilling to treat models as black boxes, and to accept recommendations of the "optimal
decision," subject to mysterious criteria for optimality. But if they can understand the key
assumptions and calculations underlying a conclusion, they may obtain deeper insights which could
eventually lead to better founded decisions.

Of course, a single model, no matter how good, is only one contributor among many to the policy debate.
This dialectic may be informed by data, quantitative analyses, and qualitative arguments from a wide
variety of participants, including government, industry, environmental and citizen groups. To the
extent that rational argument is a factor in the process, policy models can be effective only to the
degree that they are open to review, exploration, critique, and refinement.

Could Better Modelling Tools Help?

This view of environmental models as a medium for participating in the policy debate has some strong
implications for model design. Structural assumptions, mathematical relationships, and parameter
values must be easily accessible, clearly documented, and easily modifiable. Uncertainties should be
explicit. Tools for sensitivity analysis must be provided. Perspicuous graphics should display the
structure and behavior of models to facilitate review, critique and refinement.

As suggested above, traditional approaches to environmental modelling software may do more to
obstruct than support these goals. But newer software technologies suggest ways to break this impasse.
With colleagues at Carnegie-Mellon over the last ten years, I have explored the design and evaluation of
modelling software with these goals in mind. Techniques we have developed include:

• The use of non-procedural language, specifying models by unordered sets of mathematical relations
between variables, instead of encoding models in sequence-specific procedural forms, as in Fortran or
Pascal.


[Figure 2. Example variable specification:
Variable: RainpH     Units: pH     Title: pH of precipitation
Description: Annual average pH of precipitation as a function of the sulfate ion concentration in precipitation.
Definition: RpHIntcpt - RpHSlope * Log10(2*SulfConc + BasepH)
Explanation: Converts from ion concentration, bearing in mind the double charge of the sulfate ion, using a
logarithmic relation and empirical parameters for intercept and slope. See Barkington and Withers (1974).]

• The integration of explanation and documentation of models with their mathematical structure in
easily navigable hypertext. An example variable specification, with its mathematical definition, units,
description, and reference to source material is shown in Figure 2.

• The use of hierarchical influence diagrams to display model structure and dependencies, and support
model exploration. (See Figure 1.)

• The attachment of an uncertainty to every numerical value, which can be combined and propagated by
probabilistic simulation.

• Tools for interactive sensitivity and uncertainty analysis to identify key assumptions and
uncertainties.
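To make the first and last of these ideas concrete, here is a small sketch in Python (it is not Demos, and
the parameter values are hypothetical): a model stated as an unordered set of relations, evaluated on
demand, with a crude one-at-a-time sensitivity test. The formula loosely follows the RainpH definition
shown in Figure 2.

    # Sketch only: declarative relations plus a simple sensitivity analysis.
    from math import log10

    relations = {  # name -> (inputs, function); the order of entries is irrelevant
        "SulfConc":  ([], lambda: 30.0),   # hypothetical sulfate concentration
        "RpHIntcpt": ([], lambda: 5.3),    # hypothetical empirical intercept
        "RpHSlope":  ([], lambda: 0.6),    # hypothetical empirical slope
        "RainpH":    (["RpHIntcpt", "RpHSlope", "SulfConc"],
                      lambda intcpt, slope, conc: intcpt - slope * log10(2 * conc)),
    }

    def evaluate(name, overrides=None):
        """Evaluate a variable by recursively evaluating whatever it depends on."""
        overrides = overrides or {}
        if name in overrides:
            return overrides[name]
        inputs, fn = relations[name]
        return fn(*(evaluate(i, overrides) for i in inputs))

    base = evaluate("RainpH")
    print("base RainpH:", round(base, 2))

    # One-at-a-time sensitivity: nudge each input by 10% and see how RainpH moves.
    for name in ("SulfConc", "RpHIntcpt", "RpHSlope"):
        nudged = evaluate("RainpH", {name: 1.10 * evaluate(name)})
        print(f"+10% {name}: RainpH changes by {nudged - base:+.3f}")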

These techniques have been made available in Demos (Decision Modelling System) as an experimental
testbed to evaluate their effectiveness. Demos has been used to develop a variety of engineering-
economic and policy models, including ADAM, the acid rain model mentioned above. Evaluation studies
suggest that in the hands of skilled users, such software can do much to facilitate clearer, better
documented models that are easier to understand, critique, and modify, and that it encourages more
thorough treatment and analysis of uncertainties. However, like much powerful modelling and
presentation software, it also allows unskilled users to more rapidly create apparently impressive, but
confused and confusing analyses.

The Policy Impact of Environmental Modelling

In recent years European nations have agreed on significant reductions in acid rain precursors. A 1988
directive of the EEC commits members to reduce sulfur dioxide emissions or transborder flows by a
total of 57% from 1980 levels by 2007, and nitrogen oxide emissions by 30% by 1998. In the U.S.,
after long stalling, Congress is now considering major revisions to the 1970 Clean Air Act, with
encouragement from the administration. Likely inclusions are a 10 million ton/year or 50% reduction
in sulfur dioxide from 1980 levels by 2000, and a 2 million ton/year reduction in nitrogen oxides.

One might ask what impact computer modelling has had on the adoption of such policies. It is hard to
find direct evidence. Indeed, it has been argued that U.S. policy during the 1980s entirely ignored the
results of the half billion dollars' worth of acid rain research, and that minor reductions in emissions
were primarily a quid pro quo with Canada in return for trade and military agreements. However, it
does appear that the Bush administration has felt compelled by the rising tide of public opinion to
underpin its environmental rhetoric with a degree of substance in the proposed revisions to the Clean
Air Act. The heightened public concern about acid rain must at least in part be a function of some of
these research findings. In the European context, the protocols detailing allocation of emissions
reductions by nation may respond in part to computer analyses of transborder flows of pollutants,
although the diplomatic niceties of such agreements make such relationships hard to pin down. At any
rate there is little evidence to support early fears that obfuscating or misleading computer models
might have a major negative impact on the policy debate. Nowadays policymakers and laypeople seem to
be generally imbued with a healthy skepticism about computer models.

In the last year or two, research funding and media attention to the "acid rain problem" have begun to
wane. This is not because we now understand acid rain effects, or even that the problems will
necessarily be solved by the prospective cuts in emissions. Rather it is because the complexities and
dangers of acid rain, large though they are, pale into insignificance in the face of those of the
greenhouse effect and global warming, now catching the attention of environmental scientists and the
general public. But an understanding of the complex interactions of scientific research, policy models,
public opinion, and policy debate on acid rain over the last decade, may illuminate these processes on
the global warming issue in coming years.

Computer modelling has long played a central role in scientific research on the greenhouse effect. The
complexities of the physical systems involved make this inevitable. We still need to be alert to the
dangers of being misled by inappropriate models. But careful analysis of the uncertainties can protect
us from spurious precision. And the intensity of the scientific debate makes it unlikely that major
scientific errors will remain unchallenged for long. A new generation of modelling software could
improve accessibility of policy models, and speed the process of review, critique and revision. In this
way there is even the opportunity that new technologies can open up the collaborative process of model
development, and broaden participation in the policy debate.

References

"Evaluating an Information System for Policy Modeling and Uncertainty Analysis," M. Henrion, M. G.
Morgan, I. Nair & C. Wiecha, Journal of the American Society for Information Science, 37(5),1986,
pp. 319-330.

Uncertainty: A Guide to the Treatment of Uncertainty in Quantitative Policy and Risk Analysis, M.
Granger Morgan and Max Henrion, Cambridge University Press, New York, 1990.

"Atmospheric deposition assessment model: Application to regional aquatic acidification in eastern
North America," E.S. Rubin, M.S. Small, C. Bloyd, R. Marnicio, & M. Henrion, Chapter 14 in Impact
Models to Assess Regional Acidification, J. Kamari (ed.), Kluwer Academic Publishers: Dordrecht, The
Netherlands, 1990, pp 253-284.

Max Henrion is an associate professor of engineering and public policy at Carnegie Mellon University in
Pittsburgh. He is currently on leave, and working at the Palo Alto, California laboratory of Rockwell
International, and serving as a consulting professor of medical informatics at Stanford University. He
is a CPSR member.

How Computers Contribute to the Ecological Crisis
C. A. Bowers

Recent reports on global changes in life-sustaining ecosystems, such as the annual State of the World
published by the Worldwatch Institute and the special issue of Scientific American entitled "Managing
Planet Earth," support the conventional thinking that computers are one of the most important
technologies we have available for understanding the extent of the crisis and the steps that must be
taken to mitigate it. Processing scientific data and modelling how natural systems will react to further
changes caused by human activity suggest that the computer is essential to a data-based approach to
understanding the dynamic and interactive nature of an ecology. Having recognized the genuine
contributions that computers make to addressing the ecological crisis, I also want to argue that
computers help reinforce the mindset that has contributed to the disproportionate impact that Western
societies have had on degrading the habitat. Put simply, computers represent a Cartesian epistemology
(an argument that has also been made by Hubert Dreyfus, Terry Winograd, and Theodore Roszak), and
the use of this technology reinforces the Cartesian orientations of our culture -- which includes the
critically important aspect of consciousness, wherein the self is experienced as separate from the
natural world.

This Cartesian way of thinking can be seen in how the lead article in Scientific American, "Managing
Planet Earth," frames the nature of the ecological crisis as a problem of more rational management of
the planet. As the author, William C. Clark puts it, "Managing Planet Earth will require answers to two
questions: What kind of planet do we want? What kind of planet can we get?" The italics were added here
to bring out how a Cartesian way of thinking, with its emphasis on instrumental problem solving, also
strengthens the cultural myth, which has roots much deeper in Western consciousness, of an
anthropocentric universe (that is, "man" is the central figure and must treat the biosphere as a
resource for achieving his purposes). The Cartesian mindset shows up in the special issue of Scientific
American and the annual reports of the Worldwatch Institute in another way that is critically
important to any discussion of how computers relate to the deepening ecological crisis. Although both
publications provide a wealth of data which, according to one of the canons of the Cartesian position, is
supposed to be the basis of rational thought, they totally ignore that culture is part of the problem. In
fact, culture is not even mentioned in these data-based representations of the ecological crisis.

This is particularly surprising because culture, understood here as encompassing both the deep layers
of a symbolic world and the whole range of human activities given distinctive form by the shared
symbolic sense of order, is an aspect of every humanly caused change in the ecosystems now viewed as
endangered. Beliefs, values, uses of technology, economic practices, political processes, and so forth,
while varying from culture to culture, relate directly to population growth, loss of forest cover,
destruction of habitats that threaten species with extinction, warming of the atmosphere, spread of
toxic waste in water supply and top soil, and so forth. The irony is that the researchers who provide
useful data and computer simulations of how natural systems will react under further stress, also
contribute to putting out of focus the contributing role that cultural beliefs and practices play in the
ecological crisis.

The Ecological/Cultural Crisis

The phrase "ecological crisis" should be represented as the "ecological/cultural crisis." When viewed
in this way, we can then begin to consider more fully the cultural orientation that is reinforced not
only by the epistemology embedded in the computer, but also by how the computer is represented to the
public and to students. We can then also open up a discussion of whether it is possible, particularly in
educational settings, to create software programs that take into account the deep levels of culture
(including differences in cultures) which give form to human thought and behavior. This latter
possibility, which may well be beyond the capacity of this Cartesian machine, is important to whether
the computer can be used to help illuminate the cultural patterns that are degrading the habitat. But
first we need to identify other aspects of the Cartesian cultural orientation reinforced by the
computer -- which has become the dominant icon for representing the authority of a particular form of
knowledge.

The Cartesian mindset has distinctive characteristics that set it apart from other cultures that have, in
a variety of ways, evolved along paths that have been more ecologically sustainable, some for many
thousands of years. This is mentioned here not for the purpose of romanticizing these cultures but,
instead, to bring out that one test of a viable culture is its ability to live in balance with its habitat.
This test is perhaps too pragmatically simple for a culture where the abstract theories of philosophers
have been given, in certain powerful circles, more legitimacy than the contextualized forms of
knowledge that have evolved in habitats lacking a margin of surplus that allowed for experimentation
with abstract ideas. But it is the test that all cultures must now meet as we recognize that our surplus
is increasingly illusory.

The Cartesian mindset, in addition to ignoring the nature of culture (and its influence on thought) and
furthering the view of an anthropocentric universe, has other distinctive elements reinforced through
the use of computers. These include what has become in modern Western consciousness the basis for
objectifying the world (that is, Descartes' distinction between res extensa and res cogitans -- which also
served to naturalize the cosmos), a view of the rational process where data becomes the basis of
procedural and constructionist thinking, and an instrumental and explicit problem-solving approach to
a world that is posited as mechanistic in nature.

The dimensions of human life ignored by the Cartesian mindset correspond to the weakness in
computers. Contrary to the myths constructed by Descartes, Bacon, Locke, and other thinkers of this
period, a strong case can be made that most of our knowledge is tacit in nature, learned as analogues that
serve as templates for future experiences, encoded in a metaphorical language that provides a shared
schemata for thinking, and represents a collective interpretation framed by the epic narratives that
constitute the basis of the culture's episteme. As we obtain better accounts of other world views -- Hopi,
Dogon, Koyukon, Confucian cultures in the Far East, and so forth -- it becomes increasingly difficult to
maintain the popularized rendering of Descartes' legacy: the image of a culture- and tradition-free
individual, objective data, and a conduit view of language. The sociology of knowledge (within our own
tradition) and cognitive anthropology point to the cultural basis of thought and behavioral patterns, and
to the way in which each cultural group experiences these patterns as part of their natural attitude --
this also applies to the members of our Cartesian culture whose schemata cannot take into account tacit
and culturally constituted knowledge.

Patterns that Connect the Individual

If we turn to the writings of Gregory Bateson, instead of the findings of cognitive anthropology, we find
an account of human existence expressed in the language of science that challenges the conceptual
foundations of the Cartesian mindset and, at the same time, points to the possibility that primal
cultures (like the Hopi, Koyukon, aborigines of Australia, and so forth) may have taken developmental
paths that are more ecologically sustainable. Unlike the modern Cartesian approach to viewing the
rational process as something that occurs in the head of an autonomous, culture-free individual,
Bateson emphasizes the patterns that connect, the information exchanges that constitute the life of an
entire natural/social system of which the individual is a participating member, and the dangers facing
humans when their conceptual mapping processes (what he calls "determinative memory") are unable
to take into account the information exchanges that signal the condition of the ecology upon which they
are dependent. As Bateson put it, "thus, in no system which shows mental characteristics can any part
have unilateral control over the whole. In other words, the mental characteristics of the system are
immanent, not in some part, but in the system as a whole." (Steps to an Ecology of Mind, p. 316) His
statement that "the unit of evolutionary survival turns out to be identical with the unit of mind," (p.
483) has a strong echo in the culture of primal peoples where human practices and the natural world
are understood as morally interdependent.

Although it is tempting to dwell further on how a consideration of ecologically sustainable cultures
enables us to recognize those aspects of our own belief system that are contributing to the destruction of
our habitat, it is necessary to turn our attention more directly to the question of whether the use of
computers is really helping us understand the ecological crisis in a way that does not perpetuate the
very mindset that has been such an important contributing factor. At some point, accumulating more
data on the extent of environmental damage and producing better computer models of changes in the
ecosystems becomes a distraction from addressing the real challenge -- which is to begin the exceedingly
difficult task of changing the conceptual and moral foundations of our cultural practices. We already
know that the trend line reflecting the demands of cultures on the habitat is upward, and that the trend
line reflecting the sustaining capacity of natural systems is downward. More computer-processed data
may enable us to predict with greater accuracy when we will cross certain irreversible thresholds. But
that will be of little use if we cannot reverse the demands made by cultures whose belief systems
represent the environment as a natural resource and human choices as limited only by a lack of data.
The challenge now is to become aware of our own taken-for-granted culture, and to evolve new
narrative traditions that represent humans as interdependent members of the larger information and
food chains that make up the ecosystems.

Computers, the Environment, and Education

The use of computers in educational settings seems to be where the question of relevance can be most
clearly raised. As educational software ranging from databases to simulation programs has been
written by people who are embedded in the Cartesian/liberal mindset (objective data, autonomous
individuals who construct their own ideas, progressive nature of rationally directed change and
technological innovations, a conduit view of language), it may be premature to reach the conclusion that
the educational uses of computers can only reinforce the Cartesian mindset that has helped,
paradoxically, to create a form of technological empowerment that contributes to the possibility of our
own extinction. As Theodore Roszak points out, the basic relationship in the educational use of
computers involves the mind of the student meeting the mind of the person who wrote the program, and
the mental processes that establish what constitutes the "data." If the mind encountered by students,
mediated of course by the amplification characteristics of computer technology, has never considered
the aspects of human/culture experience ignored by Cartesianism, it would be impossible for the
students to write a program that takes into account the deeper levels of culture. Or, for that matter, it
would be impossible to frame the thought process in a way that enables students to recognize that
language and thought are influenced by the episteme of a cultural group.

The close connection between computers and the form of consciousness associated with print technology
makes it impossible to represent the thought processes of other cultural groups in a way in which
students could enter into their epistemic patterns at a taken-for-granted level. As Eric Havelock and
Walter Ong argue, print makes what is represented appear as data -- abstract, decontextualized, and
rationally apprehended. But it should be possible to move some distance away from the more stultifying
aspects of the Cartesian mindset reinforced through print-based discourse. Software programs that
help illuminate the nature of culture would seem to be a step in the right direction, both in terms of
understanding the symbolic foundations upon which thought and social practices rest, and in terms of
recognizing that culture is part of the ecological crisis. One aspect of culture that needs to be
illuminated, which would be a prelude to considering comparative belief and value systems, is the
metaphorical nature of language. Particularly important would be understanding how the root
metaphors of a cultural group (for us, a mechanistic image of nature) influence the process of analogic
thinking (i.e., choice of generative metaphors) and lead to the existence of iconic metaphors that
encode the earlier process of analogic thinking. Iconic metaphors such as "data," "artificial
intelligence," and "computer memory," are examples of this process of encoding earlier processes of
analogic thinking, which in turn was influenced by the root metaphors taken for granted at that time.
How the metaphorical nature of language provides the schemata for thinking becomes especially critical
to the process of recognizing how current thinking about the ecological crisis largely is framed by the
metaphors central to Cartesianism. Viewing language as encoding the process of analogic thinking also
brings other aspects of culture into consideration: how people in our own past as well as members of
other cultural groups have different views of reality, how the past can influence the present at a taken-
for-granted level, and how the individual is, in actuality, giving individualized expression to shared
cultural patterns. Becoming aware of culture, it should be kept in mind, is just the first step in a
process that must eventually engage the more politically difficult problem of sorting out the cultural
patterns that are ecologically sustainable over the long term.

There is another line of development in educational software that may be fruitful to explore. This could
involve the use of problem-solving simulations framed in terms of the patterns of thinking of other
cultural groups who have lived within the limits of their habitats (this would help students recognize
the assumptions of our culture that ignore the problem of long term interdependency) and the use of
simulations that consider the future ecological impact of our assumptions about human life, material
and technological progress, and rational control of the environment.

The Moral Poverty of the Information Age

With the cultures of the world placing increasing demands on biosystems that are showing signs of
disruption and decline, the most critical aspect of the problem -- at least in terms of the human/cultural
roots of the crisis -- is to change the root metaphors that underlie the foundation of our Western value
system. Serious consideration, for example, should be given to Aldo Leopold's argument that a land ethic
should replace the anthropocentrism of the value orientation that now guides individual decisions --
including our uses of technology. Very succinctly, he argues that an ethical consideration of our
interdependency with the environment, if taken seriously, should lead to "a limitation on freedom of
action in the struggle for existence." Restriction of self for the sake of others, where "others" is
understood as including the entire biotic community, is now paramount to human survival, given the
size of the world's human population and the scale of its technological capacities.

What this will mean for how we use computers is not entirely clear at this time, but one point that now
seems irrefutable is that the future has a moral dimension to it that is ignored by the image of an
"Information Age." The moral dimensions of the ecological crisis bring us back to a central theme of this
discussion: namely, that "data" and simulation models tend to hide the deeper levels of culture. The
transmission of culture, which occurs whenever a language system is used as part of the computing
process, points to a need to consider the cultural orientations that are being reinforced by this
technology, and to ask whether it is part of the solution or part of the problem. The consequences of
taking these concerns seriously are so important that they need to be given a more central place in
future considerations of the educational use of computers and in understanding the influence of this
technology on social change.

C. A. Bowers is professor of education at the University of Oregon in Eugene, Oregon, and the author of
The Cultural Dimensions of Computing: Understanding the Non-Neutrality of Technology (1988).

Sematech, Toxics, and U.S. Industrial Policy: Why We Are Concerned
Lenny Siegel, Ted Smith, and Rand Wilson

Dramatic changes in Eastern Europe have initiated the global warming of the cold war. These changes
present an opportunity to redefine national and global security in terms of domestic needs rather than
continued cold war political posturing. With economic, social and environmental problems in the U.S.
requiring immediate attention, a real debate about U.S. industrial policy for the 90's is emerging.

At the same time, the American high technology manufacturing sector is being buffeted by fierce global
economic competition. Business leaders are demanding relief, but Bush administration ideologues are
opposed to any government intervention. Meanwhile, citizens and workers affected by high tech
industrial development are crying out for help with significant economic, environmental and
occupational health problems.

Within this context, the movement to combat toxics has emerged as one of the most energetic and
potentially powerful social forces of the 90's. Following in the footsteps of earlier labor, civil rights,
peace, and women's movements, the grass-roots environmental movement is galvanizing the American
people to develop pollution prevention strategies that require the development of clean technologies. As
a social movement, we are challenging fundamental conceptions about the organization of production and
consumption in our economy.

All of these threads -- the thawing of the cold war, the emerging debate on industrial policy, the increase
in global high-tech competition, and the growing demand for toxics use reduction -- intersect at
Sematech, a high tech industry/Pentagon research consortium. For business leaders, Sematech holds
the promise of cooperative high tech innovation and the key to "competitiveness." For environmental
advocates, Sematech may provide a model for developing long term solutions to eliminate high tech
hazards. What role should Sematech play in the industrial policy debate?

What is Sematech?

Sematech is a non-profit consortium of fourteen U.S.-owned semiconductor manufacturers, based in
Austin, Texas.1 Half of its annual budget comes from member companies. The other half is paid by
taxpayers through the administration of the Department of Defense.

The leaders of the U.S.-owned semiconductor industry formed Sematech in March 1987 to challenge the
growing success of their Japanese-owned competitors. Though American-based firms retained their
technological edge in chip design, Japan-based companies perfected techniques for the efficient,
reliable fabrication of state-of-the-art integrated circuits.

Superior Japanese manufacturing technology had catapulted Japanese-based firms to dominance in the
world market, especially for dynamic random access memory (DRAM) chips that are common in
computers and mass-marketed worldwide. Consequently, the Japanese now hold more than half the
world market for semiconductors (compared to 28% in 1978). Six of the top ten merchant producers
are also now Japanese.2

Sematech's primary goal is to develop methods and machinery for squeezing more and more circuit
elements onto each flake of silicon.3 Building circuits -- and ultimately computers -- which are faster and
smaller is seen as one of the main goals of the high tech quest. Sematech hopes to replicate patterns
featuring linewidths of only 0.50 microns by 1992, and 0.35 microns by 1993 -- compared to today's
standard of 0.8 microns.4
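For a sense of what those linewidth targets imply, a rough calculation (assuming, as a simplification,
that the number of circuit elements per unit area scales with the inverse square of the minimum
linewidth):

    # Rough arithmetic only: density gain implied by the linewidth targets above,
    # relative to the 0.8 micron baseline, under a simple inverse-square assumption.
    for target in (0.50, 0.35):
        print(f"{target} micron: roughly {(0.8 / target) ** 2:.1f}x the elements per unit area")

That is roughly a 2.6-fold gain at 0.50 microns and a 5-fold gain at 0.35 microns.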

Building smaller and faster circuits, however, requires the use of more solvents and other chemicals to
meet the requirements for "clean" components. As the geometries of production decrease,
more solvents are needed to wash away ever smaller "killer particles" that could jam a circuit. Smaller
and faster may also mean using even more toxic chemicals.

In addition to its own 500 employees, Sematech works with engineers from member companies.
Through the Semiconductor Research Corporation, it funds nine "Centers of Excellence" at U.S.
universities. In cooperation with the Semiconductor Equipment Manufacturers Institute (SEMI), it
sponsors the development of new chipmaking equipment at U.S.-owned companies that specialize in
production equipment.5

Why Federal Aid for Semiconductor Manufacturers?

Arguing that the international competitiveness of the American semiconductor industry is critical to
the economic and military future of our country, Sematech's founders sought Federal aid. They
convinced Congress to allocate $100 million per year for five years to Sematech. Among the strongest
supporters of Sematech are liberal members of Congress who see this program as a giant step toward a
comprehensive national industrial policy.

Sematech's supporters channeled the funds through the Pentagon's Defense Advanced Research Projects
Agency (DARPA) because federal-level politicians believed no civilian agency could win as much
political support as the Pentagon. And DARPA, particularly in the 1960s and 1970s, had shown itself
capable of managing technologies with civilian, as well as military applications.6

Tension between DARPA and member companies delayed Sematech's start-up, but over its first three
years of operation, chipmakers were generally pleased with the agency's approach. In April of this
year, however, Pentagon officials transferred Craig Fields, director of DARPA, reportedly for his
backing of programs to support civilian industry. Fields subsequently left the Department of Defense.

High Tech: A Major Pollution and Health Threat

Manufacture of computer components involves the use of many hazardous substances. The process of
turning silicon into semiconductors is replete with dangerous conditions. Many serious accidents have
occurred in the semiconductor industry that have resulted in industrial illness and significant
environmental contamination. The industry uses toxic gases, solvents, etchants, heavy metals and
volatile organic compounds that can adversely impact workers, communities and the environment. In
addition, huge amounts of contaminated waste by-products are generated and must be handled and
disposed of properly. Recent data from Silicon Valley reveals that the local industry has disposed of more
than 100,000 tons of hazardous waste off-site, and discharged over 12 million pounds of toxic waste
into the environment.

The record of the microelectronics industry has been very poor. Improper handling of toxic substances
in the workplace has harmed both workers and the environment. Employees of electronics firms are
exposed to hazardous materials through spills, accidents, and chronic exposure that can produce severe
burns, respiratory problems and immune system impairment.

Whereas much of the manufacturing and assembly takes place in "clean" rooms, these rooms are
maintained to assure particle-free products rather than for the benefit of workers' health. In the
highly competitive field of semiconductor manufacturing, production demands too often outweigh
worker safety. The result is a worker illness rate reported to the California Division of Industrial
Relations that is three times that of other manufacturing industries.7 Likewise, an epidemiological
study at Digital Equipment Corp. found a miscarriage rate among production workers that was twice as
high as the "norm."

The semiconductor industry has also been a principal cause of groundwater pollution in the Silicon
Valley, the acknowledged "home" of the industry. Silicon Valley has 29 federal Superfund sites -- more
than any other area of the U.S. Most were caused by improper handling of toxic solvents used by
semiconductor firms. More than 150 underground chemical leaks have contaminated over 200 public
and private drinking water wells. In addition, Silicon Valley has more toxic gas storage and usage and is
responsible for discharging more ozone-destroying CFCs than any other area in the country.8

How Sematech Could Help Workers and Communities

Many major semiconductor producers are now spending large sums of money on the cleanup of spills
and leaks, and most are struggling to comply with new laws designed to reduce the risk of
environmental exposure. But the structure of the industry and the short term economic demands it
faces have limited progress in the design of safer production methods.

Chip production is intensely competitive, so companies focus almost all their manufacturing technology
efforts on cutting costs and increasing reliability. Few companies have the additional resources to risk
on entirely new ways of doing things. In addition, most process technology is introduced or improved by
even smaller, independent equipment producers. These firms may be well situated to introduce new
methods, but since they don't currently pay the price of pollution or pollution-control, they have no
incentive to explore alternatives.

Sematech, by bringing producers together under one roof, allows companies to share the risk of new
process development. And by issuing development contracts to equipment makers, Sematech can ask for
or specify pollution prevention objectives directly. For instance, the consortium could help coordinate
goals and timetables for phasing out reliance on toxic solvents, gases, glycol ethers, and other
hazardous production chemicals.

Research in California has shown that the average life expectancy of a high tech facility is six years
compared to thirteen for all other manufacturing in the state. Sematech could also play a leading role in
coordinating investment to maximize community economic stability. High tech's ability to move
production as if plants and workers were pawns in a global chess game has left communities
economically devastated by sudden relocation of work. Sematech could help develop economic impact
statements regarding the longevity of manufacturing facilities and skills needed. These Economic Impact
Statements could accompany investment proposals where state or federal support was desired.

New Directions for U.S. Industrial Policy

Electronics companies and their Congressional allies are challenging the Bush administration's
hostility to civilian industrial policy. Some are even calling for a civilian counterpart to DARPA, a
Commerce Department agency that would fund Sematech and other commercially oriented high-tech
programs. Wharton School of Management Professor Bruce Merrifield, former Assistant Secretary of
Commerce for Technology Policy, has called for such a commerce-based agency that would fund up to
20% of the R&D costs for civilian led technological development.

In the long run, civilian control over Sematech funding would free the consortium from the anomalies
of Pentagon spending. However, funding Sematech is good industrial policy only if the consortium
redesigns its research program to provide good jobs for American taxpayers and develops
technologies to reduce the risk of worker illness and environmental degradation.

More than just civilian oversight of Sematech is needed. Neighbors of semiconductor companies and
representatives of the high-tech workforce should be directly involved in decision making. Only
substantial citizen participation in industrial policy will lead to the economic rejuvenation that policy
makers and chip manufacturers seek.

Semiconductor Bailout: The Wrong Medicine?

Lack of government support did not cause the semiconductor industry's problems. Rather, the type of
subsidy and guidance that the government gives has actually contributed to the current state of affairs.
Hefty U.S. military contracts direct manufacturers towards technology that is rarely transferable to
production for civilian markets.

Most military-funded high-tech projects over the last decade (as opposed to the 1960s) have been of
little benefit to the commercial sector. Other military-funded programs are designed to develop chips
exclusively for military use, or for short-batch chip manufacturing technology. However, Sematech is
an important exception. Unlike many other military programs, up until now, DARPA has allowed
Sematech to focus on commercially useful technologies.

Sematech wasn't approved because of the military importance of chips. Instead, most members of
Congress thought it would help outpace Japan economically. Sematech's military significance is mostly
used as an excuse by people who are reluctant to support a generalized national industrial policy and
are looking for a way to consider semiconductor production an exception.

Grass Roots Participation Needed

Sematech, if supported at all by the government, must be managed by another Federal entity. Pentagon
oversight sets (or reinforces) a precedent that DoD is the best agency of industrial policy. Further, as
the Pentagon is forced to cut back, pressures to tailor Sematech to other DoD research goals will
undoubtedly increase. Most importantly, there is no mechanism through which citizens groups and
workers can shape the objectives of Sematech. DARPA just isn't used to letting many union
representatives and environmentalists sit on its advisory committees.

In response to their problems, the U.S. semiconductor industry calls for trade sanctions, increased
economic protection, and exemptions from U.S. anti-trust legislation. And, as is typical in other U.S.
industries, the semiconductor industry blames Japan for its failures. This nationalistic approach not
only ignores the root causes of the industry's problems, but obscures the decidedly multi-national
character of the semiconductor industry. While the U.S. semiconductor industry is always receptive to
a government handout, many firms are simultaneously in the process of cutting their own deals abroad.
Many major U.S., Japanese, and European firms have already linked up their businesses in joint
ventures, and nearly all U.S.-owned merchant chip makers do most of their assembly offshore.

A more genuine effort to address the problems of the U.S. semiconductor industry would also address the
self-defeating short-term demands of the U.S. economic structure. High interest rates for R&D and
capital equipment financing have contributed more to the demise of the U.S. chip industry than any
other factor. U.S. investors are much more impatient for high returns on their investments than their
Japanese counterparts, who seem willing to accept short-term losses and wait for long-term gains. As
a result, the Japanese are already way ahead in developing manufacturing technology that will dominate
the 1990s.

The Campaign for Responsible Technology

The Campaign for Responsible Technology (CRT) is concerned with the environmental and health
implications of the microelectronics industry and, more broadly, the role of Sematech in national high
tech industrial policy. We believe that the work that Sematech is engaged in must include a strong
commitment to making the research, manufacture and deployment of chip manufacturing technologies
safer and less reliant on toxic substances.

Members of the campaign represent a broad cross-section of citizen, occupational health,
environmental and labor groups that are concerned about these problems. We are primarily from
geographic areas in the U.S. where microelectronics research, development and production is
concentrated.

We believe that Sematech is in a strategic position to help implement badly needed reforms within the
semiconductor industry. We believe that Sematech should:

• develop new technologies and processes that will be less dependent on toxic chemicals;

• introduce new health and safety procedures for microelectronics workers that will reduce the
industry's shocking rates of occupational illness, reproductive hazards and diseases;

• provide increased information and education to Sematech member companies concerning health and
community hazards, and appropriate solutions;

• ensure that scientists, engineers, and other Sematech staff are being trained in source reduction and
toxic use reduction methods, and that they include occupational and environmental health factors in the
design of new products and production processes;

• develop safeguards to ensure that taxpayer investments in new technology create jobs for U.S.
workers;

• create an advisory board composed of labor, environmental, and health professionals to oversee
Sematech's environmental and occupational health programs;

• evolve towards a funding base less dependent on the Pentagon.

For more information about the Campaign for Responsible Technology contact: Ted Smith, Chair, Silicon
Valley Toxics Coalition, 760 North First St., San Jose, CA 95112-6302; or Rand Wilson,
Coordinator, Labor Resource Center, 186 South St. 4th Fl. Boston, MA 02111.

Notes

1. The fourteen member companies are Advanced Micro Devices, AT&T, Digital Equipment, Harris,
Hewlett Packard, Intel, IBM, LSI Logic, Micron, Motorola, National Semiconductor, NCR, Rockwell, and
Texas Instruments. These fourteen member firms represent approximately 80% of U.S. semiconductor
manufacturing capacity at 67 semiconductor plant sites in the U.S. Sematech's annual budget is over
$200 million.

2. Merchant firms sell chips on the open market, while captive semiconductor producers, such as IBM,
produce for intra-company consumption.

3. Originally, some of Sematech's founders wanted to mass produce chips for sale on the open market,
but a few of the larger participants insisted that Sematech's own plant be used only to prove new
technologies in small, demonstration runs. Sematech's own chips are discarded, but the technologies
developed for their manufacture are supposed to be disseminated speedily to member companies.

4. A micron is a metric unit of linear measure which equals one millionth of a meter.

5. Sematech has created the Tool Application Program (TAP) and an Equipment Improvement Program
(EIP) to help U.S. equipment and materials suppliers develop new, or enhance existing, manufacturing
equipment. Sematech's TAP and EIP Programs support suppliers by providing a manufacturing area
within Sematech's fabrication facility as well as a team of engineering and manufacturing specialists to
provide technical support and analysis. Some of the companies that have benefited from these
relationships include ATEQ Corporation, Beaverton, OR; Eaton Semiconductor Equipment Division,
Beverly, MA; Union Carbide Industrial Gases, Inc, (Linde Division) Danbury, CT; Semi-Gas Systems
(subsidiary of Hercules, Inc.) San Jose, CA; Wilson Oxygen and Supply, Austin TX; GCA, Andover, MA;
Hewlett Packard, Palo Alto, CA; NCR, Dayton, OH; ORASIS Corp., Sunnyvale, CA; Westech Systems Inc.,
Phoenix, AZ; Lam Research, Fremont CA; Genus Inc., Mountain View, CA.

6. ARPA, DARPA's predecessor, funded computer science research that helped make possible such
commonly used breakthroughs as computer time-sharing, interactive video displays, and the mouse.

7. Data on occupational illness rate compiled from CAL-OSHA statistics.

8. For more information on the pollution threats from the manufacture of computer components see
Phil Woodward and Ted Smith, The Toxic Life-Cycle of Computer Manufacturing: The Legacy of High-
Tech Development (Silicon Valley Toxics Coalition and The National Toxics Campaign, 1990).

Lenny Siegel is Director of the Pacific Studies Center in Mountain View, California. Ted Smith is Chair
of the Silicon Valley Toxics Coalition in San Jose, California, and he is vice president of the National
Toxics Campaign. Rand Wilson is Coordinator of the Labor Resource Center in Boston.

Simulation and the Systems Approach

Elisabeth C. Odum, Howard T. Odum, and Nils Peterson

As human responsibility for the environment of the Earth increases, more of us must learn how the
Earth system functions and what we can do to sustain it. To make reasonable decisions, we need to
understand how global and local ecological systems interact with economic systems.

An effective way to find solutions is to make models and computer simulations to gain understanding.
First, we make models of the ecological and economic systems. We test the models to see if they act like
the real systems in nature and the economy. If they do, we perturb them, as humans have perturbed the
real systems. Then we try various solutions on the models. All during the process we check our
accuracy against observation and experience. Finally, we can make recommendations for policy changes.

This method of making and running models is useful for teaching principles of science and systems, as
well as for approaching practical problems in environment and economics. The kinds of public policy
suggestions which come from the models can be discussed in classrooms and used in the "real world."

A new computer program, EXTEND, makes it easy to experiment with models on the Macintosh
computer.1 A mouse is used to connect icons on the screen. After the model is drawn, simulation graphs
are produced. The effects of various changes can then be shown by making "what if" changes in the
program. In this article we will show how changes in natural systems which are disturbed by humans
can be modeled and simulated using EXTEND.2 This article describes the new approach but does not give
enough detail to teach it or do it without more specifics. Teaching materials and references are listed at
the end of the article.

Overfishing: A Sample Model

We will build a model of a pond ecosystem, show how it functions naturally, and then demonstrate how
it changes when it is over-fished. First, EXTEND is called up on the Macintosh along with a library of
ecological program blocks which we prepared. Using a mouse, icons representing the parts of the
system are called up on the screen from our library of ecological blocks and connected.


[Figure 1. Pond ecosystem model built from the ecological blocks: sunlight, pond life, sunfish, and bass.]

The resulting pond model is in Figure 1. When you click on each icon, a dialog box appears in which you
enter the appropriate numbers, as in Figure 2 where you can change the quantity of sunlight. When all
the numbers are entered, you tell the program to run the simulation and plot the increase of pond life,
sunfish, and bass (Figure 3).

Figure 2

Figure 3

Then we add an icon of fishermen fishing the bass (Figure 4). When the simulation is plotted after fishing
is started (Figure 5), the quantity of bass goes down, bringing the fishing yield down. Note the data
table under the graph; it keeps track of weights of components with time.

Figure 4

Figure 5

To conduct "what if" experiments, various changes can now be made. For example, if you spent five
times more hours fishing, how would this affect your bass catch and the quantity of bass left in the
pond? Compare the yield in Figure 6, with five times the fishing time, to the yield in Figure 5.

Figure 6

Figure 7
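For readers without EXTEND, the flavor of the over-fishing experiment above can be sketched in a few
lines of ordinary code. The sketch below is not the authors' library of ecological blocks; the structure is
a simple donor-controlled food chain and every coefficient is hypothetical, but it lets the same "what if"
question be asked about fishing effort.

    # Sketch only: a sun -> pond life -> sunfish -> bass chain stepped year by year,
    # with a fishing term. All coefficients are hypothetical.
    def simulate(years=50, fishing_effort=0.0):
        pond_life, sunfish, bass, total_catch = 100.0, 20.0, 5.0, 0.0
        for _ in range(years):
            production = 30.0 * (1 - pond_life / 300.0)   # sunlight-driven, self-limiting growth
            to_sunfish = 0.10 * pond_life                  # donor-controlled grazing
            to_bass = 0.10 * sunfish                       # donor-controlled predation
            catch = fishing_effort * bass

            pond_life += production - to_sunfish
            sunfish += 0.30 * to_sunfish - to_bass - 0.05 * sunfish
            bass += 0.30 * to_bass - 0.10 * bass - catch
            total_catch += catch
        return round(bass, 1), round(total_catch, 1)

    for effort in (0.0, 0.1, 0.5):   # no fishing, a baseline effort, five times that effort
        bass_left, caught = simulate(fishing_effort=effort)
        print(f"effort {effort}: bass remaining {bass_left}, cumulative catch {caught}")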

Principles of Environmental Systems

As well as illustrating real environmental situations, this modeling process demonstrates principles of
environmental systems.

The most important principle is that of interdependence: systems and parts of systems interact with and
affect each other. In the pond, sunlight flows in to stimulate photosynthesis of algae and water
plants, which are eaten by small invertebrates. The energy from this pond life is used by the sunfish
and then by the bass. Wastes from the animals are used as nutrients by the plants.

Food webs can illustrate the principle of hierarchy, where many small items go to support and be
controlled by fewer large items. In the pond example, many small photons of sunlight on the left in
Figure 1 are concentrated up the web to fewer bass at the right. Organisms with faster turnover
(shorter lives) are on the left and those larger and slower on the right. Hierarchy is found in all
systems, with the top of the hierarchy controlling the next unit below it. The energy of the sunfish is
consumed by the bass and the bass control the quantity and quality of the sunfish. The top of the
hierarchy in Figure 4, the fishermen, control the quantity of bass.

The quantities used in the models illustrate the first and second laws of thermodynamics. The first law
states that energy changes form but can always be calculated, as 803 kilocalories of sunfish make 93
kilocalories of bass, with 710 kilocalories going out of the system in waste heat. These figures also
illustrate the second law, which states that in every process energy is depreciated as waste heat.
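
The energy bookkeeping in that example can be checked directly. The few lines below simply restate the
article's figures (803, 93, and 710 kilocalories) and confirm that the first-law balance holds and that
the implied transfer efficiency is about 11.6 percent.

    # Energy bookkeeping for the sunfish-to-bass transfer described above.
    energy_in  = 803.0   # kilocalories of sunfish consumed
    energy_out = 93.0    # kilocalories of bass produced
    waste_heat = 710.0   # kilocalories dispersed as heat

    assert energy_in == energy_out + waste_heat        # first law: energy is conserved
    efficiency = energy_out / energy_in                # second law: most energy degrades to heat
    print(f"transfer efficiency: {efficiency:.1%}")    # about 11.6%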

A characteristic of all systems which survive is recycling. A model of a grassland system illustrates
recycling. Follow the icon diagram in Figure 7. Respiration of grassland creatures produces inorganic
nutrients which recycle back to be used by the grass in photosynthesis. Materials flow around the
system, with wastes from one part of the system necessary for production of another part.

All products must recycle and be used by some part of the system. If wastes are not reused, the system
will not last. The buildup of waste is one of our most serious human environmental problems. One
proposal is that if a product or chemical cannot be recycled, it should not even be made.

Chemical processes are illustrated by the diagrams, too, such as photosynthesis by plants and
respiration by all living organisms.

A Logging Model

Another environmental problem which can be looked at with a model and simulation is cutting logs in
forests. The question in forests across the United States and the world is the same as the fishing
question. How much resource can be taken from the natural system and still keep it sustainable?

Figure 8

A simulation of the logging system over 100 years is shown in Figure 8. You can look at the results
from different points of view. If you are a conservationist, you may be upset because the forest never
grows back to its original biomass. If you are a forest products company manager, you will be more
interested in the money to be made with the yield after 5 years. The owner or investor may be
interested in the long-run money to be made.

In this model, price changes can be tested in "what if" experiments. For example, what will the yield of
logs and money return be if the selling price for logs doubles? Figure 8 is the run with the original
prices and quantities; when the price is doubled, the growth of the forest stays low, but yield and
profits go up.
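
A comparable "what if" can be sketched outside EXTEND as well. The toy model below is not the article's
logging model; its growth rate, harvest rule, and prices are invented, and the harvest fraction is simply
assumed to rise with the log price so that the price-doubling experiment has something to respond to.

    # A toy logging "what if": how do yield and profit respond if log prices double?
    def run_forest(years=100, price_per_unit=10.0):
        biomass, total_yield, total_profit = 1000.0, 0.0, 0.0
        harvest_fraction = 0.02 * (price_per_unit / 10.0)       # higher prices attract more cutting
        for _ in range(years):
            biomass += 0.05 * biomass * (1 - biomass / 1000.0)  # regrowth toward original biomass
            cut = harvest_fraction * biomass                     # annual logging
            biomass -= cut
            total_yield += cut
            total_profit += cut * price_per_unit
        return round(biomass), round(total_yield), round(total_profit)

    for price in (10.0, 20.0):                                   # original price vs. doubled price
        print("price", price, "-> biomass, yield, profit:", run_forest(price_per_unit=price))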

Discussion of this model leads to broader international trade questions. If these logs are being exported
to Japan, is it to our advantage or to theirs? In Japan the average quantity of goods which people can
buy for the dollar is less than in the U.S. If we sell our logs to Japan and spend the money to buy
Japanese goods, this trade is equal in money. But, when you look at the exchange of real goods, the trade
is to Japan's advantage.

Notes

1. EXTEND is a general purpose simulation modeling program available from Imagine That Inc., 7109
Via Carmela, San Jose, CA 95139.

2. This work was done as part of the BioQuest project, centered at Beloit College. The systems symbols
for use with the program EXTEND are available from Center for Wetlands, Phelps Laboratory,
University of Florida, Gainesville, FL 32611.

References

Odum, E. C. and H. T. Odum. Energy systems and environmental education. pp. 213-229 in
Environmental Education, ed. by T. S. Bakshi and Z. Naveh. Plenum, NY: 1980.

Odum, H.T. Unifying Education with Environmental Systems Overviews. pp. 181-199 in
Environmental Science, Teaching and Practice. Proceedings of the 3rd International Conference on the
Nature and Teaching of Environmental Studies and Sciences in Higher Education held at Sunderland
Polytechnic, England, September, 1985.

Odum, H. T., E. C. Odum, M. T. Brown, D. LaHart, C. Bersok, J. Sendzimir, G. B. Scott, D. Scienceman,
and N. Meith. Environmental Systems and Public Policy, Florida Supplement. 1987. Available from
Center for Wetlands, Phelps Laboratory, University of Florida, Gainesville, FL 32611. (An
introductory text.)

Odum, H. T., E. C. Odum, M. T. Brown, D. LaHart, C. Bersok, J. Sendzimir, G. B. Scott, D. Scienceman,
and N. Meith. Energy, Environment and Public Policy, a Guide to the Analysis of Systems. 1988.
Available from Center for Wetlands, Phelps Laboratory, University of Florida, Gainesville, FL 32611.
(An introductory text.)

Odum, H. T., "Simulation models of ecological economics developed with energy language methods,"
Simulation 1989 (August): 69-75.

Elisabeth C. Odum is a professor in the Department of Biology at Santa Fe Community College in
Gainesville, Florida. Howard T. Odum is a professor of environmental engineering sciences at the
University of Florida in Gainesville. Nils Peterson works for From the Heart Software in Eugene,
Oregon.

Balance of the Planet
Simulation by Chris Crawford
Software review by David Krieger

The icon on the desktop lets you know you're taking on a prodigious task: there's Mother Earth, cradled
in two huge hands--yours. This is Chris Crawford's "Balance of the Planet," and you have been appointed
"High Commissioner of the Environment" by the United Nations, with powers to tax industries and
disburse subsidies toward the goal of preserving the ability of the Earth to support life. Your policies
are implemented over five-year turns, and the results presented to you: did more or fewer people
starve due to your actions? How many skin cancers resulted from the deteriorating ozone layer? Did
the level of the oceans rise or fall due to global warming and polar melting? How many species became
extinct?

In "Balance of the Planet", Chris Crawford, author of the popular Cold War simulation "Balance of
Power," has designed a sophisticated and challenging introduction to the complexities of environmental
policy and even ecological science. The game was released on Earth Day, April 22, 1990, to promote
ecological awareness and stimulate discussion. If, like me, you found most of the media "events"
associated with Earth Day to be trivial, embarrassing (the tons of trash left behind at Earth Day
rallies, for example), or just plain dumb, you will revel in "Balance of the Planet" for bringing some
rigor to the discussion. It's a meaningful way to promote the objectives of Earth Day.

How The Game Is Played

You are awarded points based on the state of the Earth: positive points for things like diversity of
species and a sustainable energy economy; negative points for evils like starvation and devastation of
the forests. The game starts you out in 1990 with zero points; the various plus and minus factors are
all mutually balanced. This is deceptive: one of the central points made by the simulation is that things
are already bad and getting worse, and it will take years or decades of dedicated effort to turn them
around. No matter what course you choose, the results of your first few turns will be uniformly
dismal, with your score diving into thousands of negative points. You have until 2035 to turn things
around. A positive score at that time wins; a negative one loses.

"Balance of the Planet" bestows on you the power to tax various products that affect the environment,
such as fossil fuels, chlorofluorocarbons (CFC's), heavy metals, fertilizers, and nuclear energy. You
may distribute the revenues thus raised to a number of environmental subsidies, including family
planning, basic and applied scientific research, and Third World debt-for-nature programs. You are
also held liable for property damage resulting from your policies, such as beachfront property that
gets submerged if the ice caps melt. Your powers have limits: you may raise or lower taxes by no more
than a factor of four in each turn, and you may spend no more on subsidies than you raise in taxes (you
specify subsidy levels as a percentage of total expenditures). When you enter and execute your desired
policies, the program calculates the likely consequences and scores you on the results.
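
The turn structure described above amounts to a small set of constraints on the player's policy choices.
A minimal sketch of those constraints, with placeholder tax names, rates, and subsidy shares rather
than the game's actual values, might look like this:

    # A sketch of the policy constraints described in the review.
    MAX_TAX_CHANGE = 4.0          # taxes may move by at most a factor of four per turn

    def check_taxes(old_taxes, new_taxes):
        for name, new_rate in new_taxes.items():
            old_rate = old_taxes[name]
            if not (old_rate / MAX_TAX_CHANGE <= new_rate <= old_rate * MAX_TAX_CHANGE):
                raise ValueError(f"{name}: change exceeds the factor-of-four limit")

    def check_subsidies(subsidy_shares):
        # Subsidies are entered as percentages of total tax revenue, so they can never
        # exceed what the taxes raise; they only need to sum to at most 100%.
        if sum(subsidy_shares.values()) > 100.0:
            raise ValueError("subsidy shares exceed 100% of tax revenue")

    check_taxes({"fossil_fuel": 1.0, "cfc": 2.0}, {"fossil_fuel": 3.5, "cfc": 0.6})
    check_subsidies({"family_planning": 40.0, "research": 35.0, "debt_for_nature": 25.0})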

Behind the game-playing interface lies the next level of complexity in "Balance of the Planet": the
intricate web of cause and effect which you're trying to guide with your policies. The quantities you
control--taxes levied and subsidies paid out--affect other variables: your Debt-For-Nature subsidy
affects the rate of Forest Clearing; Forest Clearing in turn affects Carbon Dioxide levels, amounts of
Farm Land and Forest Land, and Soil Erosion. Carbon Dioxide levels affect Global Temperature, while
available Farm Land affects Crop Yields and so on. The final results of these calculations are the points
assessed against you for various bad things like Starvation Deaths and Land Abuse. Screens explaining
each variable are laid out in a hypertext format that reflects the structure of their interrelationships;
you can explore this causal network and view the underlying equations that determine the way the
numbers fall.
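
The causal web can be pictured as a chain of small functions, each taking the "upstream" quantities as
inputs. The sketch below is only an illustration of that structure; the coefficients and baseline values
are invented, not taken from Crawford's equations.

    # A toy causal chain in the style the review describes.
    def forest_clearing(debt_for_nature_subsidy):
        return max(0.0, 10.0 - 0.5 * debt_for_nature_subsidy)    # subsidy slows clearing

    def co2_level(clearing):
        return 350.0 + 2.0 * clearing                             # clearing raises CO2

    def global_temperature(co2):
        return 15.0 + 0.01 * (co2 - 350.0)                        # CO2 raises temperature

    def crop_yield(farm_land, temperature):
        return farm_land * (1.0 - 0.05 * (temperature - 15.0))    # warming trims yields

    clearing = forest_clearing(debt_for_nature_subsidy=8.0)
    temp = global_temperature(co2_level(clearing))
    print("clearing:", clearing, "temperature:", round(temp, 2),
          "crop yield:", round(crop_yield(100.0, temp), 1))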

At the next level of involvement, you can modify these equations, both in the way environmental
quantities affect each other (you may disagree with the rate of skin cancer due to ultraviolet exposure,
for instance), and in the way environmental quantities relate to points scored (you may decide that
preserving the global gene pool deserves relatively fewer points than saving human lives). This is one
of the most important features of "Balance of the Planet"--Crawford and his development team recognize
that any simulation incorporates the biases of its programmers, so they make it possible for you to
incorporate your own biases into the simulation and see how things turn out. They provide you with
several ready-made alternative bias files you can load--an Environmentalist bias, a Third-World bias,
a Pro-Nuclear bias, and an Industrialist bias. For instance, the Industrialist bias assumes things are
just fine the way they're going (starvation deaths count for an infinitesimal penalty) and starts you off
with 10,546 points instead of zero!

The Interface

"Balance of the Planet" makes good use of the Macintosh interface; it is visually both stimulating and
informative. (The simulation will also be made available for IBM-compatible machines.) Every screen
in the above-mentioned hypertext features a bar-graph over time for that particular variable--so you
can see the long-term effects your policies have had on, for example, beef production. While the rest of
the game content is largely textual, every screen also incorporates attractive and relevant artwork
(with color for Mac II users). The art director for the program was Amanda Goodenough, creator of the
Inigo series of interactive children's stories in HyperCard. All input to the game is by mouse: scroll
bars set tax values and subsidy rates.

Some minor aspects of the game violate good Macintosh interface design: for example, there is no "Undo"
or "Cancel" available when adjusting a tax or subsidy; if you alter a value but immediately have second
thoughts and want to leave it unchanged, you can only approximate in resetting the previous value.
Navigating in the hypertext is all done by pointing and clicking--except returning to the Policies or
Results screens, the most frequent actions in playing the game, which inconsistently require dragging
down a menu. There is no "New Game" menu item; you must quit and restart the program. These are
minor complaints--the game plays very smoothly overall.

How Faithful Is the Simulation to Reality?

"Balance of the Planet" addresses a range of extremely complex problems, from the chemical
interaction of gases in the atmosphere to the incidence of cancer as a function of ultraviolet exposure.
The causal relationships in many of these areas are still not fully understood, and for those we do
understand, the math can be hairy. In the witty and educational manual that accompanies "Balance of the
Planet," Crawford admits that many of the approximations incorporated are only educated guesses. Also,
the program must run on a personal computer and have a finite response time. Numerical integration of
second-order differential equations is therefore out of the question.

Having studied geophysics (including atmospheric modeling) at the graduate level, I can say with some
confidence that Crawford has done an admirable job of reducing a set of complicated phenomena to a
tractable system of (mostly) linear equations that give a surprisingly rigorous first-order
representation of reality. The simulation is, after all, an educational game, not a scientific instrument.
It's fun and educational; the numbers do not need to be better than approximately correct. And, since the
proportionality constants in the equations can be changed, users with advanced degrees in any of the
relevant fields may tinker with the equations to their own satisfaction.

What about the political and social assumptions incorporated into the native state of the simulation--the
implicit bias of the program before any of the explicit bias files are loaded? It appears that, as a
futurist and environmentalist, Crawford is a moderate. The "native" bias doesn't incorporate the rabid
"Earth First!" mentality or the industrial phobias of the Greens. Neither does Crawford seem to be a
techno-optimist of Peter Vajk's "Doomsday Has Been Cancelled" variety: "technological optimism," a
variable in the equations, is limited to a fairly low level (relative to my own prejudices). This
moderate course will naturally get Crawford in trouble with extremists of both types.

For example, Crawford is likely to take a great deal of flak from the anti-nuclear lobby for the mild
effects attributed to nuclear power--the winning strategy I discovered (playing the "native" bias)
relies heavily on nuclear energy to uphold the world economy and the lifestyle of the industrialized
world, and on untaxed pesticide use to raise more food and to combat starvation in the developing world.
This strategy wins with a score of over 11,800 points (when combined with other appropriate taxing
strategies and the proper subsidy distribution--I don't intend to make things that easy for you). The
negative points that result from radioactive waste and nuclear-plant accidents are far outweighed by
the positive results of banning fossil fuels: fewer lung-disease deaths, reduced global warming, and a
more sustainable energy economy. I don't believe that Crawford's assumptions are in error--he has
obviously researched the simulation quite thoroughly, and the manual features a rich bibliography.

The Big Picture

What is the significance of "Balance of the Planet?" It is indisputably of great educational value--
despite the mathematical concerns discussed above, the cause-and-effect relationships explored are
qualitatively correct and not, at present, widely known. The abundant text provided, both in the
hypertext of the simulation and in the entertaining documentation, contains much that is surprising and
appalling about the current and potential future state of the world. It epitomizes the good that computers
can do as aids to education and awareness, and as tools to present complex relationships clearly.

Not incidentally, it's fun. The post of "High Commissioner of the Environment" is the closest most of us
will ever come to absolute power. It's gratifying to see the starvation curve level off and drop as a
result of your policies. It's fascinating to see a variable take a sudden, unexpected plunge or rise, and to
then track down the one crucial relationship you overlooked. It is thrilling to see the effect that even
one percent of your budget can have when used to subsidize, say, research to improve wood-stove
technology.

It is also sobering. Exactly such decisions are being made today--consciously and unconsciously;
individually, by those with power, and collectively, by the rest of us. Without an enormous
mobilization of resources, the quality of life and the environment threaten to decline severely in the
near future. The trends that threaten the environment today have been building for decades, and decades
of purposeful action will likely be required to reverse them. The longer we delay before taking the
necessary steps, the more terrible the situation will grow for our children and grandchildren before it
gets better--if it will ever get better.

I found "Balance of the Planet" to be quite provocative. Several aspects of the simulation are inherently
controversial, and can provoke heated and impassioned discussion about environmental issues and
policy. Scoring the player's performance requires the simulation to quantify the value of a human life.
How many acres of rainforest is a human life worth? How many barrels of oil? How many species of
ocean life? Addressing the questions raised by "Balance of the Planet" requires a close examination of
one's own values and one's wishes for the future.

Is the program likely to goad people to action? This question is problematic. Because of the effort to
avoid bias in the simulation, it is easy to re-program "Balance of the Planet" to tell you whatever you
want to hear. According to the "Industrialist Bias" provided, things are great and getting better--so why
worry? Or, if you believe that the keys to saving the Earth are solar power and recycling, you can
adjust the equations to make that a winning strategy. In short, the program is a weather vane that can
be made to point in the direction the wind is already blowing you.

I think impartial players will find Crawford's "native bias" compelling. Our choices really are few:
there are a limited number of energy sources available; there is a runaway population problem; there
are specific substances that are dangerous to have running around loose in our atmosphere. The sad
aspect is that there will, of course, be no "High Commissioner of the Environment" appointed; that the
trusty, rusty wheels of "public will" and a hundred national governments are the only mechanisms in
place with a chance of successfully combating the destruction of the biosphere. Let us hope that "Balance
of the Planet" reaches enough people to set those mechanisms moving before it's too late.

David Krieger is a recent graduate of the master's degree program in information science at UCLA's
School of Library and Information Science. He has been employed as the scientific and technical
consultant to the television series "Star Trek: The Next Generation."

Washington Update

One of the most significant computer security debates of the past decade is now coming to a close and the
outcome should bode well for the future of open computer networking. In 1984 President Reagan issued
a National Security Decision Directive, NSDD-145, that transferred authority for computer security
from a civilian agency, then the National Bureau of Standards, to the National Security Agency. A
subsequent memorandum from John Poindexter expanded the scope of NSDD-145 and created a new
justification for government secrecy--the protection of "sensitive but unclassified information." Civil
libertarians, library groups, the private sector, and CPSR protested the move, fearing that excessive
secrecy would diminish public access to government information. In 1987 Congress passed the
Computer Security Act to reestablish computer security authority at the National Institute of
Standards and Technology. Now, in July 1990, President Bush has revised NSDD-145 to comply with
the Computer Security Act. Though the revised document itself remains classified--an unnecessary
affront to the public's right to know--the Administration has made clear its intent to follow the spirit
of the law. Computer security policy should be made in the bright light of day and not by an agency
whose existence was not even acknowledged until a decade after its creation and which operates outside
the normal channels of public accountability.

Caller ID--Debate continues in the nation's capital on Caller ID, the controversial new service that
discloses the originating phone number of incoming telephone calls. Senator Kohl (D-WI) has
introduced legislation, similar to a bill passed last year in California, that would require phone
companies to provide a "blocking" option so that callers could choose whether to disclose their phone
numbers. Most states are requiring at least per call blocking, with Pennsylvania actually prohibiting
the service as a violation of the state constitutional right to privacy. In the District of Columbia, the
Public Service Commission narrowly allowed the offering of the service, with one of the three
commissioners arguing forcefully that the service undermines the public interest. Meanwhile U.S.
West is pursuing an innovative implementation of Caller ID that transfers a name along with the
number. If the name alone were transferred, many of the privacy problems arising from the exchange
of digital phone numbers--which provide the key for telemarketing databases--could be solved.

Data Protection Hearing--CPSR testified at a Senate hearing in support of legislation that would modify
the 1986 Computer Fraud and Abuse Act. CPSR expressed support for a reckless misdemeanor
provision. At the same time, CPSR raised questions about Operation Sun Devil and the long-term impact
of computer crime investigations on civil liberties safeguards. The bill soon goes to the full Senate for
consideration.

FOIA Request--CPSR has filed a lawsuit in Federal district court to compel the production of records
held by the FBI regarding the monitoring and surveillance of computer bulletin boards. The case is
CPSR v. FBI, 90-2096. Meanwhile, a letter from the Secret Service to Congressman Charles Schumer
indicates that the Secret Service has been monitoring bulletin boards.

Harris Privacy Poll Released--A major study on privacy was released earlier this summer. The poll
was commissioned by the Equifax Corporation and prepared by Alan Westin, a widely known privacy
scholar. The Harris Poll has been a bellwether of public attitudes about computers and privacy since
it was first conducted in 1978. Among key findings in this most recent survey: growing concern about
the misuse of personal information, awareness among corporate executives about privacy problems,
and opposition to the Caller ID service. On the relation between computers and privacy, Americans are
generally more sophisticated about privacy problems today, focusing on the use of the data and not the
existence of the computer. Finally, the survey found that a majority of Americans think the present
system of privacy protection is inadequate and that a privacy commission should be established.

Big Brother At Last?--The government of Thailand has established a womb-to-tomb centralized
database to keep track of its 55 million citizens. The system will track voting patterns, domestic and
foreign travel, and social welfare. Robert Ellis Smith, the editor of the Privacy Journal, asks whether
it was appropriate that this system receive a ComputerWorld Smithsonian Award for innovative
information technology. It's a good question. Back at home, debate continues on the proposed national ID
card.

--Marc Rotenberg

CPSR Chapters

September 1990

CPSR/Acadiana, LA

Jim Grant
Center for Advanced Computer Studies
University of Southwestern Louisiana
P.O. Box S/B 40103
Lafayette, LA 70504
(318) 231-5647

CPSR/Austin

Ivan M. Milman
4810 Placid Place
Austin, TX 78713
(512) 823-1588 (work)

CPSR/Berkeley

Steve Adams
3026 Shattuck, Apt. C
Berkeley, CA 94705
(415) 845-3540 (home)

CPSR/Boston

Tom Thornton
2 Newland Road
Arlington, MA 02174
(617) 643-7102 (home)

CPSR/Chicago

Don Goldhamer
528 S. Humphrey
Oak Park, IL 60304
(312) 702-7166 (work)

CPSR/Denver-Boulder

Randy Bloomfield
4222 Corriente Place
Boulder, CO 80301
(303) 938-8031 (home)

CPSR/Los Angeles

Rodney Hoffman
P.O. Box 66038
Los Angeles, CA 90066
(213) 932-1913 (home)

CPSR/Madison

Deborah Servi
128 S. Hancock St., #2
Madison, WI 53703
(608) 257-9253 (home)

CPSR/Maine

Betty Van Wyck
Adams Street
Peaks Island, ME 04108
(207) 766-2959 (home)

CPSR/Milwaukee

Sean Samis
6719 W. Moltke St.
Milwaukee, WI 53210
(414) 963-2132 (home)

CPSR/Minnesota

David J. Pogoff
6512 Belmore Lane
Edina, MN 55343-2062
(612) 933-6431 (home)

CPSR/New Haven
Larry Wright
1 Brook Hill Road
Hamden, CT 06514
(203) 248-7664 (home)

CPSR/New York

Michael Merritt
294 McMane Avenue
Berkeley Heights, NJ 07922
(201) 582-5334 (work)
(201) 464-8870 (home)

CPSR/Palo Alto

Clifford Johnson
Polya Hall
Stanford University
Stanford, CA 94305
(415) 723-0167 (work)

CPSR/Philadelphia

Lou Paul
314 N. 37th Street
Philadelphia, PA 19104
(215) 898-1592 (work)

CPSR/Pittsburgh

Benjamin Pierce
School of Computer Science
Carnegie Mellon University
Pittsburgh, PA 15213
(412) 268-3062 (work)
(412) 361-3155 (home)

CPSR/Portland

Bob Wilcox
CPSR/Portland
P.O. Box 4332
Portland, OR 97208-4332
(503) 246-1540 (home)

CPSR/San Diego

John Michael McInerny
4053 Tennyson Street
San Diego, CA 92107
(619) 534-1783 (work)
(619) 224-7441 (home)

CPSR/Santa Cruz

Alan Schlenger
419 Rigg Street
Santa Cruz, CA 95060
(408) 425-1305 (home)

CPSR/Seattle

Doug Schuler
CPSR/Seattle
P.O. Box 85481
Seattle, WA 98105
(206) 865-3226 (work)

CPSR/Washington, D.C.

David Girard
2720 Wisconsin Ave., N.W., Apt. 201
Washington, D.C. 20007
(202) 967-6220 (home)

CPSR, P.O. Box 717, Palo Alto, CA 94301 (415) 322-3778

CPSR Foreign Contacts

September 1990

CPSR is in regular communication with the following individuals and organizations concerned with the
social implications of computing.

Canada

Ottawa
Initiative for the Peaceful Use of Technology (INPUT)
Box 248, Station B
Ottawa, Ontario K1P 6C4

Toronto
Dr. Calvin Gotlieb
Department of Computer Science
University of Toronto
Toronto, Ontario M5S 1A4

Vancouver
Richard S. Rosenberg
Department of Computer Science
University of British Columbia
6356 Agricultural Road
Vancouver, British Columbia V6T 1W5

Australia

Australians for Social Responsibility in Computing (ASRC)

Sydney
Graham Wrightson
Department of Computer Science
Newcastle University
Newcastle, NSW 2308

New Zealand

Computer People for the Prevention of Nuclear War (CPPNW)
P.O. Box 2
Lincoln College
Canterbury

Finland

Pekka Orponen
Department of Computer Science
University of Helsinki
Tukholmankatu 2
SF-00250 Helsinki

Great Britain

Computing and Social Responsibility (CSR)

Edinburgh
Jane Hesketh
3 Buccleuch Terrace
Edinburgh EH8 9NB,
Scotland

Glasgow
Philip Wadler
Department of Computer Science
University of Glasgow
Glasgow G12 8QQ
Scotland

Lancaster
Gordon Blair
University of Lancaster
Department of Computer Science
Bailrigg, Lancaster LA1 4YN

Sussex
Mike Sharples
University of Sussex
School of Cognitive Sciences
Falmer
Brighton, BN1 9QN

West Germany

FIFF, c/o Helga Genrich
Im Spicher Garten 3
5330 Koenigswinter 21
Federal Republic of Germany

Italy

Informatici per la Responsibilita Sociale (IRS-USPID)
Dr. Luca Simoncini
Istituto di Elaborazione dell' Informazione CNR
Via Santa Maria 46
I-56100 Pisa

Ivory Coast

Dominique Debois
Centre d'Information et d'Initiative sur l'Informatique (CIII)
08 BP 135
Abidjan 08
Cote d'Ivoire, West Africa

South Africa

Philip Machanick
Computer Science Department
University of Witwatersrand
Johannesburg,
2050 Wits,
South Africa

Spain

Dr. Ramon Lopez de Mantaras
Center of Advanced Studies C.S.I.C.
17300 Blanes, Girona

Thailand

David Leon
c/o The Population Council
P.O. Box 1213
Bangkok 10112

CPSR, P.O. Box 717, Palo Alto, CA 94301 (415) 322-3778

CPSR Educational Materials

The following materials may be ordered from the CPSR National Office. All orders must be prepaid.
Please include your name and address for shipping.

Back issues of The CPSR Newsletter are available for $2 each. Some issues are available in photocopy
only.

Articles and papers

_ ANNOTATED BIBLIOGRAPHY ON COMPUTER RELIABILITY AND NUCLEAR WAR. Compiled by Alan
Borning (16 pages - updated October 1984 - $2.00)

_ COMPUTER SYSTEMS RELIABILITY AND NUCLEAR WAR. Alan Borning (20 pages--February 1987 -
$2.00)

_ COMPUTER UNRELIABILITY AND NUCLEAR WAR. CPSR/Madison (11 pages - June 1984 - $2.00)

_ THE RESPONSIBLE USE OF COMPUTERS: WHERE DO WE DRAW THE LINE? Christiane Floyd (4 pages -
June 1985 - $1.00)

_ THE "STAR WARS" DEFENSE WON'T COMPUTE. Jonathan Jacky (reprinted from The Atlantic, 6 pages
- June 1985 - $1.00)

_ THE STAR WARS COMPUTER SYSTEM. Greg Nelson and David Redell (10 pages - June 1985 - $1.00)

_ LIST OF ILLUSTRATIVE RISKS TO THE PUBLIC IN THE USE OF COMPUTER SYSTEMS AND RELATED
TECHNOLOGY. Compiled by Peter G. Neumann (9 pages - August 1987 - $1.00)

_ DEADLY BLOOPERS. Severo M. Ornstein (16 pages - June 1986 - $2.00)

_ LOOSE COUPLING: DOES IT MAKE THE SDI SOFTWARE TRUSTWORTHY? Severo M. Ornstein (4 pages -
October 1986 - $1.00)

_ RELIABILITY AND RESPONSIBILITY. Severo M. Ornstein and Lucy A. Suchman (reprinted from Abacus,
6 pages - Fall 1985 - $1.00)

_ STRATEGIC COMPUTING: AN ASSESSMENT. Severo M. Ornstein, Brian C. Smith, and Lucy A. Suchman
(4 pages - June 1984 - $1.00)

_ SOFTWARE AND SDI: WHY COMMUNICATION SYSTEMS ARE NOT LIKE SDI. David L. Parnas (Senate
testimony, 2 documents, 7 pages - December 1985 - $1.00)

_ WHY SOFTWARE IS UNRELIABLE. David L. Parnas (8 memoranda, 17 pages - June 1985 - $2.00)

_ PRIVACY IN THE COMPUTER AGE. Ronni Rosenberg (24 pages - October 1986 - $3.00)

_ SELECTED AND ANNOTATED BIBLIOGRAPHY ON COMPUTERS AND PRIVACY. Ronni Rosenberg (7 pages -
September 1986 - $1.00)

_ THE LIMITS OF CORRECTNESS. Brian Cantwell Smith (21 pages - June 1985 - $3.00)

_ ETHICAL QUESTIONS AND MILITARY DOMINANCE IN NEXT GENERATION COMPUTING. Paul Smolensky
(6 pages - October 1984 - $1.00)

_ STRATEGIC COMPUTING RESEARCH AND THE UNIVERSITIES. Terry Winograd (28 pages - March 1987 -
$3.00)

_ THE CONSTITUTION vs. THE ARMS RACE. Clifford Johnson (8 pages - December 1986 - $1.00)

_ THE NATIONAL CRIME INFORMATION CENTER: A CASE STUDY. Mary Karen Dahl (4 pages - March
1988 - $1.00)

_ "SENSITIVE," NOT "SECRET": A CASE STUDY. Mary Karen Dahl (4 pages - January 1988 - $1.00)

_ THINKING ABOUT "AUTONOMOUS" WEAPONS. Gary Chapman (4 pages - October 1987 - $1.00)

_ COMPUTERS IN THE WORKPLACE: A WORKING BIBLIOGRAPHY. CPSR Workplace Project (19 pages -
last updated March 1990 - $2.00)

Videotapes and Slide Show

Except where noted, CA residents add sales tax

Loan and rental is for one month, except by pre-arrangement

_ Reliability and Risk: Computers and Nuclear War. An award-winning half-hour documentary on
accidental nuclear war, the reliability of computers in critical settings, and the computer aspects of
the SDI. November 1986. [Slide show rental: $75. Videotape rental: $25. Videotape purchase, Beta or
VHS: $35, U-matic: $50. Shipping and handling: $7.00.]

_"SDI: Is the Software Feasible?" Seminar sponsored by the Library of Congress for Congressional staff
members. 1 hour, April 1986. Features Danny Cohen (SDIO) and Dave Redell (CPSR) presenting
opposing views. Includes questions from the audience. [Available as a loan only. Shipping and handling:
$7.00, no sales tax]

_"To Err..." WHA Madison Public Television presentation on computer failure. Features several
members of CPSR/Madison, 15 minutes, May 1985. [Available as a loan only. Shipping and handling:
$7.00, no sales tax]

_ MIT debate on the feasibility of the SDI. Co-Sponsored by the MIT Laboratory for Computer Science
and CPSR/Boston, approx. 2 1/2 hours, October 1985. Moderator: Mike Dertouzos (head of LCS); pro-
SDI: Danny Cohen and Chuck Seitz (SDIO); con-SDI: David Parnas (University of Victoria) and Joseph
Weizenbaum (MIT). [Rental: $50] Transcript also available (please call).

Books

CA residents please add sales tax

Please add $3 for postage and handling.

_ COMPUTERS IN BATTLE: Will They Work? Edited by David Bellin and Gary Chapman, Harcourt Brace
Jovanovich, 224 pages, 1987. An anthology of perspectives on the role of computers in the arms race,
written and edited by CPSR members. Available in bookstores or from the National Office. Cost: $14.95.

_ EMPTY PROMISE: THE GROWING CASE AGAINST STAR WARS. Union of Concerned Scientists, John
Tirman, editor, Beacon Press, 230 pages, 1986. Features chapters by eight authors. Cost: $7.95.

_ THE SACHERTORTE ALGORITHM and Other Antidotes to Computer Anxiety. John Shore, Viking Press,
256 pp., 1985. Cost: $7.95

CPSR, P.O. Box 717, Palo Alto, CA 94301 (415) 322-3778


How Balance of the Planet Was Created
Chris Crawford

When I was first asked to create a game about environmental issues I refused, thinking that
environmental issues are simply not appropriate material for a computer game. In the first place,
there is no intrinsic conflict between active agents in environmental problems; such conflict is
necessary for a good game, just as with a story. Second, environmental problems aren't intrinsically
fun; you'll never see a Hollywood comedy with Dudley Moore and Carol Burnett dancing in the acid rain.
Third, there is no "key element," no central factor to which all environmental issues can be reduced.
Despite all attempts at simplification, environmental issues refuse to be boiled down to some single,
central problem. They remain stubbornly diffuse, with hundreds of interrelated factors, none of which
can be ignored.

It was this third problem that bothered me the most, and it ultimately provided the solution. I decided to
make the diffuse nature of environmental problems the fundamental structure of my game. The game
would be a "hypertexty" representation of a great many environmental factors, all connected by cause
and effect. Thus, coal use is a factor that affects such things as sulfur dioxide in the air, nitrogen
oxides in the air, strip mining, and carbon dioxide, while it in turn is affected by the supply of coal,
energy demand, and the price of coal.

This basic structure is very flexible. Almost every environmental problem could be expressed as a
factor in the hypertexty network. Some had to be broken down into multiple subfactors, and a few
clumsy intermediate factors had to be fabricated.

Perhaps the most striking feature of the game is the fact that it presents the equations used in the
simulation to the player. A single menu item brings up the equation used, the values of the terms in the
equation, and scroll bars to adjust the coefficients.

Presenting the equations imposed severe constraints on the design. First, it precluded any advanced
functions or operators. For the most part, I had to make do with the four arithmetic functions. I used a
few square root functions and a few logs, but that was the extent of my higher math. I didn't want to
disenfranchise too much of my audience.

Another constraint arising from the presentation of the equations was the requirement that each
equation fit on a single line on the screen; I didn't want to befuddle my audience with multi-line
formulae. This became a severe constraint when coupled with my insistence that each variable name be
spelled out in full.

Despite these constraints, the presentation of the equations added a great deal to the game, for it brings
the player right into the guts of the simulation. Its greatest value arises from its neat handling of a
problem that bedevils any policy simulation designer: the fact that the designer's biases are
unavoidably insinuated into the design. By presenting the equations, we invite the player to examine
those biases directly. More important, we permit the player to modify the biases in two ways. First,
the player can load a "bias file," a set of numbers that reflect a consistent point of view. The game
offers four such bias files: pro-nuclear, environmentalist, industrialist, and Third World.

The player can also declare his own biases by changing the coefficients in the equations. This can reflect
his own technical judgments, as with the safety of nuclear power plants, or it can be much more
personal. The point system used in the game is itself subject to user modification. This permits a player
to determine how many points he should lose for each death attributable to, say, pollution-induced lung
cancer. He can also set the point loss for each species made extinct; this brings into stark contrast the
value of a human life compared with the value of biodiversity. An even more troubling challenge is
provided by the point assignment for each starvation-related death--inasmuch as starvation's main
victims are poor, foreign, and dark-skinned, the simulation asks whether their lives are worth as
much.
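
A user-adjustable point system of the kind described can be sketched as a table of weights that a "bias
file" overrides. The category names and weights below are placeholders, not the game's actual values.

    # A sketch of a user-adjustable scoring scheme like the one described above.
    default_weights = {
        "lung_cancer_death": -10.0,
        "starvation_death":  -10.0,
        "species_extinct":   -50.0,
        "sustainable_energy": +5.0,
    }

    def score(outcomes, weights=default_weights):
        return sum(weights[name] * count for name, count in outcomes.items())

    outcomes = {"lung_cancer_death": 100, "starvation_death": 500,
                "species_extinct": 20, "sustainable_energy": 300}

    print(score(outcomes))                                         # with the default weights
    industrialist = dict(default_weights, starvation_death=-0.01)  # a "bias file" override
    print(score(outcomes, industrialist))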

These techniques yield a game that honestly addresses some serious environmental issues. I was able to
preserve both the intellectual integrity of the product and its commercial viability. The game has not
been a smash hit; the marketplace's response to it has been, so far, lukewarm. An audience accustomed
to fast-paced action, loud graphics, and gratuitous violence has trouble accepting a product whose
primary asset is the moral challenges it offers. But we are finding that a goodly number of people who
don't normally play computer games are very pleased with this one. That's good enough for me.

Chris Crawford is an independent game designer whose previous products include the game/simulation
"Balance of Power, " which was reviewed in the Summer 1989 issue of The CPSR Newsletter.


EARTHQUEST by Bob Stevens and David Smith
Software Review by Robert E. Beuthel

EARTHQUEST TM is a new Macintosh HyperCard information explorer program, created to help prepare
young people to live in a global world.

It looks at our planet in four major categories. The first of these is the physical earth itself, focusing
upon the planet and its systems of land, water, air and life. Next is the human journey, its timeline,
history, nations, progress, and creativity. Third is the environment and the ecosystem, and last, a
world tour through the six continents of North America, South America, Europe, Africa, Asia and
Oceania.

Each category is comprised of many subcategories and related topics. From the startup Explore card,
getting into the program is accomplished by clicking on an icon from one of the four categories.
Travelling within the program one can go directly to a topic through an index button at the bottom of
each card, or by using the navigation corner (upper right hand portion of screen) go to the Explore
card or move to similar cards using the arrows. Help, which is available everywhere in the program,
explains how to move around in the program and provides a ready reference for various unique features
such as movies, quotes, sounds and labels. I found it took me some time to get the knack of moving
through the program, but that was not a problem as, at every turn, I discovered new and informative
material.

The "navigator icon" on most world map screens is an excellent example of the type of extraordinary
feature which captures one's interest while providing information. Clicking on this icon provides a
window in the upper right hand corner showing the longitude and latitude of an airplane which can be
moved to any location on the map as well as the approximate distance from a predetermined "home" city
as well as the meridian time.

Many screens have "movies." In the Water Cycle screen, one can see the moisture forming over the bay,
turning to clouds which gather over the mountains producing lightning and rain, then ending in flowing
water in the sink and shower. In other situations the movie reinforces a concept such as erosion, an
exploding volcano or how Newton discovered gravity. The 20 movies are not always easily found,
making their discovery both exciting and challenging.

At times, I was frustrated when I would click on an object and nothing happened, but I found that for
those who are curious or who don't want to miss anything, one can see the buttons or "hot spots" by
depressing the Option and Control keys simultaneously.

I found EARTHQUEST TM to be more than a database of information. It is a creativity tool promoting
interaction between the user and the program. The program's graphs, quotation list, maps and graphics
can be pasted into reports, overhead transparencies, or term papers. It invites users to make their own
additions. In the beta version I tested, there were problems in creating one's own stack to add related
data, but I understand that this problem is being corrected. Several teachers who have seen the program
have called it "mind boggling" because of the wealth of information it contains and its use of
HyperCard features.

What I found most exciting were the possibilities to extend and expand the basic program. The
developer's prototypes of history and living systems are especially exciting as a vision of what full use
of EARTHQUEST TM could mean for education. This program's potential is limited only by one's
imagination, the accessibility of a hard disk to fully utilize all the features of a 3-plus megabyte
program and the additional memory needed to build one's own related cards. It can be used as a tool to
foster discussions about the Earth and its environment, as a base of information for research, and as a
fun way to explore the many interrelated dimensions of our planet.

Bob Beuthel is Superintendent of Schools in Burlingame, California.

The Ecodisc CD-ROM
Software Review by David Riddle

Ecodisc is a HyperCard-driven simulation of a nature reserve, the Slapton Ley Reserve in England's
South Devon. It is the user's role here either to simply study the Reserve and its flora and fauna and
possibly carry out standard biological sampling procedures, or to take the place of a Trainee Reserve
Manager and to formulate a plan for the future of the reserve based upon the information gleaned from a
study of the location and the views of a range of local interest groups and individuals.

Description

Ecodisc has several sections: Walk, Future, Sample, Expert, and Plan. There is also an Introduction
sequence.

Arranged at the left of each card throughout the HyperCard stack are Knapsack, Explain and Help (?)
buttons. Knapsack enables users to construct a collection of the Ecodisc material in their own way, such
that simple desktop presentations can be created to use in the individual's own project work. Explain
provides "inside" information on the simulation at that precise point (i.e., it is card-specific)
including links to additional textual and graphic references and to a Tutor facility. Help accesses an on-
line Help stack that provides basic information on HyperCard use and stack navigation.

The Introduction provides a brief description of what the user can do with Ecodisc. It consists of a series
of stills. A sound commentary is also provided as the sequence of stills is advanced. Control of this
sequence is by means of simulated tape recorder-style play, single frame advance, single frame
reverse and restart sequence buttons.

Walk

Walk allows the user to select a location from an annotated map of the Reserve with clickable "View"
points (marked as "radio" buttons) and then to look in eight different directions from those points, in
summer and in winter.

Each point on the map also has a picture taken vertically downwards from a height of 1,000 meters by
a camera mounted in a helicopter. In addition there are sound effects appropriate to the particular View
point selected, e.g., cars, helicopter rotors, birds singing, people, or silence.

Expert

Expert allows the user to hear about the studies of experts in various fields associated with the flora
and fauna of the Reserve. These include the woodland manager, an ornithologist, a mammal expert, an
underwater scientist and an expert on the ecology of reed beds.

Future

Future allows the user to investigate the projected species population trends in five-year increments
for up to fifty years into the future for each of the Ley areas covered by the Ecodisc package. Species
numbers are presented as histograms, and the eleven resulting graphs can either be "run" forward,
backward or stepped through in sequence under the control of more tape recorder-style buttons.

Sample

The Sample facility allows users to simulate real world field-study exercises of identifying and
counting in five areas of the reserve: woods, reeds, fish, birds, and mammals. For example, on the Fish
sample card a reserve map is shown and netting points are chosen by clicking in the Ley area. As many
as five netting sites can be set, or just one. Clicking on the net site again reveals the "catch." The fish
are represented by a fixed size, but randomized, graphic. By clicking on each fish in the "net," a count
is established in a table. Duplication is not prevented. The netted fish are then "spray-marked" and
returned. On re-netting, a proportion of marked fish appear amongst the unmarked. Both are click-
counted, and a model in the system then estimates the total population based on these results.
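
The fish exercise described above is a mark-and-recapture study. Ecodisc's own population model is not
spelled out here, but the classic Lincoln-Petersen estimate that such data supports is simple to state;
the catch figures below are invented for the example.

    # Mark-recapture estimate of total population from the kind of data the
    # Fish sample card collects (Lincoln-Petersen method).
    def lincoln_petersen(marked_first_catch, total_second_catch, marked_in_second_catch):
        # N is approximately (M * C) / R, assuming marked fish mix evenly
        # and are recaptured at the same rate as unmarked fish.
        return marked_first_catch * total_second_catch / marked_in_second_catch

    # Example: 40 fish netted and marked, 50 netted later, 8 of them marked.
    print(lincoln_petersen(40, 50, 8))    # estimated population of about 250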

Plan

Clicking the Plan option on the Main Menu takes the user to the Reserve Manager's office which contains
a computer (for looking at the various population trends with the environment as it is "now," and as a
result of submitting a plan for the Reserve), a TV, a newspaper and an in-tray full of letters.

Initially, clicking on the TV display provides spoken instructions on what the user has to do as a Trainee
Manager. There is also a button here that can take the user to the Decisions Card.

Clicking the newspaper initially reveals a randomized selection of local newspaper clippings
expressing the views of various interest groups such as the local sailing club, the fishermen, tourists,
reed cutters, forestry owners, and so on. The in-tray, similarly, provides a selection of letters from
individuals attempting to influence the manager's decision as to the future for the reserve.

Clicking the Make Plan button allows the user to make a plan for the future of the reserve.

After making a plan and returning to the Manager's office, clicking the TV display provides a statement
of the consequences of the user's plan. Another look at the computer at this point will provide a modified
projection of the reserve's future over the next 50 years based on the plan. Similarly, a fresh look at
the newspaper and the in-tray will provide the revised (and not always favorable) comments from
those local interested parties. The user may modify decisions as a result. There is no correct answer!

Other Features of Ecodisc

A section of the disc has been allocated to public information articles of a supporting nature such as the
Countryside Code, NCC Pamphlets, studies in biology, details of sampling methods, etc. Links to this
additional material have been made on the basis of the content relevance to certain cards. There is also a
useful general biological glossary that includes most of the terms encountered in the reference material
and elsewhere on Ecodisc.

To aid the use of Ecodisc CD-ROM by teachers who are not subject specialists in biology, some content-
specific questions have been posed in the system to facilitate an element of self-learning. For example:
Here is an algal "bloom." What has caused this? What can be done about it? Suggestions are also
provided as to directions in which tutors might look for the answers to such questions.

One of the key aspects of the CD-ROM is the fact that the spoken sequences and the main text fields are
offered in nine European languages: Belgian, Danish, Dutch, French, German, Italian, Norwegian,
Spanish, and Swedish. This opens up a large number of opportunities in the area of modern language
teaching and for English as a foreign language, with access to material of a more technical nature than
would normally be found in the average textbook.

In addition to the basic Ecodisc CD-ROM product, Apple (UK)'s Schools Marketing Department has
sponsored the creation of five sets of learning support materials for Ecodisc that come under the very
general headings of Community Politics, Biology, Journalism, Mathematics and Primary Education.
Educational authors from these fields have put together a vast range of ideas in the form of worksheets,
model-making activities, Pagemaker templates, HyperCard stacks, spreadsheet templates, and role-
playing exercises. The concept was to provide materials for practicing teachers to take and use, or
modify for their own specific situations. Not all the material relates directly to the precise content of
Ecodisc, but builds on the environment that is presented in the Nature Reserve, and starts to address
questions of 'what if?' and "wouldn't it have been nice if Ecodisc had...?" A complete set of these support
materials is shortly to become available either as part of, or as an optional extra to, the finished
commercial product that represents Ecodisc CD-ROM.

Although not perfect in its execution, as it stands, Ecodisc offers one of the better examples of the
possibilities opened up by our brave new world of computer-controlled multimedia. What we have here
is a credit to all involved, and should provide plenty of ideas with respect to both what one should, and
should not, do in multimedia productions.

The Ecodisc CD-ROM is available in the U.S. from EDUCORP, 531 Stevens Ave., Suite B, Solana Beach, CA
92075; telephone: (619) 259-0255. It is available in the U.K. from ESM, Duke Street, Wisbech,
Cambs PE13 2AE; telephone: 011-945-63441. A demonstration version of Ecodisc has recently
appeared on one of the CD-ROMs associated with the Apple Encyclopedia of Multimedia.

David Riddle is the software and hardware manager of the Educational Computing Unit at King's College
in London. He is also the Apple University Consortium Coordinator for the United Kingdom.

More Software on the Environment

As well as the software reviewed in this issue of the newsletter, there are other software packages on
environmental issues either under development or already available. Among these packages are:

The Environment Series--Interactive tutorials and simulations on air pollution, water scarcity,
waste, endangered species, energy efficiency, global warming. For Apple IIs, Macintosh, IBM PC.
Contact Queue Inc., 338 Commerce Drive, Fairfield CT 06430.

Global Warming--A HyperCard stack for Apple Macintosh about the effects of global warming. Available
on disk from EcoNet, 3228 Sacramento Street, San Francisco CA 94115.

SimEarth: The Gaia Concept--By Will Wright, author of SimCity. Based on the idea (the Gaia concept)
that the earth is a self-regulating organism. User has to create and nurture an intelligent species
capable of seeding other planets.

Many computer-aided environmental education programs are described and evaluated in a monograph
from the North American Association for Environmental Education. Articles describe computer-aided
field studies, hypermedia, telecommunications, software, videodiscs, and electronic field trips. The
monograph is edited by W. J. "Rocky" Rohwedder, Associate Professor of Environmental Studies and
Planning at Sonoma State University. Contact North American Association for Environmental Education,
P.O. Box 400, Troy OH 45373; telephone: (513) 698 6493.

Ecological and Environmental Risk Analysis

Using Computers to Estimate Impacts and Uncertainties
Scott Ferson

Several government agencies (including the Environmental Protection Agency, the U.S. Fish and
Wildlife Service and many other federal as well as state agencies) find they have to make estimates
about the impact that human activities will have on the ecological fabric of the world. The need for these
estimates is born of a concern for the health of humans or arises out of stewardship for the natural
environment and its biota. Of course industry must also make these estimates since it is often industry
whose behavior is regulated (and whose costs are shifted) because of decisions based on these estimates.
This de facto adversarial system has been developing since the National Environmental Policy Act of
1969, but it has resulted, until the last few years, in surprisingly little methodological progress on
techniques that predict the risks and uncertainties of environmental impacts.

Ecological and environmental risk analyses are loosely organized disciplines involving studies that
range through the fields of hydrology, geology, air quality, toxicology, epidemiology, public health,
demography and ecology. "Environmental" impacts are adverse consequences that people suffer from
introduced hazards in the environment. In this case concern is generally for public and individual
health: if people are sick or dying because of (or even just exposed to) some environmental hazard, it is
usually a matter of considerable excitement, even if the number of people affected is a very small
percentage of the population. "Ecological" impacts are adverse consequences on natural ecosystems and
non-human species. Concern is typically limited to the population-level effects on animals and
vegetation. The death of a few fish or birds is not considered significant until it becomes so widespread
as to endanger the vitality of the entire population. Despite these differences in focus, there is much
overlap in the kinds of questions that are asked in human and non-human studies because of the
interdisciplinary nature of the problem. For instance, toxicological data collected from dose-response
studies on individual animals can be used to predict effects on a population's mortality rates.
Conversely, epidemiological data can sometimes give clues about the avenue of exposure at the level of
an individual.

As with any developing science, the fields retain certain problematic notions and unsophisticated
methods. For instance, many researchers (Barnthouse and Suter 1986, inter alia) have criticized the
use of arbitrary quotients and "safe levels" that have little biological meaning, questionable statistical
characteristics, and no recognition of the variation in the insult or the natural processes that influence
it. Also widely recognized is the problem of analytical dependence on overly simple models of physical
and biological systems. Many of the natural processes that govern the likelihood and severity of an
impact are extremely complex and their representation is simply beyond the state of the art in
ecological modeling. Yet the estimation of impacts usually requires that some model of these processes
be used, so there is a danger that the resulting estimates will be misleading if the model is a poor image
of nature. Another problematic methodology involves the use of diagnostic species as representatives
for broad classes of organisms, and indicator species as sentinels of the health of an ecosystem.
Pollution sensitivity varies by orders of magnitude among species that are supposedly represented by
diagnostic species. There is no substantive ecological theory behind the choices of exemplar species nor,
in fact, behind the idea that indicator species can be used to diagnose the health of entire ecological
systems in the face of multiple insults.

In the section below we consider the hyperconservatism of risk assessments, which is perhaps the most
widely criticized methodological aspect of environmental risk analyses.

Worst Case Analysis

An often used approach to measuring an environmental or health impact is to estimate its upper bound
in what is called a "worst case" analysis. This is done by using conservative estimates at all steps in a
sequence of calculations leading to the prediction of impact. For instance, to assess the cancer rates due
to a toxic pollutant, one has to combine several intermediate estimates to yield a final evaluation. One
needs to know how much of the toxic substance will be released into the environment, how much will
find its way to people (via water or air transport or via bioaccumulation through the food chain), how frequent
and prolonged these exposures will be, and how sensitive humans are to the substance. Similarly
complex calculations are required to estimate adverse impacts of pollution on animals and vegetation,
involving for instance, how many organisms will be killed each year, how much the reproductive
capacity of those that remain will be reduced, and influences of other random environmental factors on
the population's dynamics. Of course, none of these estimates is certain. Variation in weather patterns,
for example, can influence how pollutants are transported. Accidental spills and even normal industrial
output can be hard to foretell. And, of course, the sensitivity of humans and natural organisms depends
on many factors such as age, genetic predisposition and the happenstances of exposure. Thus none of the
estimates used to predict the risk of an impact is certain. In a worst case analysis, the computations are
made using extreme values representing the worst circumstances likely to occur. Thus, one would use
an upper bound of estimates for how much pollution will be produced, high values for frequency and
duration of exposures, low estimates for any possible mitigating or compensatory effects, and toxicity
projections from the study showing the greatest effect.
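
A toy calculation makes the logic concrete. The sketch below (Python; every quantity and number is invented for illustration) chains upper-bound choices for release, exposure, frequency, and potency, exactly as a worst case analysis would, and compares the result with the same chain computed from central estimates.

    # Hypothetical estimates for each step of the risk calculation.
    # Each entry is (best estimate, conservative upper bound); units are notional.
    release_fraction = (0.01, 0.05)    # fraction of production released to the environment
    exposure_factor  = (0.10, 0.60)    # fraction of the release reaching people
    intake_per_year  = (2.0, 20.0)     # exposure events per person per year
    potency          = (1e-6, 4e-5)    # added cancer risk per exposure event

    def chained_risk(index):
        # index 0 = best estimates, index 1 = worst case values
        r = 1.0
        for low_high in (release_fraction, exposure_factor, intake_per_year, potency):
            r *= low_high[index]
        return r

    print("best estimate annual risk: %.2e" % chained_risk(0))
    print("worst case annual risk:    %.2e" % chained_risk(1))
    # With made-up numbers like these, the two can differ by orders of magnitude.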

At face value, the strategy of worst case analysis is a very sensible approach; if it is applied
consistently, it produces estimates that are conservative, that is, that are likely to be larger than what
the final impact will actually be. One can be fairly sure at least that the actual impact will not be any
more extreme than this value. If, on the other hand, we were to try to balance conservative and
optimistic choices in the cascade of computations, we would have no idea whether the final estimate was
high or low. The worst case approach has been and still is widely used. Many important studies (and
certainly the most dramatic studies) generated by regulatory agencies have been worst case analyses
based on laboratory organisms or diagnostic species that in essence argue about how bad an impact
might be in the natural environment.

Naturally, these arguments are often easily criticized as unrealistic. One regularly hears responses of
the form "one would have to have a hundred exposures a day for a hundred years to experience the
effect." This argument, while belittling, misses the point since the analysis was designed to be
conservative in this way. However, there are two serious criticisms of the approach. Finkel (1989,
1990) and Roberts (1990), among others, have pointed out that these assessments are fragile in the sense of
having a false precision. If the studies and estimation process were independently repeated, the
resulting assessment might be quite different. Furthermore, this worst case approach obviously uses
very little of the information that can be available to make its prediction. More moderate observations
and negative findings are simply ignored. This means that the predictions will be quite sensitive to any
new information that suggests a more extreme effect, even information from a study with a fairly small
sample size.

Just as serious a criticism of the worst case approach is that it is misleading. A projection that suggests
an impact is extremely serious when it is not creates undue alarm, diverts public and scientific
attention, wastes limited resources and weakens the credibility of the agency. It is difficult to make
rational decisions based on these analyses (cf. Zeckhauser and Viscusi 1990). What might be more
useful than a description of the worst scenario possible, would be a prediction of the scenario that is
most likely. Unfortunately the magnitude of the likely impact is not guaranteed to be closely related to
that of the worst case impact; they can often differ by a factor of several hundred. A "best estimate"
approach, in which the best statistical estimates (the means or modes) are used in the projection
instead of the most extreme possible values, would perhaps give a better understanding of the likely
impact, but a best estimate approach is still fragile because it does not recognize the intrinsic and
unremovable variation that characterizes the natural processes involved.

Probabilistic Projection

A straightforward solution is to use the entire statistical distributions of the variables involved in the
prediction calculations, rather than just their means or modes or extreme values. If enough
information is available about the distributions, this solution can correctly propagate measured
parameters along with their associated uncertainties to yield a statistical estimate of the impact. The
estimate is in the form of a frequency distribution of impact severity that in a statistical sense
specifies the likely magnitude of the impact as well as the variation that may be expected given the
uncertainty of the underlying processes. There are a few different methods that can be used for this
projection, most of which are computer-intensive techniques. These include direct procedures for
propagation of statistical error such as the delta method or probabilistic convolution as well as brute
force measures such as Monte Carlo replications using parameters selected randomly from the observed
statistical distributions. An approach embracing probabilistic projection offers a much more detailed
picture of what the future impact may be. It includes the worst case and best estimate projections as
well, although it couches them in their proper contexts as the end and middle of the full distribution.
The method can be further enriched by following a weight-of-evidence protocol that includes all
relevant data, among them studies showing no effect, weighting each appropriately by sample size or
some other measure of credibility. This approach has been dubbed "information analysis" by Sielken (1989)
since it attempts to use all the information available to make the best and fullest prediction possible.
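
A minimal Monte Carlo version of such a projection might look like the following Python sketch. The distributions and parameters are assumed purely for illustration (the same kind of chained quantities as in the worst case example above); the point is only the shape of the computation, not the particular numbers.

    import random
    random.seed(2)

    def sample_annual_risk():
        # Draw one plausible future from assumed (hypothetical) distributions.
        release  = random.lognormvariate(-4.6, 0.7)   # fraction released
        exposure = random.betavariate(2.0, 6.0)       # fraction reaching people
        events   = random.lognormvariate(1.0, 0.5)    # exposure events per year
        potency  = random.lognormvariate(-13.0, 1.0)  # risk per event
        return release * exposure * events * potency

    risks = sorted(sample_annual_risk() for _ in range(20000))

    def percentile(p):
        return risks[int(p * (len(risks) - 1))]

    # The full distribution contains the "best estimate" (the median) and the
    # conservative end (the upper percentiles) as special cases.
    print("median risk:          %.2e" % percentile(0.50))
    print("95th percentile risk: %.2e" % percentile(0.95))
    print("99th percentile risk: %.2e" % percentile(0.99))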

One needs to be careful about probabilistic projections, however. It is not the case that they will
necessarily yield true images of the central tendency of an impact and uncertainty around it. For
instance, if the route of exposure of a chemical influences its toxicity, then the estimates of exposure
and toxicity will not be independent and ordinary methods of probabilistic projection will be incorrect.
Temporal autocorrelation can also seriously impair the method's accuracy. In particular, it may be the
case that the projections can be overly optimistic and underestimate the severity of an impact.
Obviously this could be a more serious matter than being overly conservative. There are ways to deal
with autocorrelation and cross correlations among the variables in the computations, but to be used
effectively they typically require considerably more careful studies and a great deal more raw data
which is most often unavailable. It may make more sense to use a completely different formalism for
projection of uncertainty that does not have the same problems. Recently described possibility theory
(e.g., Dubois and Prade 1988) offers methods that are intermediate in conservatism between simple
probabilistic projection and worst case analysis.
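
To see why independence matters, consider the hedged sketch below: it samples exposure and toxicity either independently or with a shared positive dependence (again with made-up lognormal parameters), and compares the upper tail of the projected risk. When a real positive dependence is ignored, the high percentiles are understated.

    import math, random
    random.seed(3)

    def risk_percentile(n, correlated, p=0.95):
        out = []
        for _ in range(n):
            shared  = random.gauss(0.0, 1.0)
            e_noise = random.gauss(0.0, 1.0)
            t_noise = random.gauss(0.0, 1.0)
            if correlated:
                # Exposure and toxicity share part of their variability
                # (e.g., both driven by the route of exposure).
                e = 0.7 * shared + 0.7 * e_noise
                t = 0.7 * shared + 0.7 * t_noise
            else:
                e, t = e_noise, t_noise
            exposure = math.exp(0.0 + 0.8 * e)    # hypothetical lognormal exposure
            toxicity = math.exp(-9.0 + 0.8 * t)   # hypothetical lognormal potency
            out.append(exposure * toxicity)
        out.sort()
        return out[int(p * (len(out) - 1))]

    print("independent 95th percentile: %.2e" % risk_percentile(20000, False))
    print("correlated  95th percentile: %.2e" % risk_percentile(20000, True))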

Ease of Misuse

Like most quantitative judgements nowadays, the predictions of environmental or ecological risk
assessments are made with the use of a computer. In fact the most advanced techniques are possible only
on a computer. The methods now available are a considerable improvement over the "wet-finger-in-
the-wind" approaches that characterized the early days of impact assessments, but whatever advantages
reliance on computers has, it certainly moves the assessment process further away from the biological
intuition and common sense from which it originated. There is a hazard in this that is worth addressing:
increasing ease of use also increases the ease of misuse. As a conveniently generic example of how ease
of using computers can foster the misapplication of a technology, consider the experience of statistics.
It is an especially appropriate example because of the increasing importance of statistical methods in
ecological and environmental risk analysis.

The development and general availability of computer software for complex statistical analyses have
allowed a variety of sophisticated techniques to become much more widely used than ever before. Any
user of SAS for instance can very quickly find out how to do a discriminant function analysis by looking
in the reference manual; it is considerably more difficult to find out when one should use this analysis.
It is now remarkably simple to crank a data set through an analysis without really being sure of the
appropriateness of the method. Of course this has always been possible, but in the past the barrier of
sheer difficulty was often sufficient to convince users to spend some time to ensure that an analysis was
worth doing. No such barrier exists anymore. While no one doubts the overall advantages of making
powerful statistical packages widely accessible and easy to use, there are a frightening number of users
who come away from the line printer or personal computer with the "answer" to a question that was
never properly posed.

While prudence restrains one from estimating how much such misuse goes on, the very possibility of it
should encourage software designers to rise to meet the challenge. In what ways can
software be made more conducive to responsible use? (An example from statistics of the wrong thing to
do: Around the time that VisiCalc was first introduced, a famous applied statistician described software
he was developing that would display the data set and the test statistic side by side and allow a user to
modify a data value and instantaneously see the effect of the change on the test statistic. This design so
clearly invites misuse that I refer to it as "Visi-finagle".) There are many strategies that would make
software more resistant to misuse without diminishing its ease of use. Printing out test assumptions
along with results, suggesting or even automatically performing checks of data appropriateness,
displaying references to relevant literature, reporting inconsistencies or possible errors of usage are
a few tactics that will help a motivated user avoid many pitfalls.
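
As a hedged illustration of what such a defensive layer might look like (using Python and the SciPy library purely as a convenient stand-in; the function names, checks, and thresholds are invented for the example, not drawn from any particular package), a wrapper around a two-sample t-test could state the test's assumptions and run simple checks on the data before returning an answer.

    from scipy import stats

    def defensive_ttest(a, b, alpha=0.05):
        # State the assumptions alongside the result, as suggested in the text.
        print("Assumptions of the two-sample t-test: independent observations,")
        print("approximately normal groups, and similar variances.")

        for name, x in (("first sample", a), ("second sample", b)):
            if len(x) < 8:
                print("Warning: %s has only %d observations; checks are weak." % (name, len(x)))
            else:
                w, p = stats.shapiro(x)            # simple normality check
                if p < alpha:
                    print("Warning: %s looks non-normal (Shapiro p = %.3f)." % (name, p))

        w, p = stats.levene(a, b)                  # equal-variance check
        if p < alpha:
            print("Warning: variances differ (Levene p = %.3f); "
                  "consider the unequal-variance form of the test." % p)

        return stats.ttest_ind(a, b)

    # Example usage with made-up data:
    # result = defensive_ttest([5.1, 4.9, 5.3, 5.0, 5.2, 4.8, 5.1, 5.0],
    #                          [5.6, 5.8, 5.5, 5.9, 5.7, 5.6, 5.8, 5.7])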

As the methods employed by ecologists and environmentalists become progressively more entwined with
the computer, and further removed from their real interests in organisms and the natural world, it is
essential that the quest for ease of use be accompanied by programmed defense against
misunderstanding. For instance, it will in general be difficult for a user to distinguish between cases
where ordinary probabilistic projection can be safely used to assess impacts and cases where a more
careful approach is warranted. Well designed software can be crucially important in these
circumstances.

Conclusions

The field of environmental impact assessment includes a host of procedures for understanding and
predicting the consequences of pollution, resource use, and other kinds of environmental perturbation
due to human activities. Environmental scientists at many regulatory agencies tend to produce worst
case analyses that argue about how bad an impact might be. These conservative arguments are often
easily criticized as unrealistic and any proposed regulation based on them as overzealous. New methods
have been proposed that could be used to improve the analyses to produce fuller, probabilistic
predictions about what the effects are likely to be as well as what uncertainty the predictions have.
However, in some circumstances the methods may produce predictions that are too optimistic,
predicting impacts smaller than they actually will be.

The growing sophistication of environmental and ecological risk analysis has been nurtured by
increasing reliance on computer-intensive methods. Many new methodologies that were developed
outside of the discipline will soon be widely used in important applications. This suggests a need for
software designers to create programs that consciously discourage misuses that can arise out of
unfamiliarity or misunderstanding of the methods or ignorance about the data. One valuable function of
such defensive software is guarding against overly optimistic impact predictions that falsely minimize
the actual risks.

Acknowledgments

This article benefitted from my discussions with Thomas F. Long of the Illinois Department of Public
Health, and Lev Ginzburg and F. James Rohlf of the State University of New York at Stony Brook.

References

Barnthouse, L.W. and G.W. Suter II. 1986. User's Manual for Ecological Risk Assessment. ORNL-6251.
Oak Ridge National Laboratory, Oak Ridge, Tennessee.

Dubois, D. and H. Prade. 1988. Possibility Theory: An Approach to Computerized Processing of
Uncertainty. Plenum Press, New York.

Finkel, A. 1989. "Is risk assessment really too conservative: revising the revisionists," Columbia
Journal of Environmental Law 14(2):427-467.

Finkel, A. 1990. Confronting Uncertainty in Risk Management. Center for Risk Management, Resources
for the Future, Washington, D.C.

Norton, S., M. McVey, J. Colt, J. Durda and R. Hegner. 1988. Review of Ecological Risk Assessment
Methods. EPA/230-1088-041. Environmental Protection Agency, Office of Policy Analysis,
Washington, D.C.

Roberts, L. 1990. "Risk assessors taken to task," Science 248:1173.

Sielken, R.L., Jr. 1989. "Pitfalls of needlessly conservative aspects of cancer risk quantifications and
avoiding them." Presentation in the session "Has risk assessment become too conservative?" of the AAAS
annual meeting, January 1989.

Zeckhauser, R.J. and W.K. Viscusi. 1990. "Risk within reason," Science 248:559-564.

Scott Ferson is a senior research associate with the company Applied Biomathematics, the author of
RAMAS and coauthor of the book Models for Species Conservation (forthcoming).

Environmental Computer Networks

Several computer networks are devoted to the discussion of environmental matters. Among them:

EarthNet--Includes e-mail, teleconferencing, databases. Contact EarthNet, P. O. Box 330072, Kahului,
Maui HI 96733; telephone: (808) 872-6090. Or contact Pegasus Networks, P. O. Box 201, Byron Bay
NSW 2481, Australia; telephone: 66-85-7286.

EcoNet--Includes e-mail, teleconferencing, databases. Contact EcoNet, 3228 Sacramento St, San
Francisco CA 94115; telephone: (415) 923-0900.

EnviroNet--Includes e-mail, teleconferencing. Topics include energy, forests, "stepping lightly," and
toxics. Run by Greenpeace. Contact Greenpeace Action, Bldg E, Fort Mason CA 94123; telephone: (415)
474-6767.

GreenNet--Contact GreenNet, 26 Underwood Street, London NW1 7JQ, UK; telephone: 1-490-1510.

International Network on Environmental Policy (INEP)-- Intended to enable policymakers from
around the world to communicate on environmental law-making. Includes a listing of organizations
providing information on environmental problems, adaptation of existing databases and development of
new databases on environmental matters, provision of current environmentally-relevant news events
and analysis, and facilities for on-line conferences and electronic mail. For information contact
Reference Point, 1100 Trafalgar Street, Teaneck NJ 07666 USA; telephone: (201) 836-9152; or
John Harris, INEP Project Coordinator, (301) 596-2740.

Kids Network--E-mail facilities for school students. Topics include acid rain, water, weather, and
waste. Run by the National Geographic Society. Contact National Geographic Society, Educational
Services Dept. 1001, Washington DC 2007; telephone: (800) 368-2728.

Network Earth--Partly a television show on the Turner Broadcasting System and partly a service from
CompuServe. The CompuServe service allows viewers of the program to communicate with one another,
with environmental experts and organizations, and with the staff of the television show. For information contact
Turner Broadcasting at (213) 558-7455 or CompuServe at (614) 457-8600.

Notes from the CPSR Board
Eric Roberts--CPSR President

As of the beginning of the new CPSR fiscal year on July 1, there are some new faces on the CPSR Board
of Directors, and several continuing members have taken on new roles. In the spring elections, I was
elected to be the new CPSR President, taking over from Terry Winograd who will remain on the board
as a Special Director. Steve Adams, who has served for the last year as the chair of CPSR/Berkeley, was
elected to be the new National Secretary. Karen Wieckert (CPSR/Los Angeles) and Patti Lowe
(CPSR/Milwaukee) were elected to be the directors of the Western and Midwestern regions,
respectively. I am very pleased to welcome the new members to the board, and also want to thank
departing board members Lucy Suchman and Hank Bromley for all of the work that they dedicated to
CPSR during their years on the board.

For those of us who are involved in the national CPSR program, this is certainly a very exciting time.
In July, we received the largest grant in CPSR's history--a two-year grant of $275,000 from the
Electronic Frontier Foundation to expand our program in civil liberties. Beyond this, we have several
important opportunities to strengthen our program in other areas, such as computers in the
workplace, computers and the environment, computers in education, and the direction of computing
research into the next century. Our program--in terms of staff, in terms of financial support, in terms
of its impact on public policy--is larger than ever before.

And yet, we still face some significant challenges. For one thing, many of our local chapters are facing
difficult times. In the early days of CPSR, our chapters took on extremely ambitious projects such as
the SDI debates, the Reliability and Risk slide show, and any number of working papers,
editorials, and letters about issues central to CPSR's concern. Despite our growth in recent years and
the increase in the number of CPSR chapters nationwide, grassroots activity has fallen off.

In part, I think this change has occurred because of our success as a national organization. We now have
two offices, in Palo Alto and Washington, and an extremely talented program staff that has focused on
such critical program work as our continued battle against the Strategic Defense Initiative in its new
incarnation as "Brilliant Pebbles" and our activities to ensure that privacy and civil liberties are
protected in this electronic age. While this work is exciting and has helped to establish CPSR as a major
player in public policy debates involving the application of computer technology, we have unfortunately
allowed some of our chapter support activities to slide over that same period. Materials have stopped
flowing with any regularity, and funds have been held up as we struggle with our persistent budgetary
problems.

I believe that this erosion of our grassroots strength has led to a lessening of "CPSR spirit" at the
chapter level, and that some members no longer feel that they have a central place in the organization--
that CPSR is theirs. This loss of a sense of ownership makes it more difficult to sustain existing
program activity, much less to promote new activity, and further makes it harder for us to rely on
chapter-based outreach for membership and fundraising development.

As a first step toward trying to reverse this trend, I spent much of the month of August visiting CPSR
chapters in the Midwest and Middle Atlantic regions, along with several cities in the Midwest and South
where CPSR has a concentration of members but no formal chapter as yet. My goals for this trip were
to make a personal connection between the national organization and the chapters; to find out what
members feel they need to support their activities and what the major areas of interest are; to provide
members with a better sense of what CPSR has been doing in the last few years and of the exciting
opportunities we have for the future; and to restore a collective sense of ownership in CPSR as an
organization. In the coming year, I hope to visit chapters in other parts of the country.

We also face the challenge of redefining the role of CPSR so that our program takes account of the many
changes the world has witnessed in the last few years. Although we must continue to speak out against
high-tech weapons systems that will not work and to raise public awareness about the risk of software
failure in military systems, such issues will no longer be the overriding focus of CPSR's attention. Our
programs in civil liberties and the workplace have had considerable success in recent years and
represent issues of critical, and growing, concern. We must continue to expand these programs and to
develop new areas of CPSR activity.

To undertake these new initiatives, however, we will need to resolve the long-standing financial
problems that CPSR has faced over the last few years. New program initiatives that attract significant
foundation support, such as the EFF grant earlier this summer, strengthen the organization
considerably. At the same time, these resources are dedicated to specific program activities, and we
must depend on membership support for the day-to-day operations of the organization--to put out the
newsletter, to attract new members, to provide seed money to chapters seeking to initiate new projects,
and all of the other work we must do to ensure that the CPSR message is heard. Your support has made
possible our past accomplishments. Your support is also the key to our future success.

CPSR Membership Drive Contest

CATEGORIES

Here's the challenge. To motivate you to talk to your friends and colleagues about joining CPSR, we've
decided to have a tasteful contest and offer some prizes as incentive. We will award prizes in each of the
following categories:

1. The individual member who brings the most new members in.

2. A random drawing in which you are entered once for each new member you bring in.

3. A random drawing from all new members referred in by an existing member.

4. The chapter that has the greatest percentage increase in new members.

In case of a tie in the individual and chapter categories, we'll draw straws for who gets to pick their
prize first.

The more members you bring in, the greater your chances of winning! In fact, even if your efforts
double our membership, you have a better than 1:2500 chance to win. Great odds.

The chapter prize goes to the chapter with the greatest percentage growth in number of members
between 8/31/90 and 12/31/90. This gives smaller chapters more incentive to compete.

ELIGIBILITY

Membership applications received by the CPSR National Office postmarked (or delivered) between
September 1, 1990 and December 31, 1990 are eligible for the drawing when accompanied by a 3" x 5"
card (or equivalent) with the following information: new member name, referring member name,
and name of the chapter (if any) the new member is joining. One cannot refer oneself. Membership
applications accompanied by the relevant information are acceptable for the contest at the discretion
and judgment of the national office staff (legibility is the primary criterion). The National Office will
also be mailing special membership forms to all members so that these forms can be used to bring in
new members and enter the contest at the same time.

Board members, Executive Committee members, and employees of CPSR are not eligible for this
contest. Winners must be current members of CPSR.

Winners and prizes will be announced in the Winter, 1991 CPSR Newsletter.

PRIZES

The prizes, listed below, are going to be handled as a pool of prizes. The winner(s) in category #1 get
the first pick of prizes, then the winners of the random drawings, and finally the winning chapter(s).
In case of a tie in the individual and chapter categories, we'll draw straws for who gets to pick their
prize first. Additional prizes and categories may be added at the discretion (or whim) of the contest
committee. All prizes were either donated or bought with funds donated specifically for this contest.
New prizes are being added to the contest as new donations are offered.

1. A Toshiba T-1000 laptop computer, with 512K RAM, MSDOS (or equivalent, subject to availability)

2. An Apple modem, 1200 baud

3. A year subscription to one of the following: Release 1.0, Esther Dyson; PC Letter, Stuart Alsop;
MacWeek; MacWorld; MacUser; BYTE Magazine; InfoWorld; PC Computing; PC World; PeaceNet (connect
charges not included); The Well (connect charges not included)

4. A choice of any ten of the following books and computer software:

AI Handbook, Vols. 1-4, Barr and Feigenbaum

Binding Time, Halperin

Computer Graphics: Principles and Practice, Second Ed., Foley, van Dam, Feiner, Hughes

Computers in Battle, Bellin and Chapman

Creating User Interfaces by Demonstration, Myers

Cryptography and Data Security, Denning

Designing the User Interface, Shneiderman

Directions and Implications of Advanced Computing (Vol. 1), Schuler and Jacky

Hypertext Hands-On, Shneiderman

Method and Tactics in Cognitive Science, Kintsch, Miller, and Polson

Proceedings of the CPSR Conference on Directions and Implications of Advanced Computing, 1988
(DIAC'88)

Proceedings of the CPSR Conference on Directions and Implications of Advanced Computing, 1989
(DIAC'89)

Proceedings of the CPSR Conference on Participatory Design of Computer Systems, 1990

The Psychology of Everyday Things, Norman

Rogue Programs: Viruses, Worms, and Trojan Horses, Hoffman

Socializing the Human-Computer Environment, Vaske and Grantham

Software Maintenance: The Small Systems Management Guide, Bellin

The Structured Systems Development Manual, Bellin and Suchman

Understanding Computers and Cognition, Winograd

Balance of the Planet (IBM-PC; Mac) by Chris Crawford: Ecology simulation game. Run a world,
balancing ecology against other concerns.

Caveman Ugh-lympics (IBM-PC) by Greg Johnson: Role-playing track-meet game.

EarthQuest (Mac) by EarthQuest Corp: HyperCard stack that teaches about the planet, people, and the
environment.

Faces from Spectrum Holobyte

StarControl (IBM-PC) by Greg Johnson: Space "arcade" game with detailed graphics.

StarFlight (IBM-PC) by Greg Johnson: Space adventure/exploration game.

Stunt Driver from Spectrum Holobyte

Tetris (IBM-PC) by Alexei Pazhitnov, from Spectrum Holobyte

Welltris (IBM-PC) by Alexei Pazhitnov, from Spectrum Holobyte


Win a Laptop Computer And Support CPSR!

Many people join CPSR to find "like-minded" people, and most of us know such people who aren't
members. We want to help these people find each other and the work of CPSR, so we're having a contest.
The grand prize, for the individual who gets the most new members to join, is a Toshiba laptop
computer. Other prizes include a year subscription to Esther Dyson's Release 1.0 or a choice of ten
books, including a signed copy of Donald Norman's The Psychology of Everyday Things.

HAVE WE LOST OUR MINDS?

We hope not. We simply decided that sending out direct mail appeals to people wasn't working. Everyone
is swamped with direct mail. It's also expensive for the organization, and we'd prefer not to spend your
money on mailings. So we're trying this contest, with the hope that the prizes will motivate CPSR
members and others to bring new members to the organization by word of mouth. Please think about
helping build CPSR, and maybe you will win some software, a book, a subscription to a magazine, or
even a computer.

CONTEST DETAILS ARE ON THE FACING PAGE

Miscellaneous


The CPSR Newsletter is published quarterly by:

Computer Professionals for Social Responsibility P.O. Box 717
Palo Alto, CA 94301 (415) 322-3778

Also located at: 1025 Connecticut Ave., N.W., #1015 Washington, D.C. 20036 (202) 775-1588

The purpose of the Newsletter is to keep members informed of thought and activity in CPSR. We
welcome comments on the content and format of our publication. Most especially, we welcome
contributions from our members. Deadline for submission to the next issue is October 31, 1990.

This Newsletter was produced on an Apple Macintosh II, using the desktop publishing program
Pagemaker 3.0 and Adobe Illustrator 88. The hardware and software used were donated to CPSR by
Apple Computer, the Aldus Corporation, and Adobe Systems. The Newsletter was typeset from a
Pagemaker file on a Linotronic 100.


Burkholder at Carnegie Mellon Guest Edits This Newsletter

This issue of The CPSR Newsletter was guest edited by Dr. Leslie Burkholder, a member of CPSR/
Pittsburgh. Dr. Burkholder is a research scientist at the Center for Design of Educational Computing
and Software Manager of Academic Computing at Carnegie Mellon University. He is also co-editor, with
Stephen Read of the University of St Andrews, Scotland, of the journal Philosophy & Computing. He
organized the articles in this issue and kept after the authors to get their manuscripts in. Many thanks
to Leslie for the hours he put into putting this newsletter together.
