Science, Technology and Society: A Philosophical Perspective
Wenceslao J. Gonzalez (Editor)
ISBN: 0-9729892-2-6
Preface: The Relevance of Science, Technology and Society: The “Social Turn”
Wenceslao J. Gonzalez
PART II: STS: FROM THE PRESENT SITUATION TO THE FUTURE PROJECTION
ABOUT THE CONTRIBUTORS TO THIS VOLUME
is the author of eight books, among them are Mundo, Tecnología y Razón en el
fin de la Modernidad (1993), K. Popper, de la Epistemología a la Metafísica
(1996), Razionalità tecnica e mondo futuro. Una eredità per il terzo millennio
(2002), and Ética, Tecnología y Valores en la sociedad global (2003). He has
published widely. Among his papers are “Does Technology ‘Construct’ Scientific
Reality?” (1993), “Hypothèse, objectivité, et rationalité technique” (1996), “Since
Indeterminacy: The New Picture of the Physical World at the End of Modernity”
(1997), “Technology as a New Condition of the Possibility of Scientific Knowledge”
(1998), “Scientific Realism, Objectivity and Technological Realism” (2000), and
“Science as Technoscience: Values and Their Measurement” (2004).
THE RELEVANCE OF SCIENCE, TECHNOLOGY AND SOCIETY: THE “SOCIAL TURN”
Wenceslao J. Gonzalez
The emphasis on the realm of Science, Technology and Society or Science and
Technology Studies may have the same degree of relevance that the “historical
turn” had in the past. It is a “social turn” which affects philosophy of science as
well as philosophy of technology. It includes a new vision of the aims, processes
and results of scientific activities and technological doings, because the focus
of attention is on several aspects of science and technology which used to be
considered as secondary, or even irrelevant. This turn highlights science and
technology as social undertakings rather than intellectual contents.
According to this new vision, there are several important changes as to what
should be studied –the objects of research–, how it should be studied –the method–
and what the consequences for those studies are. The new focus of attention can
be seen in many changes, and among them are several of special interest: a) from
what science and technology are in themselves (mainly, epistemic contents) to
how science and technology are made (largely, social constructions); b) from the
language and structure of basic science to the characteristics of applied science
and the applications of science; c) from technology as a feature through which
human beings control their natural surroundings (a step beyond “technics” due to
the contribution of science) to technology as a social practice and an instrument
of power; and d) from the role of internal values necessary for “mature science”
and “innovative technology” to the role of contextual or external values (cultural,
political, economic …) of science and technology.
This “social turn” is a move that covers a larger area and introduces a
more radical scope than the preceding “historical turn”, which was developed
predominantly in the sixties and the seventies. On the one hand, STS enlarges
the domain in comparison with the contributions made by Thomas Kuhn,
Imre Lakatos, Larry Laudan … The role of historicity as a crucial element
for the philosophical approach was analyzed mostly in the case of science. De
facto, the major philosophers of that period paid little attention to technology.
Furthermore, technology was customarily seen by them as an instrument that
science uses for observation or experimentation. On the other hand, STS brings
with it a more radical scope than the “historical turn,” because that conception
–including The Structure of Scientific Revolutions– still assumes that the
internal contents of science have more weight than the external factors (social,
cultural, political, economic …).
In addition, there is a further enlargement introduced by the “social turn” in
comparison with the “historical turn.” STS considers the contributions of several
disciplines, among them practical ethics, policy analysis, legal studies, sociology
Wenceslao J. Gonzalez
Professor of Logic and Philosophy of Science
I
Theoretical Framework
1. The Philosophical Approach to Science, Technology and Society
2. Objectivity and Professional Duties Regarding Science and Technology
THE PHILOSOPHICAL APPROACH TO SCIENCE,
TECHNOLOGY AND SOCIETY
Wenceslao J. Gonzalez1
1. AN INTERDISCIPLINARY ENDEAVOR
Science, Technology and Society or Science and Technology Studies are two
ways of referring to an interdisciplinary endeavor. STS combines the contributions
of several disciplines and, accordingly, it uses different methodologies. Its object
is not an isolated realm analyzed by a traditional kind of research, because it
depends on views on science and technology developed in the last four decades.
1. I am grateful to Kristin Shrader-Frechette for her comments on this paper.
Indeed, STS has received increasing attention since the mid-1980s,2 when the
discussion explicitly included a third term: “technoscience.” It is also a period
in which philosophy of technology progressively increased its presence in the
realm of STS,3 connecting technology with new areas for philosophical research
(issues related to bioethics, environmental concerns, social problems, policy
discussions …).4
Since the constitution of STS, both philosophy of science and philosophy of
technology have had a key role in this contemporary field. Their contributions
are interconnected with contents of other disciplines. De facto, STS is a broad
intellectual enterprise where several disciplines are involved: practical ethics,
policy analysis, law, sociology, economics … The reason for this wide variety of
contributions is clear: STS cannot be reduced to the theoretical study of science
and technology, because it includes also a practical dimension as well as a social
concern. In Europe the first aspect is still dominant, whereas in the United States
the second facet has a central relevance.
Both names –Science, Technology and Society and Science and Technology
Studies– are commonly used for the same subject matter. The sense of these
expressions includes the assumption of science and technology as human activities
in a social setting rather than two forms of mere knowledge. And the specific
reference of these expressions goes beyond the intellectual outcomes or products
of science and technology: it looks for those concrete components of science and
technology which have repercussions in social life in different dimensions (ethical,
political, sociological, economic …). Therefore, STS pays special attention to the
empirical ingredients of both kinds of research –scientific and technological–: it
seeks their links to the lives of citizens. Thus, the philosophical approach goes along
with other aspects in several contexts (environmental, political, legal, sociological,
economic …) which should be considered as well.
In many ways philosophy of science and philosophy of technology are at
the core of STS, because either the other disciplines are deeply embedded in
the philosophical approach or they have at least a clear connection with some
philosophical problems. Thus, insofar as there is this common ground –the
philosophical roots– in this field, Science, Technology and Society or Science
2. Some of the most influential views on STS had already started before the mid-1980s, cf. BARNES, B., Scientific Knowledge and Sociological Theory, Routledge and K. Paul, London, 1974; LATOUR, B. and WOOLGAR, S., Laboratory Life: The Social Construction of Scientific Facts, Princeton University Press, Princeton (NJ), 1979; KNORR-CETINA, K., The Manufacture of Knowledge. An Essay on the Constructivist and Contextual Nature of Science, Pergamon Press, Oxford, 1981; and COLLINS, H. M., Frames of Meaning: The Sociological Construction of Extraordinary Science, Routledge and K. Paul, London, 1982.
3. Cf. IHDE, D., “Has the Philosophy of Technology Arrived? A State-of-the-Art Review,” Philosophy of Science, v. 71, n. 1, (2004), pp. 117-131.
4. Cf. SCHARFF, R. C. and DUSEK, V. (eds.), Philosophy and Technology: The Technological Condition, Blackwell, Oxford, 2003.
7. Both philosophy of science and philosophy of technology have an ethical dimension, which will be pointed out later on in this paper. But bioethics and environmental ethics have received increasing attention from professionals related to health sciences (medicine, nursing, …) and sciences connected with the environment (ecology, forestry, …). Thus, they study more specific details (mainly in the sphere of the consequences of human actions) than philosophy of science and philosophy of technology.
8. For Kristin Shrader-Frechette, the political analyses of technology are a central part of the philosophy of technology, and she criticizes the attempt to reduce technology to epistemology, cf. SHRADER-FRECHETTE, K., “Reductionist Philosophy of Technology: Stones Thrown from Inside a Glass House,” Techné. Journal of the Society for Philosophy and Technology, v. 5, n. 1, (1999), pp. 32-43.
9. On the status and characteristics of economics of science, cf. GONZALEZ, W. J., “De la Ciencia de la Economía a la Economía de la Ciencia: Marco conceptual de la reflexión metodológica y axiológica,” in AVILA, A., GONZALEZ, W. J. and MARQUES, G. (eds.), Ciencia económica y Economía de la Ciencia: Reflexiones filosófico-metodológicas, FCE, Madrid, 2001, pp. 11-37; especially, pp. 20-22.
For the economic views on technological change, cf. NELSON, R. R. and WINTER, S. G., An Evolutionary Theory of Economic Change, Belknap Press, Cambridge, 1982, and FREEMAN, C. and SOETE, L., Economics of Industrial Innovation, 3rd ed., The MIT Press, Cambridge, 1997. On the determinants and directions of technological change, cf. DOSI, G., “Technological Paradigms and Technological Trajectories,” Research Policy, v. 11, (1982), pp. 147-162.
10. Cf. GONZALEZ, W. J., “Racionalidad y Economía: De la racionalidad de la Economía como Ciencia a la racionalidad de los agentes económicos,” in GONZALEZ, W. J. (ed.), Racionalidad, historicidad y predicción en Herbert A. Simon, Netbiblo, A Coruña, 2003, pp. 65-96.
studies will be more fruitful insofar as they consider the distinction between
science and technology, since it is relevant not only in theoretical terms but also
in practical terms (aims, processes and results).
14. Cf. ECHEVERRIA, J., La revolución tecnocientífica, FCE, Madrid, 2003.
15. An interesting case is IHDE, D. and SELINGER, E. (eds.), Chasing Technoscience: Matrix for Materiality, Indiana University Press, Bloomington, 2003.
16. Cf. PRICE, D. J. DE SOLLA, Little Science, Big Science, Columbia University Press, N. York, 1963.
17. Cf. LATOUR, B., Science in Action: How to Follow Scientists and Engineers Through Society, Harvard University Press, Cambridge (MA), 1987.
18. Cf. HOTTOIS, G., Le paradigme bioéthique: une éthique pour la technoscience, De Boeck-Wesmael, Brussels, 1990.
19. Donna Haraway, “under her earlier figure of cyborg, sees technoscience as the full hybridization of science and technology,” IHDE, D., “Has the Philosophy of Technology Arrived? A State-of-the-Art Review,” p. 121. Cf. HARAWAY, D., Simians, Cyborgs and Women: The Reinvention of Nature, Routledge and Institute for Social Research and Education, N. York, 1991.
20. Technoscience understood as hybridization or symbiosis of science and technology suggests examples, such as the interaction of computer science and information and communication technology, which leads to products popularly called “new technologies,” where the patents are on properties different from those obtained by previous technologies. Cf. ECHEVERRIA, J., La revolución tecnocientífica, pp. 64-68 and 71-72.
approach. He has proposed five different models for consideration, which take
into account the views that have been more influential in the relations between
science and technology.21
1) Technology is reducible to science (i.e., technology depends ontologically
on science), which means that either it is applied science or is an application of
science.22 2) Science is reducible to technology (i.e., science depends ontologically
on technology), which can be seen as an instrumentalist position insofar as science
appears as an instrument to dominate nature through technology (a view held by
some philosophies focused on praxis, such as different versions of pragmatism,
Marxism … or even nihilism). 3) There is an identity of science and technology.
This thesis is a way of understanding “technoscience,” but is so strong that
even its supporters –mainly constructivists– 23 try to emphasize the identity in
methodological terms –as a common process– rather than in ontological terms
(as being the same entity). 4) Science and technology are independent both
ontologically and causally. It is a parallelist view: they move according to the
same rhythm but without interaction.24 5) There is an ontological independence
between science and technology, but they are in a causal interaction.
This last option of Niiniluoto’s models –the interactionist view– is also
a version of “technoscience.” It is a sound conception because it respects the
conceptual difference between “science” and “technology.” On the one hand,
commonly science and technology have different aims, processes and results
(i.e., outcomes or products). Thus, they have theoretical as well as empirical
differences. But, on the other hand, science and technology are interconnected
in many ways, as history has shown us, at least since the seventeenth century (as can
be seen in cases such as the construction of the telescope and the knowledge of
satellites). In this regard, there is an interesting metaphor: they are like two legs
of the same body.25 Therefore, to accept technoscience in the second sense –as an
interaction– could be compatible with notions both of “science” and “technology.”
Nevertheless, both of them should also be characterized in order to have a clear
account of the triadic distinction among science, technology, and technoscience.
Defending the idea of the difference between the three of them requires us to
insist on science as a complex reality which condenses a trajectory of centuries
and it is open to improvement in the future. Thus, the characteristics of a science
21. Cf. NIINILUOTO, I., “Ciencia frente a Tecnología: ¿Diferencia o identidad?,” Arbor, v. 157, n. 620, (1997), pp. 285-299; especially, pp. 287-291.
22. Cf. BUNGE, M., “Technology as Applied Science,” Technology and Culture, v. 7, (1966), pp. 329-347. For D. Ihde, “Bunge’s take on technology and its relation to science, turns out to be nearly identical with Martin Heidegger’s,” IHDE, D., “Has the Philosophy of Technology Arrived? A State-of-the-Art Review,” p. 118.
23. It should be pointed out that constructivism can be developed in a large number of directions, cf. HACKING, I., The Social Construction of What?, Harvard University Press, Cambridge (MA), 1999.
24. Cf. PRICE, D. J. DE SOLLA, “Is Technology Historically Independent of Science? A Study in Statistical Historiography,” Technology and Culture, v. 6, (1965), pp. 553-568.
25. Cf. RESCHER, N., Razón y valores en la Era científico-tecnológica, Paidós, Barcelona, 1999.
are not simple, but they can be enumerated basically in several elements: i) science
possesses a specific language (with terms whose sense and reference are precise);
ii) science is articulated in scientific theories with a well patterned internal
structure, which is open to later changes; iii) science is a qualified knowledge
(with more rigor –in principle– than any other knowledge); iv) it consists of an
activity that follows a method (normally it is deductive, although some authors
accept the inductive method) 26 and it appears as a dynamic activity (of a self-
corrective kind, which seeks to increase the level of truthlikeness).
Apart from these characteristics, there are others which have been
emphasized in recent times: v) the reality of science comes from social action,
and it is an activity whose nature is different from other activities in its
assumptions, contents and limits; vi) science has aims –generally, cognitive
ones– for guiding its endeavor of researching (in the formal sphere and in the
empirical realm); and vii) it can have ethical evaluations insofar as science is
a free human activity: values which are related to the process itself of research
(honesty, originality, reliability …) or to its nexus with other activities of human
life (social, cultural …).27
These characteristics of science are connected to a kind of rationality
which is different from technological rationality,28 because the aims, processes
and results that, in principle, science and technology seek are different. Thus,
scientific rationality has several aims, mainly in the cognitive sphere, and they
can be pursued in order to increase our knowledge (basic science) or to resolve
practical problems in a concrete area (applied science).29 Meanwhile technological
rationality is oriented towards a creative transformation of reality, either natural
or social, according to a design, which is followed by an activity and a subsequent
artifact (or final product).
Etymologically, “technology” is a kind of knowledge insofar as it is the
logos (the doctrine or learning) of the techné30 (either in the realm of “arts”
–to create beautiful objects– or in the sphere of “technics”– to build useful
items–). In addition, technology is a social activity which is developed in an
intersubjective doing in order to transform the previous reality (natural or social),
based on scientific knowledge as well as specific technological knowledge. As
a consequence of this process, there is an expected product which should be
tangible: a visible artifact or a new kind of social reality. This final product of
26. Cf. NIINILUOTO, I. and TUOMELA, R., Theoretical Concepts and Hypothetico-Inductive Inference, Reidel, Dordrecht, 1973.
27. On these seven elements of science, cf. GONZALEZ, W. J., “De la Ciencia de la Economía a la Economía de la Ciencia: Marco conceptual de la reflexión metodológica y axiológica,” p. 15.
28. Cf. GONZALEZ, W. J., “Racionalidad científica y racionalidad tecnológica: La mediación de la racionalidad económica,” Agora, v. 17, n. 2, (1998), pp. 95-115.
29. Cf. NIINILUOTO, I., “The Aim and Structure of Applied Research,” Erkenntnis, v. 38, (1993), pp. 1-21.
30. Cf. MITCHAM, C., “Philosophy of Technology,” in DURBIN, P. (ed.), A Guide to the Culture of Science, Technology and Medicine, The Free Press, N. York, 1980, pp. 282-363.
technology might be registered in a patent, which could hardly be the case in the
final outcome of science (even in applied science).31
Yet technology is more than knowledge used in a transformative way to
get a final product, because it includes a variety of components. 1) Technology
has its own language, due to its attention to internal constituents of the process
(design, effectiveness, efficiency …) and external factors (social, economic,
political, cultural …). 2) The structure of technological systems is articulated
on the basis of its operativity, because it should guide the creative activity of the
human being that transforms nature (or the human and social reality). 3) The
specific knowledge of the technological activity –know how– is instrumental
and innovative: it seeks to intervene in an actual realm, to dominate it and to
use it in order to serve human agents and society. 4) The method is based on an
imperative-hypothetical argumentation. Thus, the aims are the key to making
reasonable or to rejecting the means used by the technological process. 5) There
are values accompanying that process, which could be internal (to realize the
goal at the lowest possible cost) and external (ethical, social, political, ecological,
etc). These values condition the viability of the possible technology and its
alternatives. 6) The reality itself of the technological process is supported by
social human actions which have an intentionality and are oriented towards the
transformation of the surrounding reality.32
Therefore, technology can be seen as an attempt to direct a human activity
to obtain a creative and transformative domain of that reality –natural or human
and social– on which it is working. Primarily, it does not seek to describe or
to explain reality, because there is already a discovered reality (i.e., known to
some extent) which technology wants to change. This domain appears in new
designs and in the effectiveness-efficiency pair, but it also requires us to consider
a large number of aspects related to this activity (ethical, economic, ecological,
political, cultural, etc). Thus, even though a technology may achieve its aims as
such (i.e., effectiveness), it might not be acceptable from the point of view of other
factors, such as economic criteria (e.g., the cost-benefit ratio), ethical values (e.g.,
consent, fairness), ecological effects (e.g., the contamination of rivers), political
consequences (e.g., the decrease of civil liberties) or incompatibility with the
dominant culture.
Central to this account about technoscience, science and technology is the
need for a clear distinction between them. This goal seems necessary in order
to clarify the contents of the studies on science and technology (and, hence, for
31. Usually, the outcomes of science are public and freely accessible to users, whereas the products of technology can have a patent and, therefore, could be private, with no free access for users.
32. Cf. GONZALEZ, W. J., “Progreso científico e innovación tecnológica: La ‘Tecnociencia’ y el problema de las relaciones entre Filosofía de la Ciencia y Filosofía de la Tecnología,” Arbor, v. 157, n. 620, (1997), p. 266.
the relations between science, technology and society). The relevance of the
distinction is not merely of a conceptual kind because it has also a practical
dimension. In effect, each one of them –technoscience, science and technology–
is social from at least two points of view: i) all are developed by a community
of researchers, and they should have a neat notion of their work (aims, processes
and results); and ii) insofar as technoscience, science and technology are human
undertakings, they belong to a social setting which can have repercussions at the
different levels (aims, processes and results). The philosophical approach should
consider the internal constituents as well as their external factors; and due to the
characterization of “technoscience” as a type of interaction between science and
technology, the focus here will be on these cases.
33. Cf. HAACK, S., Manifesto of a Passionate Moderate, The University of Chicago Press, Chicago, 1998; KOERTGE, N. (ed.), A House Built on Sand: Exposing Postmodern Myths about Science, Oxford University Press, N. York, 1998; and SOKAL, A. and BRICMONT, J., Intellectual Impostures. Postmodern Philosophers’ Abuse of Science, Profile Books, London, 1998.
and historicity– lead to another key element: the decision making of the scientific
community requires us to take into account social and political factors, not
only the internal constituents of science. Furthermore, the responsibility of the
scientist goes beyond the endogenous aspects (honesty, originality, reliability …)
of scientific activity to reach the exogenous elements. Thus, the aims, processes
and results are not mere individual ingredients but rather social factors.
Later philosophical views, such as some versions of naturalism with sociological
roots and most of the postmodern conceptions of science, have
emphasized the sociological dimension of scientific activity. The existence in
STS of three very influential positions within the sociology of science should
be pointed out: a) the “Strong Program” of the Edinburgh school led by Barry
Barnes35 –now at the University of Exeter– and David Bloor,36 based on some
Kuhnian as well as Wittgensteinian ideas; b) the empirical program of relativism
(EPOR) developed by Harry Collins,37 which studies scientific controversies
with an interpretative flexibility and analyzes their connections to the socio-
cultural milieu; and c) the ethnomethodology –the study of the actors’ network
at the workplace– defended by the social constructivism of Bruno Latour38 and
Steve Woolgar.39
These schools of sociology of science assume post-Kuhnian views on the role of
the context of scientific knowledge which affects the vision of the scientific objects
of research. They highlight science from an external point of view: as a social
practice where the “epistemic contents” are really “social factors” –the Strong
Program; as a self-referential project within a constructivist framework – the
empirical program of relativism; and as a social construction based on individual
35. Cf. BARNES, B., Interests and the Growth of Knowledge, Routledge and K. Paul, London, 1977; BARNES, B., T. S. Kuhn and Social Science, Macmillan, London, 1982 (Columbia University Press, N. York, 1982); BARNES, B., The Elements of Social Theory, Princeton University Press, Princeton, 1995; and BARNES, B., BLOOR, D. and HENRY, J., Scientific Knowledge. A Sociological Analysis, The University of Chicago Press, Chicago, 1996.
36. Cf. BLOOR, D., “Wittgenstein and Mannheim on the Sociology of Mathematics,” Studies in History and Philosophy of Science, v. 4, (1973), pp. 173-191; BLOOR, D., “Popper’s Mystification of Objective Knowledge,” Science Studies, v. 4, (1974), pp. 65-76; BLOOR, D., Knowledge and Social Imagery, Routledge and K. Paul, London, 1976 (2nd ed., The University of Chicago Press, Chicago, 1991); BLOOR, D., Wittgenstein: A Social Theory of Knowledge, Macmillan, London, 1983; and BLOOR, D., Wittgenstein, Rules and Institutions, Routledge, London, 1997.
37. Cf. COLLINS, H. M., “An Empirical Relativist Programme in the Sociology of Scientific Knowledge,” in KNORR-CETINA, K. D. and MULKAY, M. (eds.), Science Observed: Perspectives in the Social Study of Science, Sage, London, 1983, pp. 85-100; and COLLINS, H. M. and PINCH, T., The Golem: What Everyone Should Know About Science, Cambridge University Press, Cambridge, 1993.
38. Cf. LATOUR, B. and WOOLGAR, S., Laboratory Life: The Social Construction of Scientific Facts, 2nd ed., Princeton University Press, Princeton (NJ), 1986; LATOUR, B., The Pasteurisation of France, Harvard University Press, Cambridge (MA), 1988; and LATOUR, B., We Have Never Been Modern, Harvester, Brighton, 1993 (translated by C. Porter).
39. Cf. WOOLGAR, S., “Critique and Criticism: Two Readings of Ethnomethodology,” Social Studies of Science, v. 11, n. 4, (1981), pp. 504-514; WOOLGAR, S., Science: The Very Idea, Tavistock, London, 1988; and WOOLGAR, S. (ed.), Knowledge and Reflexivity: New Frontiers in the Sociology of Knowledge, Sage, London, 1988.
47. Cf. KOERTGE, N., “‘New Age’ Philosophies of Science: Constructivism, Feminism and Postmodernism,” The British Journal for the Philosophy of Science, v. 51, (2000), pp. 667-683.
48. IHDE, D., “Has the Philosophy of Technology Arrived? A State-of-the-Art Review,” p. 118.
49. Cf. ORTEGA Y GASSET, J., Ensimismamiento y alteración. Meditación de la Técnica, Espasa-Calpe, Buenos Aires, 1939 (originally written in 1933). Reprinted in ORTEGA Y GASSET, J., Meditación de la Técnica, Santillana, Madrid, 1997.
50. Cf. HEIDEGGER, M., “Die Frage nach der Technik,” in HEIDEGGER, M., Vorträge und Aufsätze, Günther Neske, Pfullingen, 1954, pp. 13-44. Translated as HEIDEGGER, M., “The Question Concerning Technology,” in SCHARFF, R. C. and DUSEK, V. (eds.), Philosophy and Technology: The Technological Condition, pp. 252-264.
51. Cf. JASPERS, K., Die Atombombe und die Zukunft des Menschen, Piper, Munich, 1958.
52. Cf. GEHLEN, A., “Anthropologische Ansicht der Technik,” in FREYER, H., PAPALEKAS, J. CH. and WEIPPERT, G. (eds.), Technik im technischen Zeitalter, J. Schilling, Düsseldorf, 1965. Translated in abridged version as GEHLEN, A., “A Philosophical-Anthropological Perspective on Technology,” in SCHARFF, R. C. and DUSEK, V. (eds.), Philosophy and Technology: The Technological Condition, pp. 213-220.
53. Ronald Giere wrote then, when Kuhn and Lakatos were at a peak of their careers, that “the methodology of technology is philosophically nearly a virgin territory,” GIERE, R. N., “The Structure, Growth and Application of Scientific Knowledge: Reflections on Relevance and Future of Philosophy of Science,” in BUCK, R. C. and COHEN, R. S. (eds.), In Memory of R. Carnap, Reidel, Dordrecht, 1971, p. 544.
The bibliography of the initial stages of philosophy and methodology of technology can be found in MITCHAM, C. and MACKEY, R. (eds.), Bibliography of the Philosophy of Technology, The University of Chicago Press, Chicago, 1973.
54. Among the important works in philosophy of technology, there are some which are interesting for the present purpose: SKOLIMOWSKI, H., “The Structure of Thinking in Technology,” Technology and Culture, v. 7, (1966), pp. 371-383; and RAPP, F., Analytische Technikphilosophie, K. Alber, Munich, 1978.
In addition to the anthologies already pointed out, it should be mentioned here RAPP, F. (ed.), Contributions to a Philosophy of Technology, Reidel, Dordrecht, 1974; DURBIN, P. and RAPP, F. (eds.), Philosophy and Technology, Reidel, Dordrecht, 1983; and FELLOWS, R. (ed.), Philosophy and Technology, Cambridge University Press, Cambridge, 1995.
55. Cf. STRÖKER, E., “Philosophy of Technology: Problems of a Philosophical Discipline,” in DURBIN, P. and RAPP, F. (eds.), Philosophy and Technology, pp. 323-336; especially, p. 323.
56. IHDE, D., “Has the Philosophy of Technology Arrived? A State-of-the-Art Review,” p. 124.
relevant than the external factors,62 whereas the second and the third options tend
to emphasize the external factors.
It may be fruitful to focus philosophy of technology explicitly on the elements
already pointed out (language, system, knowledge, method, human activity, aims
and values). If we develop the “internal” constituents in a philosophical study
more similar to the philosophy of science, then it can be easier to understand
the “external” factors. The philosophical reflection on technological doing can
analyze the semantic, structural, epistemological, methodological, ontological,
axiological (internal values) and evaluative (external values) aspects of technology.
Generally, attention goes to epistemological, methodological, ontological
and evaluative considerations. It seems reasonable to think that clarifying the
technological activity itself –the internal perspective– can be the initial step
toward pondering the social dimension of technology –the external view– which has
many consequences and manifestations (technology as a crucial factor in social
change, as an instrument of political power, as a means for the transformation of
the ecosystem, etc.).
64. An analysis of this issue can be made on the basis of action theory. In this regard, cf. TUOMELA, R., “The Social Dimension of Action Theory,” Daimon. Revista de Filosofía, v. 3, (1991), pp. 145-158; and TUOMELA, R., The Importance of Us, Stanford University Press, Stanford, 1995. In addition, it could be of interest to consider social ontology, cf. RUBEN, D. H., The Metaphysics of the Social World, Routledge and K. Paul, London, 1985.
65. On this notion, cf. RESCHER, N., “Collective Responsibility,” in RESCHER, N., Sensible Decisions. Issues of Rational Decision in Personal Choice and Public Policy, Rowman and Littlefield, Lanham, 2003, pp. 125-138.
66. Cf. GONZALEZ, W. J., “Progreso científico, Autonomía de la Ciencia y Realismo,” Arbor, v. 135, n. 532, (1990), pp. 91-109.
Ilkka Niiniluoto explicitly links scientific character and objectivity: “In order
to be scientific, inquiry has to be objective at least in two senses. First, the object
of investigation has to be real in Peirce’s sense, i.e., its characters should be
‘independent of what anybody may think them to be’ [Collected Papers, 5.405].
Secondly, the object should be allowed to influence the formation of the result
of inquiry, and this influence should be intersubjectively recognizable.” 72 In
addition, if basic science cannot be objective, then it will be unable to advance
on the road towards either truth or truthlikeness. And applied science, if it is not
able to work on the basis of an objective representation of the world, will have
difficulties in resolving concrete problems. Consequently, it seems a mistake of
social constructivism to dismiss objectivity in the constitutive elements of science
(language, structure, knowledge, method …).
According to these considerations, the relation between science and society
from a philosophical approach needs “internal” constituents as well as “external”
factors. Ethics of science is a good example of the necessity of both kinds of
philosophical analysis of the scientific activity –the internal and the external–73
which are better known in this case as “endogenous ethics” and “exogenous ethics.”
Both kinds of analyses are important and, to some extent, they are like two sides
of the same coin, because the free human activity of basic science requires ethical
values (honesty, responsibility, reliability …) and the social activity of applied
science also needs ethical values (due to its relations with persons, social milieu
and nature). Furthermore, ethics of science is also relevant in order to show the
differences between basic science and applied science, because there are some
problems which are specific to the second realm.74 These varieties of analyses
are relevant to the present discussions of bioethics (e.g., in the research on human
cloning) and of environmental ethics (e.g., in the contamination of rivers or
atmospheric pollution).
72. NIINILUOTO, I., Is Science Progressive?, Reidel, Dordrecht, 1984, p. 4.
73. On the importance of ethics in scientific research as such and in its social dimension, cf. AGAZZI, E., Il bene, il male e la scienza. Le dimensioni etiche dell’impresa scientifico-tecnologica, Rusconi, Milan, 1992; RESNIK, D. B., The Ethics of Science, Routledge, London, 1998; and SHRADER-FRECHETTE, K., Ethics of Scientific Research, Rowman and Littlefield, Savage (MD), 1994.
74. Cf. GONZALEZ, W. J., “Ciencia y valores éticos: De la posibilidad de la Etica de la Ciencia al problema de la valoración ética de la Ciencia Básica,” Arbor, v. 162, n. 638, (1999), pp. 139-171.
science, either for the discovery of new facts or for the justification of scientific
statements; b) the characteristics of applied science as an issue that requires a
specific focus, after decades of primacy of basic science for the philosophical
approach; and c) the applications of science as a topic of special interest for
philosophy insofar as science should solve practical problems in the social
realm (economic, political, ecological …).
As to the importance of the instruments, especially their role in experiments,
there have been interesting contributions over the last two decades.75 The need for
a material support –an artifact made technologically– for scientific discoveries
and for the testability of scientific statements was in no way unknown before
(at least since Galileo’s times), but there are new views about the character of
the experiments and the contribution of the artificial objects made by the social
activity of technology. In addition, these reflections emphasize the “artificial
character” of experimentation in the laboratory insofar as there is a dependence
on instruments already conceived for certain purposes. Again, we are faced with
science as social action.
Where the practical utilities do have a key role is in applied science, which
frequently includes an interaction between the scientific knowledge and the
material support given by technology. There is a clear difference with basic
science: the feature of a practical orientation of scientific knowledge. Thus,
“besides epistemic utility, the knowledge provided by applied science is expected
to have instrumental value for associated human activity. Applied science is
thus governed by what Habermas calls the ‘technical interest’ of controlling the
world.” 76 Design sciences, which belong to the sciences of the artificial,77 are a
clear example of the interest in how things ought to be in order to reach certain goals.78
Other conceptions in favor of the insistence on science as a practice
call attention to the applications of science. In this regard, “it is important to
distinguish applied science from the applications of science. The former is a
part of knowledge production, the latter is concerned with the use of scientific
knowledge and methods for solving practical problems of action (e.g., in
engineering or business), where [the scientist] may play the role of a consultant.” 79 These solutions
75. There are two books that have been very influential: HACKING, I., Representing and Intervening, Cambridge University Press, Cambridge, 1983; and GALISON, P., How Experiments End, The University of Chicago Press, Chicago, 1987.
76. NIINILUOTO, I., “The Aim and Structure of Applied Research,” p. 6. Cf. HABERMAS, J., Erkenntnis und Interesse, Suhrkamp, Frankfurt, 1968.
77. The “sciences of the artificial” can be understood in two different ways. On the one hand, it denotes the domain, different from the natural sciences and the social sciences, where design has a key role (e.g., library science, pharmacology, agricultural science …), which is usually a “scientification” of a profession; on the other hand, it denotes the scientific study of the properties of technology (i.e., the research on the physical, chemical … properties of technological artifacts), such as in the case of “engineering science.” The focus here is on the first option.
78. Cf. SIMON, H., The Sciences of the Artificial, 3rd ed., The MIT Press, Cambridge (MA), 1996.
79. “The Aim and Structure of Applied Research,” p. 9.
to practical problems are more visible to the members of society than the research
that has made the solutions possible. Thus, the applications of science in applied
sciences (ecology, economics, medicine, pharmacology, nursing …) have received
more analysis in STS than other disciplines. Those applications, insofar as they
are social actions of the scientists, can be analyzed at different levels (aims,
means, results, consequences) by the empirical sciences included in STS.
From a philosophical point of view, there is again the need to consider the
“internal” and “external” aspects. In this regard, one issue of interest is the relation
between possible practical success and the cognitive content of the scientific theory
used in applied science. To establish “practical success” is clearly more difficult in
the case of the social sciences than in the natural sciences (as can be seen frequently in the
discussions of the contributions of Nobel laureates in Economics). Niiniluoto suggests
using the case of ballistics. It is an applied science heavily linked to technology.
He maintains that “practical success does not prove the truth of a theory. … But
if Newton’s theory were completely mistaken, it would be difficult to understand
how it can achieve successful concretization. For this reason, the practical success
of a theory is an indicator of its truthlikeness.” 80 This aspect is not considered by
social constructivism, and it is one that seems convenient to keep in mind in order to
make decisions on social problems connected with science.
consequences for the citizens which are more visible than the enlargement of
human knowledge (basic science) or even the solution to practical problems
(applied science). The reason is clear: technology is oriented towards the creative
transformation of reality. Thus its design seeks to change existing reality
(natural, social, or artificial) to produce new results (a kind of human artifact:
bridge, airplane, computer, cell phone …) which can directly affect the lives of
the members of society. These changes might be in favor of social development
or –as the present book points out in several chapters– they may be against the
common good of citizens.88
Certainly the social dimension appears in the three main stages of the
technological doing. 1) It intervenes in the design, because technology not only
uses scientific knowledge (know that) and specific technological knowledge
(know how) but also takes into account social and economic values in the
design. This is clear in many technological innovations (new cell phones, faster
computers, larger airplanes …), which should consider the users of the product and the
potential economic profitability of the new artifact. 2) The technological process is
developed in enterprises –public or private– organized socially according to some
values (economic, cultural, ergonomic, aesthetic …) and with an institutional
structure (owners, administrators …). 3) The final result of technology is a
human-made product –an artifact– to be used by society and it has an economic
evaluation in the market. Hence, it can be said that technology is ontologically
social as a human doing. In addition, its product is an item for society. Moreover,
the criteria of society have a considerable influence in promoting some kind
of innovations (with their patents) or an alternative technology (a new design,
process and product).
Frequently, the social dimension of technology is viewed with concern,
especially in the case of recent phenomena related to industrial plants (e.g., in
accidents related to nuclear energy). But it is also an attitude that appears many
times in the reflection on the limits of technology, when philosophy asks about
the bounds (Grenzen) of technology. These terminal limits of technology should
take into account the internal values as well as the external values (ethical,
social, cultural, political, ecological, aesthetic, economic …). And philosophy of
technology considers the external values in the context of a democratic society
interested in the well-being of the citizens,89 thinking that their members can
88. There are also reflections on present phenomena in comparison with the past, such as in the case of the Luddites, cf. GRAHAM, G., The Internet: A Philosophical Inquiry, Routledge, London, 1999, ch. 1; and KITCHER, PH., Science, Truth, and Democracy, Oxford University Press, Oxford, 2001, ch. 13, pp. 167-180.
89. Cf. NIINILUOTO, I., “Límites de la Tecnología,” Arbor, v. 157, n. 620, (1997), pp. 391-410. In this regard, on the relation between technological rationality and human happiness, cf. RESCHER, N., Razón y valores en la Era científico-tecnológica, ch. 8, pp. 169-190.
Both sides –internal and external– are needed in order to clarify the
technological processes (in themselves as well as in their historical dynamics).94
From the internal point of view, the methodology of technology has a central role. It
is based on an imperative-hypothetical argumentation, where the aims are crucial
to justifying or rejecting the means used in the process of developing
a technological artifact. And, from an external perspective, technology requires
social values as a human undertaking: the technological processes cannot be
beyond social control, because society has the right to look for a reasonable ethics
of technology and it can seek a rational technological policy for its citizens.
Two different philosophical orientations might be considered here about the
process in technology: i) technological determinism assumes that the development
of technology is uniquely determined by internal laws; and ii) technological
voluntarism maintains that the change can be externally directed and regulated
by the free choice of the members of the society. On the one hand, technological
determinists can argue that the development of technology is de facto a complex
system process where the imperatives have a role (at least, methodologically);
but, on the other hand, technological voluntarists can point out that the citizens do
not have to obey eo ipso those imperatives. Niiniluoto suggests a middle ground
between “determinism” and “voluntarism”: the commands of technology are
always conditional, because they are based on some value premises, and thus it
is correct that we do not need to obey technological imperatives. Therefore, the
principle that “can implies ought” is not valid insofar as not all technological
possibilities should be actualized.95
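Niiniluoto’s middle ground can be sketched as the logical form of a hypothetical imperative (the formalization below is an illustrative reconstruction, not his own notation):

```latex
% Conditional structure of a technological imperative:
% a value premise V(G) (goal G is accepted as valuable) and
% a means premise N(M, G) (means M is necessary for attaining G)
% are jointly required to detach the obligation O(M).
\[
  V(G),\; N(M, G) \;\vdash\; O(M)
\]
```

Rejecting the value premise V(G) blocks the detachment of O(M), so the mere technological feasibility of M (“can”) never by itself yields “ought.”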
“Sustainable development” is an important notion in this regard, because
it is related to multiple technological processes. Furthermore, it connects with
the analysis of what kind of technological possibilities should be actualized.
Sustainable development combines internal terms –as an epistemic concept–
and external ones, due to the social consequences of linking human beings with
technology and their being interwoven with the natural environment. It is a
notion that includes empirical contents (some of them related to applied sciences)
and value premises (social, cultural, political, economic …). But “sustainable
development” raises the relevant question of the development of technological
processes which can cause damage to nature.
94. The historical dynamics of technology requires considering both evolutionary changes (the improvements in off-shore platforms, aircraft, automobiles …) and “technological revolutions” (such as computers). An analysis of the latter is in SIMON, H. A., “The Steam Engine and the Computer: What Makes Technology Revolutionary,” EDUCOM Bulletin, v. 22, n. 1, (1987), pp. 2-5. Reprinted in SIMON, H. A., Models of Bounded Rationality. Vol. 3: Empirically Grounded Economic Reason, The MIT Press, Cambridge (MA), 1997, pp. 163-172.
95. Cf. NIINILUOTO, I., “Should Technological Imperatives Be Obeyed?,” International Studies in the Philosophy of Science, v. 4, (1990), pp. 181-187.
96. According to Niiniluoto, that is the first definition of “sustainable development,” and it was used in the report “World Conservation Strategy,” published by the International Union for the Conservation of Nature and Natural Resources, Gland (Switzerland), 1980.
97. Cf. NIINILUOTO, I., “Nature, Man, and Technology - Remarks on Sustainable Development,” in HEININEN, L. (ed.), The Changing Circumpolar North: Opportunities for Academic Development, Arctic Centre Publications 6, Rovaniemi, 1994, p. 76.
98. Cf. VON WRIGHT, G. H., Norm and Action, Routledge and K. Paul, London, 1963.
99. NIINILUOTO, I., “Nature, Man, and Technology - Remarks on Sustainable Development,” pp. 80-81.
on the ends.100 Among the values to be considered are the social values, and those
ingredients are a guarantee of better protection of the environment.
103. Cf. SIMON, H. A., “Economics as a Historical Science,” Theoria, v. 13, n. 32, (1998), pp. 241-260.
104. Cf. NIINILUOTO, I., “Límites de la Tecnología,” p. 392.
regulatory science,” José Luis Luján (University of the Balearic Islands), and
“How to Reform Science and Technology,” Kristin Shrader-Frechette (University
of Notre Dame). 3. The Relation between Science and Society: “Progress and
Social Impact in Design Sciences,” Anna Estany Profitos (Autonomous University
of Barcelona), and “Experiments, Instruments and Society: Radioisotopes in
Biomedical Research,” María Jesús Santesmases (Higher Council of Scientific
Research). 4. The Nexus between Technology and Society: “Philosophical
Patterns of Rationality and Technological Change,” Ramón Queraltó Moreno
(University of Seville).
Originally, these papers were delivered at the Jornadas sobre Ciencia, Tec-
nología y Sociedad: La perspectiva filosófica (Conference on Science, Technology
and Society: A Philosophical Perspective), organized by the University of A Coruña
with the support of the Society of Logic, Methodology and Philosophy of Science
in Spain. The meeting was held at the Campus of Ferrol on 11 and 12 March
2004. The discussions were oriented towards the main goal: the philosophical
stance on this interdisciplinary endeavor. As in the case of the previous Jornadas
sobre Filosofía y Metodología actual de la Ciencia (Conferences on Current
Philosophy and Methodology of Science), the ninth edition of these meetings
centers its interest on present-day reflections.
Basically, every paper was focused in this direction, either in a clearly reflective
way or with a more activist orientation. The conference had a central figure: Kristin
Shrader-Frechette, who tends towards the second disposition. She studied physics
at Xavier University (1966) and was later awarded a B.A. in mathematics by
Edgecliff College (1967). Thereafter, she prepared her dissertation in philosophy
of science at the University of Notre Dame (1972). This was followed by
postdoctoral work in several realms: community ecology (two years), economics
(one year) and hydrogeology (two years). She has held senior professorships at the
University of California and the University of Florida. Currently she is O’Neill
Family Professor of Philosophy and Concurrent Professor of Biological Sciences
at the University of Notre Dame.
Kristin Shrader-Frechette has held fellowships in philosophy of science
from important entities: the Woodrow Wilson Foundation, the National Science
Foundation, the Carnegie Foundation … She was president of the committee for
science and ethics of the International Council of Scientific Unions (1990-96).
Among her present activities is that of Editor-in-Chief of the Oxford University
Press monograph series “Environmental Ethics and Science Policy,” since
1988. In addition, she serves on the editorial board of Business Ethics Quarterly,
Encyclopedia of the Philosophy of Science, Humanities and Technology,
Philosophy and Technology, Public Affairs Quarterly, Synthese … She is article
referee for Economics and Philosophy, Philosophy of Science, Science…107 Before
107. Detailed information about her academic career and publications is available on Kristin Shrader-Frechette’s website: <www.nd.edu/~kshrader>.
conference and this volume. I also thank the authors of the contributed papers
and the participants of the conference for their roles during those days. Last but
not least, I am grateful to José Fco. Martínez Solano for his assistance in editing
this volume.
7. BIBLIOGRAPHY
Within the vast literature on Science, Technology and Society or Science and
Technology Studies, the present bibliographical selection seeks to offer those titles
which might be useful as a road map of the field. Thus, it connects with the topics
of this chapter and completes the information given in the previous pages, but it
is not conceived as an exhaustive list of publications on STS, because the diversity
of studies on science and technology presumably requires a monographic volume
to display the bibliography. In this regard, and in tune with the characteristics of
this book, which is to contribute to the philosophical approach to STS, there is a
clear compatibility with other volumes on the different subject-matters of Science
and Technology Studies (practical ethics, policy analysis, legal studies, sociology
of science and sociology of technology, economics of science and economics
of technological change …). Furthermore, each one of the papers in this book
includes specific references to the topics with which they deal.
ACKERMANN, R., Data, Instruments and Theory, Princeton University Press,
Princeton, 1985.
ACHTERHUIS, H. (ed.), American Philosophy of Technology: The Empirical Turn,
translated by Robert Crease, Indiana University Press, Bloomington, 2001.
AGASSI, J., “Between Science and Technology,” Philosophy of Science, v. 47,
(1980), pp. 82-99.
AGASSI, J., “How Technology Aids and Impedes the Growth of Science,”
Proceedings of the Philosophy of Science Association, v. 2, (1982), pp. 585-597.
AGASSI, J., Technology, Reidel, Dordrecht, 1985.
AGAZZI, E., Il bene, il male e la scienza. Le dimensioni etiche dell’impresa
scientifico-tecnologica, Rusconi, Milan, 1992. Spanish translation by Ramón
Queraltó: El bien, el mal y la Ciencia. Las dimensiones éticas de la empresa científico-
tecnológica, Tecnos, Madrid, 1996.
BARNES, B., Scientific Knowledge and Sociological Theory, Routledge and K.
Paul, London, 1974.
BARNES, B., Interests and the Growth of Knowledge, Routledge and K. Paul,
London, 1977.
BARNES, B., T. S. Kuhn and Social Science, Macmillan, London, 1982 (Columbia
University Press, N. York, 1982).
BARNES, B., The Elements of Social Theory, Princeton University Press, Princeton,
1995.
DYSON, A. and HARRIS, J., Ethics and Biotechnology, Routledge, London, 1993.
ECHEVERRIA, J., Ciencia y valores, Destino, Barcelona, 2002.
ECHEVERRIA, J., La revolución tecnológica, FCE, Madrid, 2003.
ELLIOTT, B., Technology, Innovation and Change, University of Edinburgh,
Edinburgh, 1986.
ELLUL, J., La technique ou l’enjeu du siècle, A. Colin, Paris, 1954 (2nd revised ed.,
Economica, Paris, 1990). Translated from the French by John Wilkinson with
an introd. by Robert K. Merton: ELLUL, J., The Technological Society, Alfred A.
Knopf, N. York, 1964.
ELSTER, J., Explaining Technical Change: a Case Study in the Philosophy of
Science, Cambridge University Press, Cambridge, 1983.
FEENBERG, A. (ed.), Technology and the Politics of Knowledge, Indiana University
Press, Bloomington, 1995.
FEIBLEMAN, J. K., Technology and Reality, M. Nijhoff, The Hague, 1982.
FELLOWS, R. (ed.), Philosophy and Technology, Cambridge University Press,
Cambridge, 1995.
FLORIDI, L. (ed.), Philosophy of Computing and Information, Blackwell, Oxford,
2004.
FREEMAN, C. and SOETE, L., The Economics of Industrial Innovation, 3rd ed., The
MIT Press, Cambridge (MA), 1997.
FULLER, S., Philosophy, Rhetoric, and the End of Knowledge: The Coming of
Science and Technology Studies, University of Wisconsin Press, Madison, 1993.
GALISON, P., How Experiments End, The University of Chicago Press, Chicago,
1987.
GALISON, P., Image and Logic. A Material Culture of Microphysics, The University
of Chicago Press, Chicago, 1997.
GEHLEN, A., “Anthropologische Ansicht der Technik,” in FREYER, H., PAPALEKAS,
J. CH. and WEIPPERT, G. (eds.), Technik im technischen Zeitalter, J. Schilling,
Düsseldorf, 1965. Translated in abridged version as GEHLEN, A., “A Philosophical-
Anthropological Perspective on Technology,” in SCHARFF, R. C. and DUSEK, V. (eds.),
Philosophy and Technology: The Technological Condition, Blackwell, Oxford, 2003,
pp. 213-220.
GOLDMAN, S. L. (ed.), Science, Technology, and Social Progress, Lehigh
University Press, Bethlehem (PA), 1989 (coedited by Associated University Presses,
London, 1989).
GONZALEZ, W. J., “Ámbito y características de la Filosofía y Metodología de
la Ciencia,” in GONZALEZ, W. J. (ed.), Aspectos metodológicos de la investigación
científica, 2nd ed., Ediciones Universidad Autónoma de Madrid and Publicaciones
Universidad de Murcia, Madrid-Murcia, 1990, pp. 49-78.
JASPERS, K., Die Atombombe und die Zukunft des Menschen, Piper, Munich,
1958.
JENNINGS, R. E., “Truth, Rationality and the Sociology of Science,” The British
Journal for the Philosophy of Science, v. 35, (1984), pp. 201-211.
JONAS, H., Das Prinzip Verantwortung. Versuch einer Ethik für die technologische
Zivilisation, Insel, Frankfurt am Main, 1979. Translated as JONAS, H., The Imperative
of Responsibility: In Search of an Ethics for the Technological Age, The University of
Chicago Press, Chicago, 1984.
KITCHER, PH., The Advancement of Science: Science without Legend, Objectivity
without Illusions, Oxford University Press, N. York, 1993.
KITCHER, PH., Science, Truth, and Democracy, Oxford University Press, Oxford,
2001.
KNORR-CETINA, K., The Manufacture of Knowledge. An Essay on the Constructivist
and Contextual Nature of Science, Pergamon Press, Oxford, 1981.
KOERTGE, N. (ed.), A House Built on Sand: Exposing Postmodern Myths about
Science, Oxford University Press, N. York, 1998.
KOERTGE, N., “’New Age’ Philosophies of Science: Constructivism, Feminism
and Postmodernism,” The British Journal for the Philosophy of Science, v. 51, (2000),
pp. 667-683.
KUHN, TH. S., The Structure of Scientific Revolutions, The University of Chicago
Press, Chicago, 1962 (2nd ed., 1970).
KUHN, TH. S., The Road Since Structure. Philosophical Essays, 1970-1993, with
an Autobiographical Interview, edited by James Conant and John Haugeland, The
University of Chicago Press, Chicago, 2000.
KUKLA, A., Social Constructivism and the Philosophy of Science, Routledge,
London, 2000.
LADRIERE, J., Les enjeux de la rationalité: le défi de la science et de la technologie
aux cultures, Aubier/Unesco, Paris, 1977. Translated as LADRIERE, J., The Challenge
presented to Culture by Science and Technology, Unesco, Paris, 1978.
LATOUR, B. and WOOLGAR, S., Laboratory Life: The Social Construction of
Scientific Facts, Princeton University Press, Princeton (NJ), 1979 (2nd ed., 1986).
LATOUR, B., Science in Action: How to Follow Scientists and Engineers Through
Society, Harvard University Press, Cambridge (MA), 1987.
LATOUR, B., The Pasteurisation of France, Harvard University Press, Cambridge
(MA), 1988.
LATOUR, B., We Have Never Been Modern, Harvester, Brighton, 1993 (translated
by C. Porter).
LAUDAN, L., DONOVAN, A., LAUDAN, R., BARKER, P., BROWN, H., LEPLIN, J.,
THAGARD, P. and WYKSTRA, S., “Scientific Change: Philosophical Models and
Historical Research,” Synthese, v. 69/2, (1986), pp. 141-223.
LAUDAN, L., Science and Relativism: Some Key Controversies in the Philosophy
of Science, The University of Chicago Press, Chicago, 1990.
LAUDAN, R. (ed.), The Nature of Technological Knowledge: Are Models of
Scientific Change Relevant?, Reidel, Dordrecht, 1984.
LELAS, S., “Science as Technology,” The British Journal for the Philosophy of
Science, v. 44, (1993), pp. 423-442.
LONGINO, H. E., Science as Social Knowledge: Values and Objectivity in Scientific
Inquiry, Princeton University Press, Princeton, 1990.
LUJAN, J. L., “Principio de precaución: Conocimiento científico y dinámica
social,” in ROMEO CASABONA, C. M. (ed.), Principio de precaución, Biotecnología y
Derecho, Comares/Fundación BBVA, Granada, 2004, pp. 221-234.
LUJAN, J. L. and ECHEVERRIA, J. (eds.), Gobernar los riesgos. Ciencia y valores en
la Sociedad del riesgo, Biblioteca Nueva/OEI, Madrid, 2004.
LYNCH, M. and WOOLGAR, S. (eds.), Representation in Scientific Practice, The
MIT Press, Cambridge, 1990.
MACHAMER, P. K., “Las revoluciones de Kuhn y la Historia ‘real’ de la Ciencia: El
caso de la revolución galileana,” in GONZALEZ, W. J. (ed), Análisis de Thomas Kuhn:
Las revoluciones científicas, Trotta, Madrid, 2004, pp. 253-273.
MACPHERSON, C. B., “Democratic Theory: Ontology and Technology,” in
MITCHAM, C. and MACKEY, R. (eds.), Philosophy and Technology, The Free Press, N.
York, 1983, pp. 161-170.
MANNHEIM, K., Essays on the Sociology of Knowledge, Routledge and K. Paul,
London, 1952.
MARTINEZ SELVA, J. M., “Psicología del descubrimiento científico,” in GONZALEZ,
W. J. (ed.), Aspectos metodológicos de la investigación científica, 2nd ed., Ediciones
Universidad Autónoma de Madrid and Publicaciones Universidad de Murcia, Madrid-
Murcia, 1990, pp. 305-315.
MCMULLIN, E., The Social Dimensions of Science, Notre Dame University Press,
N. Dame (IN), 1992.
MICHALOS, A., “Technology Assessment, Facts and Values,” in DURBIN, P. and
RAPP, F. (eds.), Philosophy and Technology, D. Reidel, Dordrecht, 1983, pp. 59-81.
MITCHAM, C., “Philosophy of Technology,” in DURBIN, P. (ed.), A Guide to the
Culture of Science, Technology and Medicine, The Free Press, N. York, 1980, pp.
282-363.
MITCHAM, C. and MACKEY, R. (eds.), Philosophy and Technology: Readings in the
Philosophical Problems of Technology, Free Press, N. York, 1983 (1st ed., 1972).
MITCHAM, C. and HUNING, A., Philosophy and Technology II. Information
Technology and Computers in Theory and Practice, Reidel, Dordrecht, 1985.
Kristin Shrader-Frechette
Royal Dutch Shell discovered oil in the Niger River delta in 1958, and soon
after, it became the largest oil producer in Nigeria. The company received heavy
criticism because it provided oil revenues to the Nigerian military government
but not to the Ogoni tribe whose land and people have been destroyed by its
oil drilling.1 Even worse, Nigerian military officers said Shell put pressure on
the Nigerian government to clamp down on Ogoni people who protested Shell’s
lax environmental behavior. Thousands of Ogoni have been killed for doing
nothing more than engaging in nonviolent protests against the destruction of
their farmland and streams by uncontrolled oil spills, uncontrolled oil leaks, and
uncontrolled natural gas flaring.
1. KEN SARO-WIWA
Nigerian writer Ken Saro-Wiwa –formerly a grocer, teacher, writer, and
television producer– criticized “the collusion of commercial [Shell] and military
[Abacha regime] force” responsible for destroying the Nigerian environment
and dehumanizing the Ogoni people. Although he had enough money to settle
comfortably and continue as a television producer and writer, Saro-Wiwa chose
instead to be an advocate and activist. He founded the non-violent human-rights
and environmental group, MOSOP (Movement for the Survival of the Ogoni
People); organized peaceful Ogoni protests; condemned Shell’s genocide; and
argued for cleanup. For his efforts, Saro-Wiwa won numerous international civic
and environmental awards. His son, the writer Ken Wiwa,
is continuing his father’s human-rights efforts. But in spite of widespread protests
from the international community, in November 1995 the Nigerian military
government, dependent on Shell money, held a “kangaroo court,” dominated by
Shell lawyers, then hanged Saro-Wiwa and other nonviolent MOSOP advocates
and activists.2
* Paper presented on March 12, 2004 at the Conference on Science, Technology and Society: The Philosophical Perspective (Jornadas sobre Ciencia, Tecnología y Sociedad: La perspectiva filosófica), organized by the University of A Coruña and the Society of Logic, Methodology and Philosophy of Science in Spain.
1. Cf. BIELSKI, V., “Shell’s Game,” Sierra, v. 81, n. 2, (1996), pp. 30-36; WHEELER, D., “Blood on British Business Hands,” New Statesman and Society, v. 8, n. 379, (1995), p. 14. See also MCLUCKIE, C. W., Ken Saro-Wiwa, Writer and Political Activist, Lynne Rienner Publishers, Boulder, 1999; “Cruelty Under the Microscope,” Economist, v. 357, n. 8197, (2000), p. 58; PEGG, S., “Ken Saro-Wiwa,” Third World Quarterly, v. 21, n. 4, (2000), pp. 701-708; and DANIELS, A., “The Perils of Activism: Ken Saro-Wiwa,” New Criterion, v. 18, n. 5, (2000), pp. 4-9.
2 Cf. WHEELER, D., “Blood on British Business Hands,” pp. 14-15; see BOYD, W., “Death of a Writer,”
The New Yorker, v. 71, no. 38, (1995), pp. 51-55. See “The Hidden Lives of Oil,” Chronicle of Higher
Education, v. 48, no. 30, (2002), pp. B7-B10.
52 Science, Technology and Society: A Philosophical Perspective
Brian Anderson, head of Shell Nigeria, told Saro-Wiwa’s brother that he could
save his brother’s life, provided Saro-Wiwa and MOSOP stopped nonviolent
protests against Shell. Saro-Wiwa and MOSOP refused. As a result, the military
government hanged the nine nonviolent environmental activists.3
Shortly after the hangings, Shell had to hire seven US public relations firms to
handle global protests of Shell’s and Nigeria’s behavior. Members of Britain’s
Royal Geographical Society voted to expel Shell as one of its sponsors because of
its Nigerian operations. And the 52-member British Commonwealth suspended
Nigeria and said that, in order to avoid expulsion, Nigeria would have to show
that it adhered to the human-rights principles of the group. Britain, the US, South
Africa, Germany, and Austria recalled their ambassadors to Nigeria in response
to the hangings. So did the 15 member nations of the European Union. The EU
suspended its development aid to Nigeria, and the World Bank rejected a $100
million loan to Nigeria. Also in response to Saro-Wiwa’s death, a huge coalition
of government, labor, human-rights, and NGO groups boycotted the Nigerian
military dictatorship. Shell Oil is still fighting in court to avoid paying damages
for the human and environmental problems it has caused in Nigeria.4 Ken Saro-
Wiwa, however, has won. He has brought international attention to the unjust
conditions Shell Oil imposed on his people.
Saro-Wiwa’s actions were ethically uncontroversial insofar as they were
nonviolent and insofar as virtually all western democracies in the world agreed
with his tactics and his stance. From the point of view of universalizability,
however, his actions are controversial because not everyone can be expected to
follow them. Is there a less controversial case of a person attempting to reform the
practice and use of science? A case that is more universalizable?
2. RALPH NADER
Called the “modern-day champion of the little person,” a “male Jeanne d’Arc,”
and “the people’s lawyer”,5 Ralph Nader has spent his life working to reform
science and technology, to change the chronic violence of manufacturers against
people and the environment. For three decades, he has fired the opening guns
3 Cf. BIELSKI, V., “Shell’s Game,” pp. 30-36. See AINGER, K., “Interview with Owens Wiwa,” New
Internationalist, no. 351, (2002), pp. 33-34.
4 Cf. MITCHELL, J. G., “Memorial to a Warrior for the Environment,” National Geographic, v. 189,
no. 4 (1996), p. xxiv; MAYALL, J., “‘Judicial Murder’ Puts Democratic Values on Trial,” The World
Today, v. 51, no. 12, (1995), pp. 236-239; KUPFER, D., “Worldwide Shell Boycott,” The Progressive,
v. 60, no. 1 (1996), p. 13; ADAMS, P., “A State’s Well-oiled Injustice,” World Press Review, v. 43, no.
1 (1996), pp. 14-15; PYPKE, D., “Partners in Crime,” World Press Review, v. 43, no. 1, (1996), p. 16;
HARINGTON, H., “A Continent’s New Pariah,” The Banker, v. 145, no. 838 (1995), pp. 63-64; BOYD,
W., “Death of a Writer,” pp. 51-55; KNOTT, D., “Shell the Target After Nigerian Executions,” Oil and
Gas Journal, v. 93, no. 47, (1995), p. 37; and ANDERSON, A., “A Day in the Death of Ideals,” New
Scientist, v. 148, no. 2005 (1995), p. 3. See LARSON, V., “Court Case Against Shell Can Proceed,”
World Watch, v. 15, no. 4 (2002), pp. 7-8.
5 Cf. GOREY, H., Nader and the Power of Everyman, Grosset and Dunlap, New York, 1975, pp. 147 and
176. See GOLDSMITH, Z., “Mr. Nader Goes to Washington,” Ecologist, v. 31, no. 1, (2001), pp. 26-27.
Objectivity and professional duties regarding science and technology 53
6 Cf. MCCARRY, C., Citizen Nader, Saturday Review Press, New York, 1972, pp. 29, 115, and 138;
SCARLOTT, J., “Ralph Nader,” in DELEON, D. (ed.), Leaders from the 1960s, Greenwood Press,
Westport (CT), 1994, p. 330; STEWART, T., “The Resurrection of Ralph Nader,” Fortune, May 22,
(1989), p. 106.
7 MCCARRY, C., Citizen Nader, p. 129; see also Citizen Nader, pp. 13ff., and 319.
8 Cf. MCCARRY, C., Citizen Nader, pp. 28, 44, and 196; SCARLOTT, J., “Ralph Nader,” in DELEON, D.
(ed.), Leaders, pp. 330-331; STEWART, T., “The Resurrection of Ralph Nader,” p. 106.
Although Nader has been called “the single most effective antagonist of
American business”,9 he actually is a proponent of free enterprise. Nader argues
that corporate abuses are possible only when market competition is not informed
and open. US chemical manufacturers, for instance, are able to sacrifice consumer
safety for higher profits only when the people have neither information about toxins
nor alternatives to their use. Whenever consumers enjoy both full information and
open competition, Nader says government regulation is unnecessary. Regulation,
he claims, can promote monopolies that ultimately threaten consumer interests.
Government regulation has given US utilities monopolistic control that has
enabled them to avoid clean-energy technologies and to promote dirty ones, like
nuclear power. As the antidote for such dominance of special interests, Nader
promotes widespread citizen action and informed, open competition.10
In taking uncompromising public-interest positions, Nader admits he is not
neutral. Washington, he says, doesn’t give one “the luxury of dealing in shades of
gray.” He claims some issues are black and white because the stakes are high.11
Nader’s black-and-white approach raises an important question. Do informed,
critical, attempts at reforming science and technology compromise objectivity?
Nader says they do not. He claims that all citizens, and especially professionals,
should nurture “conscience and competence... an obligation to advance or protect
the general interest.” 12 Do citizens, especially professionals, have a duty to be
public-interest advocates, as Nader suggests? Or should they remain neutral, in
the name of objectivity?
the common good, injustices might remain uncriticized. Plato spoke out against
the civic ills of his time. He realized that participation in current ethical and
political arguments promoted both personal growth and better public policies.15
And John Locke criticized the alleged divine right of kings. When he argued
instead for democracy based on the consent of the people he was hunted down for
treason. In a letter to Norman Malcolm, Ludwig Wittgenstein made professional
duties clear: “What is the use of studying philosophy if all that it does for you is
to enable you to talk with some plausibility about some obtuse questions of logic,
etc. and if it does not improve your thinking about the important questions of
everyday life?” 16
One of the reasons citizens, professionals, and scholars often fail to help
reform science and technology, and therefore to act as advocates for the public
interest, is that they accept an erroneous model of objectivity. According to this
positivistic model, people are “objective” when they take no stances and remain
completely neutral, as Moore and Hare advised. A corollary of this position is that
whenever people’s words or actions are not completely neutral, they are biased in
a reprehensible way. According to this position, any sort of advocacy or activism,
even in the name of the public interest, is evidence of prejudice. On the contrary,
according to the model of objectivity this essay defends, all people, especially
professionals, sometimes have duties to be advocates for the common good. They
should not always remain merely passive observers of society, in part because
genuine objectivity often requires advocacy or criticism.17 Besides, if Quine,
Kuhn, Kitcher, and hosts of others are correct, no claim can be neutral in the sense
of being wholly free of evaluative inferences. And if not, then although some
claims are more objective (less biased) than others, none are completely value-
free. Some people erroneously believe there are neutral or value-free positions,
perhaps because they fail to distinguish among different types of values, only
some of which reflect bias. Because some kinds of value judgments underlie all
claims, even in science, people have duties to avoid only the value judgments that
are both biased and avoidable. But which are biased and avoidable?
On Longino’s classification, there are three types of value judgments: bias,
contextual, and constitutive. They are neither mutually exclusive nor
exhaustive. Bias values occur whenever people deliberately misinterpret or omit
something so as to serve their own purposes. Obviously people always can and
ought to avoid all bias values. Contextual values are more difficult to escape. They
include personal, social, cultural, or philosophical emphases. Scientists employ
contextual values if financial constraints force them to use particular methods
15 Cf. MIDGLEY, M., Wisdom, Information, and Wonder, pp. 93, 99 and 106. See also FARRELLY, C.,
“Public Reason, Neutrality and Civic Virtues,” Ratio Juris, v. 12, no. 1, (1999), pp. 11-25.
16 MIDGLEY, M., Wisdom, Information, and Wonder, p. 239.
17 Some of the analysis in this chapter relies on SHRADER-FRECHETTE, K., Risk and Rationality,
University of California Press, Berkeley, 1991, ch. 4, pp. 53-65. See also SHER, G., Beyond Neutrality:
Perfectionism and Politics, Cambridge University Press, New York, 1997.
or data rather than others. Contextual values might lead scientists to accept old
data rather than to generate new information. Although in principle it sometimes
is possible to avoid contextual values, in practice it is difficult to do so, because
context influences everyone. Korenbrot, for instance, showed that the contextual
value of limiting population growth has influenced many medical researchers who
have overemphasized the benefits of oral contraceptives and underestimated the
risks. Contextual values, such as the profit motive, heavily influence science, in
part, because any research or belief is hampered by incomplete information. Facing
an unavoidable data gap, people must use contextual value judgments to bridge the
gap,18 or avoid all judgments based on incomplete information or induction.
Constitutive or methodological values are even more difficult to avoid because
they are necessary in choosing one method or rule of inference rather than
another. Scientists collecting data must make value judgments about what data
to gather, what to ignore, how to interpret observations, how to avoid erroneous
interpretations. These constitutive value judgments are essential, even to pure
science, because human perception does not provide people with pure facts.
Instead, beliefs and values (that people already hold) play a key part in providing
categories for interpreting observations. High-energy physicists, for example, do
not count all the marks on their cloud-chamber photographs as observations of
pions. They count only those streaks that their beliefs indicate are pions. Just
as social, political, and economic contexts frequently frame beliefs, so also do
scientific and logical methods. Methodological values unavoidably structure
all knowing because there is no complete separation between facts and values,
and all facts are laden (at least) with some methodological values.19 If facts and
values were separable, it would be impossible to develop theories or to explain
causal connections among phenomena. Because methodological values influence
what people see and how they see it, such values are not avoidable and, at best,
people can make only better, rather than worse, methodological value judgments.
Although people can avoid all bias values,20 deliberate misinterpretations and
omissions,21 they cannot avoid methodological values.
18 Cf. LONGINO, H., Science as Social Knowledge, Princeton University Press, Princeton, 1990.
See SCOTT, P., “Captives of Controversy: The Myth of the Neutral Social Researcher,” Science,
Technology, and Human Values, v. 15, no. 1, (1990), pp. 474-494, and BORNSTEIN, R., “Objectivity
and Subjectivity in Psychological Science,” Journal of Mind and Behavior, v. 20, no. 1, (1999), pp.
1-16.
19 For discussion of relevant examples from the history of science, see BROWN, H. I., Perception,
Theory and Commitment, The University of Chicago Press, Chicago, 1977, pp. 97-100 and 147,
and SHRADER-FRECHETTE, K., “Recent Changes in the Concept of Matter: How Does ‘Elementary
Particle’ Mean?,” in ASQUITH, P. D., and GIERE, R. N. (eds.), Philosophy of Science Association 1980,
v. 1, Philosophy of Science Association, East Lansing, 1980, pp. 302-312.
20 Cf. MIDGLEY, M., Wisdom, Information, and Wonder, pp. 80-81.
21 See BEVIR, M., “Objectivity in History,” History and Theory, v. 33, no. 3, (1994), pp. 328-344. See
LONGINO, H., Science as Social Knowledge: Values and Objectivity in Scientific Inquiry, pp. 109-
121. See also SHRADER-FRECHETTE, K., Science Policy, Ethics, and Economic Methodology, Reidel,
Dordrecht, 1984, p. 73, and SHRADER-FRECHETTE, K., Risk and Rationality, pp. 40-44. See ELLIS,
4. OBJECTIVITY
If avoiding methodological values is impossible, then public-interest
advocates have no duty to do so. Not all methodological value judgments are
created equal. Some are more objective and defensible than others. Although all
values are partially subjective in the sense that none can be empirically confirmed,
not all are subjective in a reprehensible way, because not all values are biased or
arbitrary. Conceptual and logical values, like explanatory or predictive power,
may help guarantee objectivity. Just as there are good reasons, short of empirical
confirmation, for accepting one belief over another, so also there are good reasons
for accepting one value judgment over another.22
If not all value judgments are subjective in a reprehensible way, and if some
values are better than others, then advocating better values is defensible on
epistemological and ethical grounds.23 Yet many people wrongly assume that all
advocacy entails bias. If it did, then any criticisms (whether of unjust ethical
positions or faulty science) would be biased. But if any criticisms were biased, then
one would have to avoid criticism of things such as heinous crimes or irrational
inferences. Obviously it makes no sense to avoid such criticisms. And if not,
criticism or advocacy need not involve bias.24 In fact, criticism or advocacy may
be the only way to avoid bias. If it is impossible to avoid some value judgments,
even in science, then people who do not criticize poor judgments merely endorse
whatever values are dominant. Such people –who believe that objectivity requires
complete neutrality– also err through inconsistency; they implicitly sanction
status-quo values if they are silent about questionable value judgments, yet they
explicitly condemn work that sanctions value judgments.
Remaining neutral, in the face of flawed beliefs, also jeopardizes objectivity
as well as consistency. Suppose a special-interest group uses largely political
reasons for accepting a particular belief. A nuclear utility might employ unrealistic
assumptions about future energy demand in order to argue for building breeder
reactors. Not to criticize such unrealistic assumptions or value judgments is wrong,
because not all assumptions about future electricity use are equally correct. It is
more reasonable to assume higher energy costs will reduce, rather than increase,
demand. And if so, the most objective thing to do, in the presence of questionable
public-policy assumptions, is to be critical of them, not to remain neutral.
22 Cf. SHRADER-FRECHETTE, K., Science Policy, Ethics, and Economic Methodology, pp. 73-74;
SHRADER-FRECHETTE, K., “Scientific Method and the Objectivity of Epistemic Value Judgments,” in
FENSTAD, J., HILPINEN, R. and FROLOV, I. (eds.), Logic, Methodology, and the Philosophy of Science,
Elsevier, New York, 1989, pp. 373-389. See also GRUENDER, D., “Values and Philosophy of Science,”
Protosociology, v. 12, (1998), pp. 319-332.
23 Cf. SHRADER-FRECHETTE, K., Science Policy, Ethics, and Economic Methodology, p. 183. See
AUDI, R., The Structure of Justification, Cambridge University Press, New York, 1993.
24 Cf. MARGOLIS, J., “On the Ethical Defense of Violence and Destruction,” in HELD, V., NIELSEN, K.,
and PARSONS, C. (eds.), Philosophy and Political Action, Oxford University Press, New York, 1972,
pp. 52-71. See also ENNIS, R. H., “Is Critical Thinking Culturally Biased?,” Teaching Philosophy, v.
21, no. 1, (1998), pp. 15-33.
On one hand, many ethical relativists deny there is any objectivity, and they
overemphasize the value judgments in knowledge. They reduce all knowledge
claims merely to social constructs. On the other hand, many naive positivists
–and other proponents of a sharp distinction between facts and values–
underemphasize values. They reduce all knowledge claims to factual or logical
truths. They ignore the evaluative aspects of knowing. A more plausible account
of objectivity falls midway between the views of the cultural relativists and those
of the naive positivists. According to the middle view defended here,25 objectivity
is not tied to freedom from all values but to freedom from bias values. It is tied to
fair and even-handed representation of the situation.
If these arguments are correct, then objectivity, as freedom from bias, often
requires advocacy for correct or less biased positions. Because people can be
blamed for their failure to be objective, or unbiased, it must be possible for people
to be more or less objective. But how might people recognize objectivity in a
given situation? In the most minimal sense, beliefs or positions are objective and
avoid bias if they survive the criticisms of those knowledgeable about them and
potentially affected by them.26
It seems reasonable to define “objectivity” in terms of surviving such
criticism because criticism need not be subjective. For example, when people
make methodological value judgments about which of two environmental risk
probabilities is more accurate, they are not talking merely autobiographically or
subjectively. They are making claims about characteristics of external events that
other people are capable of knowing. Moreover, the skills associated with making
these judgments are a function of experience, education, and intelligence. But if so,
at least three reasons suggest objectivity does not require having either an algorithm
or empirical data guaranteeing the correctness of the resulting judgments. First,
empirical factors (such as actual accident frequencies) could anchor objectivity and
change the correctness of judgments about risk. Second, ethical factors could anchor
objectivity, because people have duties to follow their contracts and to treat others
consistently. Third, explanatory power could anchor objectivity, because reasonable
people usually accept beliefs as objective if they are able to explain problems and
25 Cf. SHRADER-FRECHETTE, K., Risk and Rationality, ch. 4, pp. 53-65.
26 See HEMPEL, C. G., “Scientific Rationality: Analytic vs. Pragmatic Perspectives,” in GERAETS,
T. S. (ed.), Rationality To-Day, University of Ottawa Press, Ottawa, 1979, p. 56; HEMPEL, C. G.,
“Valuation and Objectivity in Science,” in COHEN, R. and LAUDAN, L. (eds.), Physics, Philosophy,
and Psychoanalysis, Reidel, Dordrecht, 1983, p. 91; MCMULLIN, E., “Values in Science,” in ASQUITH,
P. (ed.), Philosophy of Science Association 1982, v. 2, Philosophy of Science Association, East
Lansing, 1983; SELLARS, W., Philosophical Perspectives, C. Thomas, Springfield (IL), 1967, p. 410;
and SMITH, M., “Objectivity and Moral Realism,” in HALDANE, J. and WRIGHT, C. (eds.), Reality,
Representation, and Projection, Oxford University Press, New York, 1993, pp. 235-256.
Moreover, according to the standard version of “discourse ethics,” the objectivity of moral norms
resides in their intersubjective acceptability under idealized conditions of discourse. See APEL, K.-O.,
Towards a Transformation of Philosophy, trans. G. Adey and D. Frisby, Routledge, London,
1980; and HABERMAS, J., “What is Universal Pragmatics?,” in HABERMAS, J., Communication and
the Evolution of Society, trans. T. McCarthy, Beacon, Boston, 1979, pp. 1-68, see note 24.
appropriate at different levels of generality, such that the most general rules are
the most certain and the most universal (such as “postulate risk probabilities
consistent with observed accident frequencies,” or “do good and avoid evil”). The
least general rules or value judgments are the least certain and the least universal
(such as “person x errs in killing her attacker under circumstances y”).34 In
order to apply the rules from the most universal, most general level, people must
make a number of value judgments at lower levels. The fact that there is neither
an algorithm nor empirical data (for these judgments) does not mean they are
purely relative. Some are better than others. Some are better means to the end of
explanatory power or predictive control.
The cultural relativists and positivists who oppose all public-interest advocacy
miss both these points because they appear to presuppose that value judgments
ought to be infallible, rather than prima facie true. Understanding objectivity in
terms of prima facie truth requires (in part) understanding it in terms of some
insights of Karl Popper, John Wisdom, and Ludwig Wittgenstein.35 They anchor
objectivity with actions, as well as with explanatory and predictive power. They
do not define objectivity in terms of an impossible notion of justification. They
secure objectivity, in part, by means of the criticisms made by the relevant
knowledge communities. According to this scheme, a value judgment about the
safety of some food additive, for example, is objective if it is able to survive
and answer the criticisms of those informed about, and potentially affected by,
the additive.36 This social and critical account of knowing presupposes that
objectivity, in its final stages, requires people to appeal to particular cases, just as
legal justification requires. This account does not presuppose an appeal to specific
rules of knowing, applicable to all situations. Nevertheless the general rules (such
as surviving criticism) always are applicable. A naturalistic appeal to general
rules, to cases, and to general values (such as consistency and predictive power),
rather than to specific rules, is central to this social account of knowing. Instead
of specific rules, applicable to all cases, the relevant community of knowers must
evaluatively determine which judgments are objective.
As Mary Midgley recognized, what constitutes bias is not the acceptance of
one’s own scheme of values but the refusal to look at anyone else’s.37 Because
knowing takes place within a varied community of knowers, having a multiplicity
34 See HARE, R. M., Moral Thinking: Its Levels, Methods and Point, Oxford University Press,
Oxford, 1981.
35 See NEWELL, R., Objectivity, Empiricism, and Truth, Routledge and K. Paul, New York, 1986,
notes 82-84, 86 and 89; STOUTLAND, F., “Wittgenstein: On Certainty and Truth,” Philosophical
Investigations, v. 21, no. 3, (1998), pp. 203-331; HULL, D. L., “The Use and Abuse of Karl Popper,”
Biology and Philosophy, v. 14, no. 4, (1999), pp. 481-504; and ZECHA, G., Critical Rationalism and
Education Discourse, Amsterdam, Rodopi, 1999.
36 See REALE, M., “Axiological Invariants,” Journal of Value Inquiry, v. 29, no. 1, (1995), pp. 65-75;
and KITCHER, PH., “The Division of Cognitive Labor,” The Journal of Philosophy, v. 87, no. 1, (1990),
pp. 5-22.
37 MIDGLEY, M., Wisdom, Information, and Wonder, p. 176.
of different values, public-interest advocacy should help emphasize the social and
critical nature of knowledge.38 It should help people recognize that an unbiased
individual knower may be an inadequate focus for objective understanding.39
Rather, knowledge and objectivity are achieved because members of the varied
community of knowers interact and clarify issues. Each of the members’ social
contexts provides many categories and assumptions that enable people to interpret
and correct understanding of phenomena.40 Because any single observation is
“always selective,” the best way to be objective is to multiply standpoints, to
“increase experience,” to adopt a critical attitude, and to be ready to modify
views on the basis of criticism and interaction.41 But if so, people ought not to
neglect the alternative standpoints of various members of the relevant knowledge
community, including women, minorities, environmental stakeholders, or
oppressed people. Otherwise knowers could fall victim to the dogmatism of a
selective standpoint.
As John Stuart Mill recognized, the surest way of getting to the truth on
any question is to examine all the important objections that can be brought
against candidate opinions and alternative standpoints.42 Such a multi-faceted
and critical approach to knowing requires a community of knowers, each with
somewhat different standpoints and advocacies. It requires a “free discussion” of
views, giving assent only to those positions that survive critical evaluations from
alternative standpoints. As Philip Kitcher put it, knowing requires a “division
of cognitive labor” among knowers, a community whose existence suggests the
inadequacy of privileging any particular observer as alone “objective.” 43
Part of what is wrong with those who reject attempts to reform science and
technology is their failure to tie objectivity to evenhandedness and lack of bias. A
standpoint can be classified as “objective” only when it meets at least two criteria.
38 Some of this discussion of the social nature of knowing is based on SHRADER-FRECHETTE, K.,
“Feminist Epistemology and its Consequences for Policy,” Public Affairs Quarterly, v. 9, no. 2,
(1995), pp. 155-174.
39 Cf. LONGINO, H., “Multiplying Subjects and the Diffusion of Power,” The Journal of Philosophy, v.
88, no. 11, (1991), pp. 666-674; and LONGINO, H., Science as Social Knowledge, pp. 109-121. See also
TUANA, N., “The Radical Future of Feminist Empiricism,” Hypatia, v. 7, no. 1 (1992), pp. 100-114.
40 Cf. KUHN, TH. S., The Structure of Scientific Revolutions, The University of Chicago Press, Chicago,
2nd ed., 1970, pp. 91-204. See POLANYI, M., Personal Knowledge, Harper and Row, New York, 1964;
and H ANSON, N. R., Patterns of Discovery, Cambridge University Press, Cambridge, 1958.
41 Cf. POPPER, K. R., “Science: Conjectures and Refutations,” in FETZER, J. H. (ed.), Foundations
of Philosophy of Science, Paragon House, New York, 1993, pp. 341-363, especially, pp. 350-352.
See POPPER, K., The Logic of Scientific Discovery, p. 106; and POPPER, K. R., Conjectures and
Refutations, especially ch. 11, pp. 253-292. See also HACOHEN, M. H., Karl Popper, Cambridge
University Press, Cambridge (MA), 2000.
42 Cf. MILL, J. S., On Liberty, Prometheus, Buffalo (NY), 1986, pp. 60-61.
43 Cf. KITCHER, PH., “The Division of Cognitive Labor,” The Journal of Philosophy, v. 87, no. 1,
(1990), pp. 5-22. See POPPER, K. R., “Science: Conjectures and Refutations,” in FETZER, J. H. (ed.),
Foundations of Philosophy of Science, p. 354; and POPPER, K. R., The Logic of Scientific Discovery,
p. 106.
(1) It survives criticism and testing by members of the relevant communities, and
(2) it is consistent with democratic and procedural constraints such as fairness
and evenhandedness.44 Such a notion of objectivity and defensible objectivity in
reforming science and technology must be procedural, open, and populist. On
this account, what a pluralistic community of public-interest advocates and critics
ought to believe is bootstrapped onto how they ought to act. People ought to act in
ways that evenhandedly evaluate and predictively test all relevant perspectives,
including those of women, children, minorities, environmentalists, industrialists,
and so on. Such unbiased actions are necessary for objective knowing, and
objective knowing (in the sense defined here) helps provide a reliable foundation
for public-interest advocacy and criticism.
If this account of the social nature of knowing is correct, it provides important
ethical reasons for public-interest advocates to consider the beliefs of all relevant
members of the moral community. Because of its inclusiveness, this social account
requires people to use the marketplace of ideas to analyze, defend, and criticize
alternative positions. This is one of the surest ways to know as objectively as
possible.45 Professionals interested in reforming science and technology need to
secure objectivity in part procedurally, by means of the interactions and criticisms
of the relevant community of knowers and by tests for fairness and lack of bias.
Those who ignore relevant criticisms are guilty of bias because objective knowing
requires consideration of a variety of relevant standpoints and practices.
Failure to define “objectivity” accurately also may keep people from accepting
their ethical responsibilities, including those to reform science and technology.
Those who fail to behave as “public citizens” may fail, in part, because they miss
the basic insight of Israel Scheffler: “objectivity requires simply the possibility of
intelligible debate over the merits of rival paradigms.” 46 If this is all objectivity
requires, then it is time for citizens, scholars, and other professionals to enter
public debates.47
44 See ADLER, J., “Reasonableness, Bias, and the Untapped Power of Procedure,” Synthese, v. 94, no.
1, (1993), pp. 105-125; and WILLIAMS, B., Ethics and the Limits of Philosophy, Harvard University
Press, Cambridge, 1985, pp. 199-200. See ANDREWS, R. N., “Environmental Impact Assessment and
Risk Assessment,” in WATHERN, P. (ed.), Environmental Impact Assessment, Unwin Hyman, London,
1988, pp. 85-97; and COX, L. and R ICCI, P., “Legal and Philosophical Aspects of Risk Analysis,” in
PAUSTENBACH, D. J. (ed.), The Risk Assessment of Environmental and Human Health Hazards, J.
Wiley, New York, 1989, pp. 1017-1046, for suggestions in this regard.
45 See, for example, POPPER, K. R., The Open Society and Its Enemies, pp. 403-406; POPPER, K.
R., Conjectures and Refutations, p. 63; and MASO, I. (ed.), Openness in Research, Van Gorcum,
Assen, 1995.
46 SCHEFFLER, I., “Vision and Revolution: A Postscript on Kuhn,” Philosophy of Science, v. 39, no.
3, (1972), p. 369.
47 Some of this discussion of objectivity, speaking out, and ethical tests for objectivity relies on
SHRADER-FRECHETTE, K., Ethics of Scientific Research, Rowman and Littlefield, Savage (MD),
1994, pp. 55-61. See MOORE, A. W., “One or Two Dogmas of Objectivity,” Mind, v. 108, no. 430,
(1999), pp. 381-393. See also ALCOFF, L. M., “Objectivity and Its Politics,” New Literary History, v.
32, no. 4, (2001), p. 835.
5. OBJECTIONS
In response to this account of duties to reform science and technology, critics
could object: (1) Reformers could err and thus contribute to faulty policy. (2)
Without neutrality, mere politics could control science and policy.
As objection (1) suggests, not all reform attempts are ethically and practically
defensible.48 If scientists err when they speak out against some technological
hazard, they could jeopardize both scientific credibility and sound policy.49
Despite Nader’s outstanding accomplishments, a congressional-committee chair
claimed that he was “a bully and know-it-all, consumed by certainty and frequently
in error.” 50 Daniel Simberloff, a distinguished biologist, refers to this objection
(1) when he worries that if scientists err in their advocacy, future policymakers
might not listen to them.51 Contrary to Simberloff, however, sometimes
professionals ought to take the risk of attempting reform, in part because their
making mistakes rarely leads to loss of credibility. When researchers disproved
the scientific foundations of the Endangered Species Act, the diversity-stability
thesis,52 lawmakers did not repeal it. And when Dutch researchers showed that
the US Rasmussen Report, WASH-1400,53 was wrong, that all the accident failure-frequency values from operating experience fell outside the study’s 90-percent confidence bands,54 nations did not close their nuclear plants.
When Caltech founder and Nobel laureate Robert Millikan55 called belief
in nuclear power a “myth,” less than a decade before scientists confirmed
the existence of fission energy, he did not lose credibility. If the work
of Kahneman, Tversky, and others is correct, experts chronically err, even in
their own fields of expertise, when they reason probabilistically. In employing
48 See SHRADER-FRECHETTE, K., Ethics of Scientific Research, pp. 130-133.
49 Cf. SIMBERLOFF, D., “Simplification, Danger, and Ethics in Conservation Biology,” Ecological Society of America Bulletin, v. 68, (1987), pp. 156-157. See also VESILIND, P. A. and BARTLETT, L., “The Ethics and Science of Environmental Regulation,” Journal of Environmental Engineering, v. 124, no. 8, (1998), p. 675.
50 THOMAS, R., “Safe at This Speed?,” p. 40.
51 Cf. SIMBERLOFF, “Simplification, Danger, and Ethics in Conservation Biology,” p. 157.
52 See SHRADER-FRECHETTE, K. and MCCOY, E. D., Method in Ecology, Cambridge University Press, Cambridge, 1993, ch. 2. See, for example, US CONGRESS, Congressional Record Senate, 93rd Congress, First Session 119, US Government Printing Office, Washington, DC, 1973, p. 25668; COMMONER, B., The Closing Circle, Knopf, New York, 1971, p. 38; and MYERS, N., A Wealth of Wild Species, Westview Press, Boulder (CO), 1983. See also REICHHARDT, T., “Academy Backs Science in Endangered Species Act,” Nature, v. 375, no. 6530, (1995), p. 349; NOSS, R. F., The Science of Conservation Planning: Habitat Conservation under the Endangered Species Act, Island Press, Washington, 1997; NATIONAL RESEARCH COUNCIL, Science and the Endangered Species Act, National Academy Press, Washington, 1995; and REICHHARDT, T., “Inadequate Science in US Habitat Plans,” Nature, v. 397, no. 6717, (1999), p. 287.
53 US NUCLEAR REGULATORY COMMISSION, Reactor Safety Study, NUREG 75/014, WASH-1400, US Government Printing Office, Washington, DC, 1975.
54 Cf. COOKE, R. M., “Problems with Empirical Bayes,” Risk Analysis, v. 6, no. 3, (1986), pp. 269-272; see SHRADER-FRECHETTE, K., Risk and Rationality, pp. 109-111, 140-144 and 188-196.
55 Cf. MILLIKAN, R. A., “Alleged Sins of Science,” Scribner’s Magazine, v. 87, no. 2, (1930), pp. 119-130.
Objectivity and professional duties regarding science and technology 65
56 Cf. KAHNEMAN, D. and TVERSKY, A., “Availability: A Heuristic for Judging Frequency and Probability,” in KAHNEMAN, D. H. ET AL. (eds.), Judgment Under Uncertainty: Heuristics and Biases, Cambridge University Press, Cambridge, UK, 1982, pp. 63-78; KAHNEMAN, D. and TVERSKY, A., “Judgment Under Uncertainty,” in KAHNEMAN, D. H. ET AL. (eds.), Judgment Under Uncertainty, pp. 4-11.
57 Cf. RASMUSSEN, N. C., “Methods of Hazard Analysis and Nuclear Safety Engineering,” in MOSS, T. and SILL, D. (eds.), The Three Mile Island Nuclear Accident, New York Academy of Science, New York, 1981, pp. 56-57.
58 Cf. US CONGRESS, Government Liability for Atomic Weapons Testing Program, Hearings before the Committee on the Judiciary, US Senate, June 27, 1986, US Government Printing Office, Washington, DC, 1987. See also US CONGRESS, Cold War Human Subject Experimentation, Hearing before the Legislation and National Security Subcommittee of the Committee on Government Operations, House of Representatives, One Hundred Third Congress, second session, 28 September, 1994, US Government Printing Office, Washington, DC, 1994; US DEPARTMENT OF ENERGY, Human Radiation Experiments: The Department of Energy Roadmap to the Story and the Records, National Technical Information Service, DOE/EH-0445, Springfield (VA), 1995; US CONGRESS, Human Subjects Research: Radiation Experimentation, Hearings before the Committee on Labor and Human Resources, United States Senate, One Hundred Third Congress, first session, 13 January, 1994, US Government Printing Office, Washington, DC, 1994; US CONGRESS, American Nuclear Guinea Pigs: Three Decades of Radiation Experiments on US Citizens, A report prepared by the Subcommittee on Energy Conservation and Power of the Committee on Energy and Commerce, US House of Representatives, US Government Printing Office, Washington, DC, 1986.
59 See, for example, MCCARRY, C., Citizen Nader, pp. 12, 13, 139, 212 and 213; GOREY, H., Nader and the Power of Everyman, p. 23; and ROWE, J., “Ralph Nader Reconsidered,” in SCARPITTI, F. and CYLKE Jr., F. (eds.), Social Problems: The Search for Solutions: An Anthology, Roxbury Publishing Company, 1st ed., 1995, p. 65.
fail to “get involved” by attempting to reform science and technology, there is less
chance of avoiding injustice and resolving public controversies. As one scholar
put it, “contemporary Pyrrhonism cannot sustain serious moral conflict.” 66 In
the face of serious evil, if people adopt positions of neutrality, then they are not
neutral. Instead they contribute to evil by helping it to continue, while they claim
to be neutral. In summary, at least six reasons show that objectivity is not neutrality
and therefore that informed, balanced attempts to reform science and technology
can be objective:
(1) Once people admit that value judgments are part of all knowing, then not
to assess those value judgments is to become hostage to them.
(2) If not all positions are equally justifiable, then objectivity requires people
to represent less justifiable positions as less justifiable.
(3) In the face of a great threat, people who represent objectivity as neutrality
serve the interests of those responsible for the threat.
(4) People who represent objectivity as neutrality encourage lack of attention
to evaluative assumptions and thus lack of public control over those
assumptions.
(5) People who represent objectivity as neutrality presuppose that it is
somehow delivered from “on high,” rather than discovered socially,
through the give-and-take of alternative points of view.
(6) People who represent objectivity as neutrality sanction either ethical
relativism or skepticism and thus encourage injustice.
Some members of at least three groups in contemporary society (post-
modernists, positivists, and relativist social scientists) likely would support the
Columbia University anthropologists who remained neutral toward Hitler. They
would be undercut by the arguments I have made because members of all three
groups confuse objectivity with silence or neutrality. They confuse tolerance
with ethical relativism.
that could threaten the common good. In many fields having consequences for
the common good –for example, chemical research– profits often interfere with
objective fact-finding. Because members of democratic or biotic communities are
less able (than individual business clients) to give free informed consent to risky
actions affecting them, citizens need to help assure these members of greater
protection. One way to promote such protection would be for professionals to help
improve standards of peer review for research that affects the public welfare.67
They also could work to respond, especially in the popular press, to science that is
biased and manipulated by vested interests and to eliminate the biases that often
accompany professional work.
Asserting citizens’ (and especially professionals’) responsibilities –to promote
unbiased information affecting the common good– is analogous to affirming
similar duties regarding dangerous technologies. Just as there is a justifiable
double standard (based on the gravity of the public threat) for speaking out
against biased information, there also is a justifiable double standard (based on
the severity of the public-health risk) for criticizing dangerous technologies. In
both cases, human responsibility for counteracting the threat is proportional to its
seriousness. This proportionality explains the reason that professionals typically
ought to have a higher standard for assessing more hazardous technologies, like
nuclear power. Because situations of greater threats require greater scrutiny,
riskier technologies ought to have greater counterbalancing benefits.68
In other words, objectivity often is a matter of ethics as well as epistemology.
Epistemic objectivity addresses beliefs. It requires citizens and professionals to
assess hypotheses and their practical consequences in ways that avoid deliberate
bias or misinterpretation. Ethical objectivity addresses actions. It requires more
than merely avoiding deliberate bias or misinterpretation. Instead it demands that
citizens take into account obligations to the common good when assessing their
actions, omissions, and beliefs. For example, reformers might follow epistemic
objectivity and assess whether an hypothesis (such as “this biotechnological
experiment will not endanger ecosystems”) is both probable and likely to lead
to no undesirable consequences. Following a principle of ethical objectivity, they
might evaluate whether epistemic objectivity alone provides an adequate test of
the hypothesis or whether one also ought to consider factors like the public’s rights
to protection and its rights to know about potentially harmful acts. In cases that
involve duties to stakeholders, objectivity requires not merely unbiased epistemic
assessment of one’s beliefs, but also unbiased ethical evaluation of the actions
67
See, for example, LLOYD, J., “On Watersheds and Peers, Publication, Pimps and Panache,” Florida
Entomologist, v. 68, (1985), pp. 134-139; and HOLLANDER, R., “Journals Have Obligations, Too,”
Science, Technology, and Human Values, v. 15, no. 1, (1990), pp. 46-49.
68
See, for example, STARR, C., RUDMAN, R., and WHIPPLE, C., “Philosophical Basis for Risk
Analysis,” Annual Review of Energy, v. 1, (1976), p. 638. See also BAYLES, M., Professional Ethics,
Wadsworth, Belmont, 1981, p. 116; TVERSKY, A. and FOX, C. R., “Weighing Risk and Uncertainty,”
Psychological Review, v. 102, no. 2, (1995), p. 269; and VAN R AAIJ, W. F., “The Life and Work of
Amos Tversky,” Journal of Economic Psychology, v. 19, no. 4, (1998), p. 515.
69 Cf. CLIFFORD, W. K., Lectures and Essays, Macmillan, London, 1886. See also TOULMIN, S., “Can Science and Ethics Be Reconnected?,” Hastings Center Report, v. 9, (1979), pp. 27-34; HAACK, S., “The Ethics of Belief Reconsidered,” in HAHN, L. E. (ed.), The Philosophy of Roderick M. Chisholm, Open Court, La Salle (IL), 1997, pp. 129-144; VORSTENBOSCH, J., “W. K. Clifford’s Belief Revisited,” in MEIJERS, A. (ed.), Belief, Cognition, and the Will, University of Tilburg, Tilburg, 1999, pp. 99-111; FELDMAN, J., “The Ethics of Belief,” Philosophy & Phenomenological Research, v. 60, no. 3, (2000), p. 667; and PRYOR, J., “Highlights of Recent Epistemology,” The British Journal for the Philosophy of Science, v. 52, n. 1, (2001), pp. 95-124.
70 See, for example, JAMES, W., The Will to Believe and Other Essays in Popular Philosophy, Dover, New York, 1956, pp. 17-30. See also OWENS, D., “John Locke and the Ethics of Belief,” Locke Newsletter, v. 30, (1999), pp. 103-127; ADLER, J. E., “Ethics of Belief: Off the Wrong Track,” Midwest Studies in Philosophy, v. 23, (1999), pp. 267-285; and MADIGAN, T. J., “The Virtues of Ethics of Belief,” Free Inquiry, v. 17, no. 2, (1997), pp. 29-33.
71 Cf. KING, M. L., “Letter from Birmingham Jail,” in HARRIS, P. (ed.), Civil Disobedience, University Press of America, Lanham (MD), 1989, pp. 58 and 70; RAWLS, J., A Theory of Justice, Harvard University Press, Cambridge (MA), 1971, pp. 363-377.
72 Quoted in FORTAS, A., “Concerning Dissent and Civil Disobedience,” in HARRIS, P. (ed.), Civil Disobedience, p. 91 (see also pp. 91-105) and in KENNY, A., Thomas More, Oxford University Press, New York, 1983, p. 1.
to serve the ends for which the state exists, people sometimes must act against the
alleged civil law before the most serious violations of their rights occur; otherwise,
the violations might become impossible to remove.73 The same could be said of
violations of scientific objectivity. Fifth, other guidelines for reform have been offered in the many discussions of conditions for justified whistleblowing. The
important point, however, is not to give necessary and sufficient conditions, at
this stage, but to argue instead that most philosophers and most scientists likely
have some duties to secure objectivity, to reform science and technology. Most
have not accepted this fundamental duty.
Not to attempt reform would amount to a self-fulfilling prophecy, a counsel for
despair. More than a decade ago, Alasdair MacIntyre diagnosed grave “disorders of
moral thought and practice” in society.74 He made a misdiagnosis. Moral dissensus,
as such, is not a problem. Dissensus may exist because a situation is unclear,
because people disagree over how to interpret data, or because ethics consists of
abstract principles that need to be interpreted and amended through democratic
processes in concrete cases. Yet dissensus at the concrete level often is evidence
of some consensus at the abstract or general level. Social knowing, requiring give-and-take, working through disagreement and criticism, is a necessary condition for objectivity, not a sign of its failure, as MacIntyre thought. Without this give-and-take, there is only apparent consensus, not objectivity, because people probably
have not rationally agreed on a position. Apparent consensus probably signals
that people are lazy, or live in fear, or are forced to agree, or lack the ability
or intelligence to debate ethical issues. Dissensus then is not only a necessary
condition for objectivity but can be evidence of an open, rather than a repressive,
society; a result of increased moral autonomy; or a consequence of the freedom
to develop a life that allows for alternative thoughts and actions. Dissensus is far
superior to unthinking or coerced consensus, repression, or passivity that fears
disagreements. Dissensus, or the conflict necessary to achieve reform of science
and technology, is psychologically and politically discomforting. It means people
must work out their differences and compromise in order to achieve a noble goal.
Besides, from an ethical point of view, consensus is irrelevant. Actions are not
right or wrong because people agree they are right or wrong, but because there is
a rational justification for their rightness or wrongness.
7. CONCLUSION
Because he takes seriously the necessity of reform, even civil disobedience,
black activist Martin Luther King goes so far as to say that the “white moderate,”
the proponent of neutrality who fears dissensus, is a greater threat to freedom,
ethics, and justice than the Ku Klux Klanner who is racist and who lynches blacks.
73 Cf. LOCKE, J., Second Treatise on Civil Government, Prometheus Books, Buffalo (NY), 1986, Sections 159, 160, 220, 240 and 242.
74 Cf. MACINTYRE, A., After Virtue, Duckworth, London, 1981, p. 6.
“Moderates” are dangerous precisely because people think they are balanced,
objective, and therefore ethical. King worries about moderates because they
are more devoted to order than to justice. He says they forget law and order are
means to the end of justice, and not the reverse. Martin Luther King questioned
whether religion was too bound up with the status quo to save the world.75 Perhaps
scientists and philosophers are too bound up with the status quo to help reform
science and technology?
8. BIBLIOGRAPHY
ADAMS, P., “A State’s Well-oiled Injustice,” World Press Review, v. 43, n. 1, (1996),
pp. 14-15.
ADLER, J., “Reasonableness, Bias, and the Untapped Power of Procedure,”
Synthese, v. 94, n. 1, (1993), pp. 105-125.
ADLER, J. E., “Ethics of Belief: Off the Wrong Track,” Midwest Studies in
Philosophy, v. 23, (1999), pp. 267-285.
AINGER, K., “Interview with Owens Wiwa,” New Internationalist, n. 351, (2002),
pp. 33-34.
ALCOFF, L. M., “Objectivity and Its Politics,” New Literary History, v. 32, n. 4,
(2001), pp. 835-848.
ANDERSON, A., “A Day in the Death of Ideals,” New Scientist, v. 148, n. 2005,
(1995), p. 3.
ANDREWS, R. N., “Environmental Impact Assessment and Risk Assessment,” in
WATHERN, P. (ed.), Environmental Impact Assessment, Unwin Hyman, London, 1988,
pp. 85-97.
APEL, K.-O., Transformation der Philosophie, Suhrkamp, Frankfurt, 1976.
Translated into English by G. Adey and D. Frisby: APEL, K.-O., Towards a
Transformation of Philosophy, Routledge, London, 1980.
AUDI, R., The Structure of Justification, Cambridge University Press, N. York,
1993.
BAYLES, M., Professional Ethics, Wadsworth, Belmont (CA), 1981.
BEVIR, M., “Objectivity in History,” History and Theory, v. 33, n. 3, (1994), pp.
328-344.
BIELSKI, V., “Shell’s Game,” Sierra, v. 81, n. 2 (1996), pp. 30-36.
BORNSTEIN, R., “Objectivity and Subjectivity in Psychological Science,” Journal
of Mind and Behavior, v. 20, n. 1, (1999), pp. 1-16.
BOYD, W., “Death of a Writer,” The New Yorker v. 71, n. 38, (1995), pp. 51-55.
BROWN, H. I., Perception, Theory and Commitment, The University of Chicago
Press, Chicago, 1977.
75 Cf. KING, M. L., “Letter from Birmingham Jail,” in HARRIS, P. (ed.), Civil Disobedience, p. 69.
QUINE, W. V. O. and ULLIAN, J., The Web of Belief, Random House, N. York, 2nd ed.,
1978.
RASMUSSEN, N. C., “Methods of Hazard Analysis and Nuclear Safety Engineering,”
in MOSS, T. and SILL, D. (eds.), The Three Mile Island Nuclear Accident, New York
Academy of Science, N. York, 1981, pp. 56-57.
R AWLS, J., A Theory of Justice, Harvard University Press, Cambridge,
Massachusetts, 1971.
REALE, M., “Axiological Invariants,” Journal of Value Inquiry, v. 29, n. 1, (1995),
pp. 65-75.
REICHHARDT, T., “Academy Backs Science in Endangered Species Act,” Nature,
v. 375, n. 6530, (1995), p. 349.
REICHHARDT, T., “Inadequate Science in US Habitat Plans,” Nature, v. 397, n.
6717, (1999), p. 287.
RESCHER, N., “Collective Responsibility,” Journal of Social Philosophy, v. 29, n.
3, (1998), pp. 46-58.
RIP, A., “Experts in Public Arenas,” in OTWAY, H. and PELTU, M. (eds.), Regulating
Industrial Risks, Butterworths, London, 1985, pp. 94-110.
ROWE, J., “Ralph Nader Reconsidered,” in SCARPITTI, F. and CYLKE Jr., F. (eds.),
Social Problems: The Search for Solutions: An Anthology, Roxbury Publishing
Company, 1st ed., 1995, pp. 62-77.
RUDNER, R., Philosophy of Social Science, Prentice Hall, Englewood Cliffs (NJ),
1966.
SAMUELS, S., “The Arrogance of Intellectual Power,” in WOODHEAD, A., BENDER,
M. and LEONARD, R. (eds.), Phenotypic Variation in Populations, Plenum, N. York,
1988, pp. 113-120.
SCARLOTT, J., “Ralph Nader,” in DELEON, D. (ed.), Leaders from the 1960s,
Greenwood Press, Westport (CT), 1994, pp. 331-332.
SCHEFFLER, I., “Vision and Revolution: A Postscript on Kuhn,” Philosophy of
Science, v. 39, n. 3, (1972), pp. 366-374.
SCOTT, P., “Captives of Controversy: The Myth of the Neutral Social Researcher,”
Science, Technology, and Human Values, v. 15, n. 1, (1990), pp. 474-494.
SCRIVEN, M., “The Exact Role of Value Judgments in Science,” in KLEMKE, E.,
HOLLINGER, R. and KLINE, A. (eds.), Introductory Readings in the Philosophy of
Science, Prometheus, Buffalo, 1982, pp. 269-297.
SELLARS, W., Philosophical Perspectives, C. Thomas, Springfield (IL), 1967.
SEN, A., Objectivity and Position, University Press of Kansas, Lawrence, 1992.
SHER, G., Beyond Neutrality: Perfectionism and Politics, Cambridge University
Press, N. York, 1997.
SHRADER-FRECHETTE, K., “Recent Changes in the Concept of Matter: How Does
‘Elementary Particle’ Mean?,” in ASQUITH, P. D. and GIERE, R. N. (eds.), Philosophy
In this work I deal with the relationship between epistemic and non-epistemic
values and the methodological learning in a particular kind of scientific activity,
namely regulatory science.1 I will begin by discussing the main studies on the
relationship between science and values, analyzing different authors’ works on
the change of epistemic values. It is in this analytical context that the topics of
axiological and methodological learning in science appear.
Later, I will analyze some of the main controversies that have taken place in the
last two decades in a particular type of regulatory science, that is risk assessment.
This analysis shows that it is impossible to understand the transformations that
have taken place in this activity (i.e., methodological change) without keeping
in mind the relationship between epistemic and non-epistemic values in risk
assessment. The conclusion is that only an approach that considers these non-
epistemic values can offer a complete understanding of methodological change in
regulatory science. Denying the influence of these values not only makes it difficult to analyze this kind of scientific research, but also cuts off the methodological and social learning which could arise from such an understanding.
For Shapere, the changes in the aims of science or in the criteria of rationality
themselves are intimately connected to changes in our substantial beliefs about
the world. Aims and criteria are proposed and modified, as happens with low-level
theories. We not only learn, but we learn to learn. The aims of science change over
time, and the reasons for this change are determined by the content of science at
a given time, by its rules, its methods, the substantial beliefs and the interaction
among all these components, in such a way that what is considered a legitimate
successor also changes. The ontological and methodological commitments that
justify scientific change also change. And this change is justified by the previous
commitments and the scientific results which these have brought about. Therefore,
Shapere does not need to introduce changes of high-level units (which he calls scientific domains) or scientific revolutions. A gradual change, the product of interactions among ontological and methodological commitments, reasoning
principles, scientific results, etc., is sufficient to explain the change of particular
scientific theories as well as changes in the criteria of evaluation themselves.
In his book Science and Values, Laudan arrives at conclusions similar to
those of Shapere. According to Laudan, most of the philosophical approaches
concerning scientific change take for granted a common model of justification.
This model has three levels: (i) laws and theories; (ii) methodological rules; and
(iii) statements concerning aims, as well as basic cognitive or epistemic values.
Controversies at a lower level are resolved by applying the principles of the immediately higher level. Laudan states that this model does not agree with what one can observe in the history of science: there are cases in which, for example, aims or epistemic values are modified by appealing to scientific methodology or
scientific theories. Therefore, according to him, we should discard the hierarchical
order implicit in the hierarchical model in favor of an egalitarian principle that
stresses the patterns of mutual interdependence among the different levels.5
In contrast to the hierarchical model, Laudan suggests a reticulated model of
justification: cognitive aims justify the methodological principles, and the principles in turn show the feasibility of the aims; the methodological principles justify the theories, and the theories in turn constrain the principles; and, lastly, there should be harmony between scientific theories and cognitive aims. Not only is there change of scientific theories,
but also methodological and axiological change. The two most important
characteristics of the reticulated model are (a) the non-linear conception of
justification and (b) the gradual character of scientific change (theoretical,
methodological and axiological).
Laudan criticizes both the hierarchical point of view defended by authors like
Hempel, Reichenbach or Newton-Smith, and the holistic one defended by Kuhn.
Laudan proposes his reticulated model in order to show in which way axiological
5 Cf. LAUDAN, L., Science and Values. The Aims of Science and their Role in Scientific Debate, University of California Press, Berkeley, 1984, p. 63.
debate and the formation of consensus in relation to the aims of science are
possible. The scientific controversies on one level can be solved by appealing to
the consensus reached on a different level. The different levels of controversy and
consensus constitute a framework of justification whose components maintain
relationships of interdependence. This interdependence explains the gradual
character of certain episodes of scientific change, which without further analysis
could give the impression of being holistic.
Another philosopher who considers the problem of changes in scientific
rationality is McMullin. He defines scientific rationality as the relationship
between methods and aims or values. Therefore, the question about changes in
the patterns of scientific rationality is related to the question about changes in
epistemic values. For McMullin this is a debate started by the work of Kuhn, and
continued in the works of Shapere and Laudan.
McMullin sets out the possibility of changes in the cognitive values of
science.6 He illustrates his point of view with several examples. In one of these he
compares the astronomy of pre-Hellenic Greece with that of Babylon. Because of
its socio-cultural context, Babylonian astronomy was geared towards prediction,
a goal in fact related to omens. Babylonian astronomy had an empirical character, including a rich observational base, accuracy in the data obtained and a great predictive capacity. It lacked, however, interest in the causes of the trajectories
of celestial bodies. The cultural context of the Greek cities gave rise, according
to McMullin, to a different type of astronomy, based mainly on the intention
of explaining the observed phenomena. To understand is, among other things,
to know how an entity endowed with a certain nature behaves under normal
circumstances. Prediction is here of secondary interest.
In this and other examples given by McMullin we can see that the cognitive
values change and that this change can be driven by external factors. McMullin’s
position is, nevertheless, that the very development of science gradually eliminates
the influence of the non-epistemic values, and that changes in those epistemic
values are caused by scientific development itself.
Before concluding this section I would like to make reference to the general
implications of Laudan’s approach for the philosophy of science. Laudan states that
although the methodological rules are often expressed in the form of categorical imperatives, their true form is that of hypothetical imperatives. This means that the maxim ‘let us reject ad hoc hypotheses’ must be understood as ‘if you want to get new fruitful theories, you must reject ad hoc hypotheses’. This kind of
6 Cf. MCMULLIN, E., “The Goals of Natural Science,” in HRONZSKY, I. ET AL. (eds.), Scientific Knowledge Socialized, Akadémiai Kiadó, Budapest, 1988, pp. 27-58; and MCMULLIN, E., “The Shaping of Scientific Rationality: Construction and Constraint,” in MCMULLIN, E. (ed.), Construction and Constraint: The Shaping of Scientific Rationality, University of Notre Dame Press, Notre Dame (IN), 1988, pp. 1-47.
Metascientific Analysis and Methodological Learning in Regulatory Science 87
use value judgments to deal with research situations involving incomplete data
or methods.” 11
Methodological value judgments come into play whenever a scientist judges how to treat unknown cases, which statistical tests to use, how to determine sample size, where the burden of proof lies, which theory or model to use, whether interpolating missing data is acceptable, whether it is correct to extrapolate data from the laboratory to field trials, whether incomplete information on a phenomenon is enough to draw conclusions, and so on. That scientists
need to make methodological value judgments means that they must judge their
own methods. Such judgments can be correct or erroneous.
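A purely hypothetical sketch may make the point concrete: the same data can yield opposite conclusions depending on a methodological value judgment, here the choice of significance level, that is, of where the burden of proof lies. All numbers below are invented for illustration and come from no study discussed in this chapter.

```python
# Hypothetical illustration of a methodological value judgment: the choice
# of significance level decides what the same data "show".
from math import comb

def binom_upper_tail(n, k, p):
    """Exact P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * (p ** i) * ((1 - p) ** (n - i))
               for i in range(k, n + 1))

# Invented data: 16 disease cases among 100 exposed people, against a
# background incidence of 10 percent. Is the excess statistically significant?
p_value = binom_upper_tail(100, 16, 0.10)  # roughly 0.04

# Two scientists, two value judgments about the burden of proof:
verdict_lenient = "evidence of harm" if p_value < 0.05 else "no evidence of harm"
verdict_strict = "evidence of harm" if p_value < 0.01 else "no evidence of harm"
# Same data, opposite verdicts: the 5-percent standard finds harm,
# the 1-percent standard does not.
```

Neither threshold is dictated by the data themselves; choosing one is exactly the kind of judgment, open to correction or error, that the text describes.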
One of Shrader-Frechette’s numerous case studies is related to the scientific
controversy surrounding the construction of an underground nuclear repository
in Maxey Flats (Kentucky). Some of the studies completed prior to the construction of the nuclear repository calculated that the plutonium would be displaced by only half an inch in 24,000 years. When the installation was reopened after ten years, plutonium was discovered within a two-mile radius of its original location. The geological predictions were erroneous by six orders of magnitude.12
The construction of this nuclear repository was preceded by a controversy
between two groups of scientists. The controversy started because of the lack of
data regarding the suitability of the ground with respect to avoiding the possible
displacement of radioactive material due to underground water currents. In this
situation, each group opted for a different hierarchy of cognitive values. One of the
groups gave priority to the scientific community’s majority point of view regarding
the capacity of plutonium to migrate in that kind of soil. In other words, this group
adhered to the criterion of external consistency, which translated into a number of
strategies to test the impermeability of the soil. For the other group, on the contrary, internal coherence prevailed, and it insisted on the porosity of the soil.
In this example the scientific controversy had important, and maybe dramatic,
social consequences. In this type of situation, Shrader-Frechette argues, Laudan’s reticulated model is insufficient. It would be necessary to add a further level on which moral values, such as the protection of citizens from possible leaks of radioactive material, are taken into account.13 In applied science with important
social consequences the criteria for the selection of hypotheses cannot be the
11 SHRADER-FRECHETTE, K., Ethics of Scientific Research, Rowman and Littlefield, Lanham, 1994, p. 53. On bias and cultural values in science see LONGINO, H. E., Science as Social Knowledge: Values and Objectivity in Scientific Inquiry, pp. 62-103.
12 Cf. SHRADER-FRECHETTE, K., Risk and Rationality: Philosophical Foundations for Populist Reforms, University of California Press, Berkeley, 1991; and SHRADER-FRECHETTE, K., “Hydrogeology and Framing Questions Having Policy Consequences,” Philosophy of Science, v. 64, (1997), pp. S149-S160.
13
SHRADER-FRECHETTE, K., “Scientific Progress and Models of Justification,” in GOLDMAN, S., (ed),
Science, Technology, and Social Progress, Lehigh University Press, Bethlehem, 1989, pp. 196-226.
Metascientific Analysis and Methodological Learning in Regulatory Science 89
In its well-known 1983 report, the National Research Council defines risk
assessment as a research process with the following four steps:
1. Hazard identification. Characterizing the nature and scope of the evidence
indicating that a substance can increase the incidence of disease (cancer,
birth defects, etc.) in humans, laboratory animals, or other test systems.
2. Quantification of the dose-response relationship. Calculating the incidence
of an effect as a function of exposure in several populations, extrapolating
from high doses to low doses, and/or from laboratory animals to human beings.
3. Exposure analysis. Identifying the populations that are, or could be,
exposed to a substance in certain circumstances.
4. Risk characterization. Estimating the incidence of effects on health under
different conditions in each of the populations. This step uses the information
obtained in each of the previous steps.
Risk characterization, the fourth step of risk assessment, is generally conceived
of as a synthesis and translation of information: a synthesis of the information
obtained in the three previous steps, and a translation of that information that
shows its meaning for the protection of health and the environment in a way
that is useful for policy makers. In this sense, the 1983 report points out the
necessity of specifying the possible consequences of the uncertainties present in
the previous steps. In this conception of risk assessment the first three steps
would be directly related to scientific knowledge, while the fourth one would
mainly include meta-analysis and prediction.
The conclusions of risk characterization are used for risk management,
which mainly consists of drawing up different types of regulations regarding
the use of products and productive processes. The traditional conception of the
relationship between risk assessment and risk management is that assessment
is a scientific activity which provides evidence about the nature and scope
of risks, while management is in charge of making regulations, taking into
account this evidence and the socially established levels of protection of public
health and the environment. This conceptualization of the relationship between
risk assessment and risk management is related to the traditional distinction
between facts and values.
In its 1983 report the National Research Council clearly defends the separability
between assessment and management: if considerations related to management
affect risk assessment, the credibility of the assessment can be compromised.
A similar point of view was expressed again in the 1994 report: it defends the
necessity for risk assessment to be independent, and for explicitly distinguishing
between conclusions based on facts and judgments based on values. The 1996
report questions this point of view and emphasizes the interaction between
assessment and management of risks.20
20
Cf. NATIONAL R ESEARCH COUNCIL, Understanding Risk. Informing Decisions in a Democratic
Society, National Academy of Sciences, Washington, DC, 1996.
Possibly due to its social relevance, risk assessment has undergone a wide-
reaching process of methodological analysis. As we will see in the following,
this analysis has led to a learning process regarding the methodologies that can
best fulfill the practical values of risk assessment:21 to provide scientific
knowledge about risks which is useful for the protection of public health and
the environment. The controversies that have surrounded this type of regulatory
science in the last two decades can be classified into four areas: 1) the burden of
proof, 2) risk definition and identification, 3) standards of proof,
and 4) rules of inference.
Another empirical argument (in this case, a comparative one) points out that
in the economic sectors in which a shift of the burden of proof has already taken
place, for example the pharmaceutical sector, public health is better protected
than if the burden of proof rested with the public administration. The correctness
of these empirical arguments depends on factors such as the characteristics
of contemporary society and its institutions, the level of innovation, and the
characteristics of chemical substances, particularly the risks these can involve
for public health and the environment. Statements about these characteristics are
subject to empirical analysis regarding their correctness or incorrectness.
For some authors it is necessary to analyze the social consequences of regulation
and its level of permissiveness. The cumulative experience of the last twenty
years regarding the assessment and regulation of risks indicates that sometimes
controlling a risk through regulation produces the emergence of other risks
(risk tradeoffs). This means that regulating a risk can become harmful for the
protection of health and the environment if the countervailing risk is greater.23
Therefore, the effectiveness of the regulation depends on numerous factors.
Cass Sunstein argues that the relationship between health, wealth and safety
must be analyzed, since the data indicate that a decrease in wealth increases
the risks to public health. Sunstein says that a growing number of research
projects in the area indicate that lives are being lost as a consequence of the
costs imposed by regulation, and that there are reasons for the government to
take this problem seriously.24 However, Shrader-Frechette points to the utter
lack of empirical evidence, in terms of controlled experiments or statistical
analysis, for Sunstein’s causal assumption that regulation increases risk because
it increases poverty.25
The considerations in this section indicate that the effectiveness of a
proposal of a political nature, like the shift of the burden of proof, depends on
numerous factors: certain economic and administrative characteristics, the
nature of the risks, the relationship between regulated risks and countervailing
risks, etc. The determination of these factors depends on social learning about
risks and their regulation. Scientific knowledge is one of the key factors that
contribute to this learning process.
The shift of the burden of proof has also been relevant to the development
of some lines of scientific research and technological innovation. This has been
the case with the European regulation of biotechnology. This regulation, guided
by the precautionary principle, shifts the burden of proof. It has decisively
23
Cf. GRAHAM, J. D. and WIENER, J. B., “Confronting Risk Tradeoffs,” in GRAHAM, J. D. and
WIENER, J. B. (eds), Risk vs. Risk. Tradeoffs in Protecting Health and the Environment, Harvard
University Press, Cambridge, 1995, pp. 1-41.
24
Cf. SUNSTEIN, C. R., “Health-Health Trade-Offs,” in ELSTER, J. (ed.), Deliberative Democracy,
Cambridge University Press, Cambridge, 1998, pp. 232-259.
25
Cf. SHRADER-FRECHETTE, K., “Review of Risk and Reason (Cass Sunstein, 2002, Cambridge
University Press),” Notre Dame Philosophical Reviews, (2003.04.09).
2. Current risk assessment is done under the supposition that risks affect
everybody in the same way, and has not paid attention to biological and
social diversity.
3. Disease has been considered the expression of the damage caused by
substances; other impacts have not been considered.
4. It has been supposed that substances act independently, instead of analyzing
the synergies and interactions that can take place among them.
5. The standards of proof required for establishing the relation between the
presence of a substance and a damage are too demanding.
These criticisms have forced changes in risk assessment and, in general, in
scientific research devoted to health and environmental risks. The characterization
of damage has been extended beyond cancer and disease. Problems related
to sexual development, reproductive problems like infertility, and cognitive
dysfunctions have received the attention of researchers. The so-called hormone
disrupters are the best-known example. The effects of environmental pollution
on wildlife and ecosystems have begun to be considered indicators for human
health.32 Physiological abnormalities that can be found in the blood or the adipose
tissue are also used in this way, i.e. as biological markers for the effects of pollution
on health. Also, guidelines have been drawn up for risk assessments to consider
human diversity regarding age, genetic susceptibilities, etc.
The controversy on biotechnology in Europe has also led to a redefinition of
the possible damage. In the beginning, the possibility of genetically modified
plants generating resistance in insects to certain insecticides had been considered
by European authorities as an agronomic problem. But beginning with the 1996
controversy over the registration of a variety of transgenic corn, this possibility
began to be considered as an adverse effect. As a consequence, scientific
methodologies were developed for its study.33
The conceptualization of certain effects of human actions as risks depends
on social concerns. These concerns have influenced risk assessment in a decisive
way, and they are also related to learning about the effectiveness of risk
assessment and risk management. The necessity of extending the characterization
of harm has been argued on the grounds that assessment and regulation
centered on cancer and disease were not leading to the desired level of protection
of public health. This conclusion is also a consequence of the
scientific and social learning about risk during the last decades.
32
Cf. TESH, S. N., Uncertain Hazards. Environmental Activists and Scientific Proof, p. 67, and
K RIEBEL, D. ET AL., “The Precautionary Principle in Environmental Science,” Environmental Health
Perspectives, v. 109, n. 9, (2001), pp. 871-876.
33
Cf. LUJAN, J. L. and TODT, O., “Dinámica de la precaución. Sobre la influencia de los conflictos
sociales en la regulación de los OGMs,” pp. 149-151.
similar to diseases which are the result of other causes; they operate through
unknown molecular or sub-molecular mechanisms; and their consequences are
catastrophic for the affected individual, although their probability is low. These
characteristics are very well known today thanks to the knowledge accumulated by
scientific research. The cumulative experience of the analysis of these substances
is also important for other reasons: we know what we do not know, and we know
about the difficulties of obtaining deeper knowledge of them.
Moreover, Cranor analyzes the epistemic characteristics of scientific
research regarding biochemical risks, focusing especially on toxicology. Cranor
asserts that:
“Scientific bodies and most scientists are typically quite demanding in
minimizing or preventing factual false positives, that is, that their procedures
show that a substance has a toxic property when in fact it does not. They want
to ensure that they are not mistakenly claiming that a substance is toxic when
it is not. They tend to be cautious in coming to such conclusions.
One instance of this practice is that scientists guard against random
statistical errors in their experiments from producing false positive results by
demanding that support for their conclusions must be statistically significant.
This is only one statistical measure that is used in scientific inquiry, but a
particularly easy one to utilize and quantify. Moreover, a focus on preventing
false positives in statistical tests will as a matter of the mathematics involved
increase the number and rate of false negatives. And, at least in research,
scientists appear to have a lesser concern to prevent false negatives.” 35
The greater concern for false positives is a methodological translation of
an epistemic value, namely accuracy. The concern for false positives leads to
requiring certain kinds of evidence in order to reach conclusions concerning the
toxicity of a substance. Cranor’s argument points out the consequences of the
interaction between the characteristics of chemical substances and the epistemic
characteristics of scientific research on risk. The characteristics of toxics
make it very difficult to establish links and causal trajectories. And the epistemic
characteristics of research on the risks of toxic products make it necessary to
acquire knowledge of those links and causal trajectories. The combination of
both factors, the characteristics of toxics and the epistemic characteristics of
research, leads to research which is intensive both in time and in resources.
Cranor reaches the conclusion that this type of risk assessment has undesirable
social consequences. His work is an analysis of the social consequences of using
certain methodologies. He maintains that scientific epistemology, at least
in these extreme forms, is not normatively neutral when it is used for social
applications. In the particular case of the risk assessment of toxic substances, a
35
CRANOR, C. “Conocimiento experto y políticas públicas en las sociedades tecnológicas. En busca
del apoyo científico apropiado para la protección de la salud pública,” p. 110.
conflict arises between an epistemic value like accuracy and a practical value like
the protection of public health.
There are other proposals similar to those of Cranor. Some authors hold
that in order to protect the environment and public health an analysis based on
the weight of evidence is better than trying to obtain an exact determination of
the level of risk involved.
“The weight-of-evidence approach to decision-making takes into account
the cumulative weight of information from numerous sources that address the
question of injury or the likelihood of injury to living organisms. Types of
information that might be considered include observational studies, worker
case histories, toxicological studies, exposure assessments, epidemiological
studies, and monitoring results. Based on the weight of evidence, a
determination is made as to whether an activity has caused or is likely to
cause harm and the magnitude of that harm.” 36
In a way this approach also implies a weakening of the standards of proof,
since it considers in a generic way the evidence accumulated about the relationship
between substances and health and ecological problems. But it does not consist
only in relaxing the standards of proof; it also calls for taking into account
all the available information coming from different sources. It is possible that no
type of information by itself would be enough to affirm a causal relationship, but
the combination of the available information can be enough for decision-making.37
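The decision logic just described can be sketched as a toy aggregation, in which every source label, weight and threshold is invented for the illustration and taken from no regulatory guideline:

```python
# Hypothetical weights for independent lines of evidence (all values invented).
evidence = {
    "epidemiological study":   0.30,  # suggestive, not statistically significant
    "animal bioassay":         0.35,
    "worker case histories":   0.20,
    "structure-activity data": 0.15,
}

SINGLE_STUDY_BAR = 0.50     # what any one study would need to show on its own
DECISION_THRESHOLD = 0.80   # cumulative weight sufficient for regulatory action

# No single source meets the bar on its own...
assert all(weight < SINGLE_STUDY_BAR for weight in evidence.values())

# ...but their combined weight crosses the decision threshold.
total = sum(evidence.values())
print(f"combined weight = {total:.2f}, act = {total >= DECISION_THRESHOLD}")
```

The additive scoring is only one of many possible aggregation rules; the point is that the decision is driven by the whole body of evidence rather than by any single study.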
Of course, this approach has consequences for the development of scientific
research programs. The arguments in defense of this approach are similar to
those analyzed in the case of short-term tests: it is a better way of reaching the
objectives of protecting the environment and public health.
The American regulatory agencies have changed the standards of proof
from the 1970s until now. The guidelines published by the EPA in 1976 for
the assessment of carcinogenic substances considered that evidence
in humans (epidemiological studies) was fundamental to the identification of
carcinogens and to the establishment of the dose-response relationship. The
1983 NRC report insisted on the relevance of human evidence, but it recognized
the difficulty of obtaining and interpreting it, and acknowledged that in many
cases it was necessary to appeal to data coming from bioassays. In 1985 the
Office of Science and Technology Policy analyzed the problems presented by
epidemiological studies for establishing causal relationships. In the guidelines
of 1986, the EPA recommended carrying out global evaluations of the evidence
coming from epidemiological studies, bioassays and other information coming from
36
TICKNER, J. A., “A Map toward Precautionary Decision Making,” p. 169.
37
Some analysts who defend this approach consider that the required weight of evidence must be inversely
proportional to the possible magnitude of the damage. That is, the higher the possible harm, the less
information would be required to take a decision. This proposal is therefore based on a valuation
of the social consequences of the application of an epistemic value like accuracy.
Metascientific Analysis and Methodological Learning in Regulatory Science 99
short-term tests and information concerning the relationship between chemical structure and
biological activity. In 1996, the EPA recommended considering all the available
evidence: human and animal evidence as well as supplementary evidence.
there will be mistakes and which mistakes the process is designed to avoid is an
important normative issue. One must make a policy decision (or decisions) about
the degree of accuracy and the importance of the risks to be prevented. Explicit
discussion of alternatives to conventional risk assessment and risk management
practices of scientists should occur because many current practices may frustrate
preventive goals.” 39
The first problem presented by epidemiological studies is the
identification of diseases. This first step depends on the quality of the gathered
medical information as well as on the quality of the information that can be obtained
during the study. The characteristics of some diseases, like long periods of
latency, hinder their identification. When a disease has been identified, it must
be related causally to the presence of some substance. The problem that appears
here is related to the inference of causality from statistical data. The effect of
a substance can be a small variation in the normal rate of deaths from cancer,
one that may not be statistically significant. In some cases a relationship cannot
be established because the sensitivity of the epidemiological study simply does
not allow for it.40
Let us suppose that it has been possible to establish a relationship between
the substance and the disease. Then it is necessary to estimate the dose-response
relationship. As with the identification of disease, the problems begin
with the compilation of information. Information is needed on the source of
contamination, the route of exposure (air, soil, water, etc.), the transport
pathways in each medium, physical and chemical transformations, the route
of entry into the organism, the intensity and frequency of exposure, and
the patterns of spatial and temporal concentration.41 Then there are other,
simpler cases: epidemiological data coming from accidents. In such cases, it
is possible that the epidemiological data leave no doubt regarding the
causal relationship between exposure to high doses of a particular substance
and a serious disease. The problem is to determine, from these epidemiological
data, what happens in the case of low doses, i.e., to extrapolate from high doses
to low doses. Here the data are normally compatible with
different mathematical models of extrapolation: the theoretical models are
underdetermined by the available evidence.42
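This underdetermination can be made concrete with two toy dose-response models, both fitted exactly to a single invented high-dose observation; the numbers are mine, not taken from any study:

```python
# One invented high-dose observation: dose 10 (arbitrary units) -> excess risk 0.10.
# Both models reproduce it, yet they disagree completely about low doses.

def linear_no_threshold(dose):
    """Risk proportional to dose, fitted so that f(10) == 0.10."""
    return 0.01 * dose

def threshold_model(dose, threshold=5.0):
    """No effect below the threshold, linear above it; also f(10) == 0.10."""
    return 0.0 if dose <= threshold else 0.02 * (dose - threshold)

for dose in (10.0, 1.0, 0.1):
    print(dose, linear_no_threshold(dose), threshold_model(dose))
# The models agree at the observed dose, but at dose 1.0 the linear model
# still predicts an excess risk while the threshold model predicts none.
```

No amount of high-dose data of this kind can decide between the two curves; only further assumptions about mechanism can.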
Similar problems show up in studies with laboratory animals. Heather Douglas
has analyzed research on the relationship between dioxins and liver cancer in
laboratory rats. The study was carried out in 1978 and was used for the regulation of
the dioxin levels in the environment. The samples taken from laboratory animals’
livers were re-evaluated twice, in 1980 and in 1990. The classification of these
39
CRANOR, C., “The Normative Nature of Risk Assessment: Features and Possibilities,” p. 128.
40
Cf. M AYO, D. G., “Sociological versus Metascientific Views of Risk Assessment,” pp. 267-275.
41
Cf. TESH, S. N., Uncertain Hazards. Environmental Activists and Scientific Proof, pp. 35-36.
42
Cf. LOPEZ CEREZO, J. A. and LUJAN, J. L., Ciencia y Política del riesgo, pp. 107-114 and 119-130.
samples as cancerous lesions (benign or malignant) was different each of the three
times (the number of cancers found in 1978 was higher than in 1990). As Douglas
points out, “the judgment of whether a tissue sample has a cancerous lesion or
not has proven to be more subtle than one might initially suppose.” 43 Also, in the
bioassay studies, problems show up related to the models of extrapolation, as in
the case of the epidemiological studies. Other indeterminacies are related to the
extrapolation from animals to humans.
An issue which has been studied in depth by analysts of risk assessment
is the greater concern among scientists for false positives than for false negatives
(Cranor, Shrader-Frechette, Douglas). This tendency is the methodological
translation of the search for accuracy, and its goal is to avoid asserting a false
hypothesis. As I have already pointed out, it is impossible to reduce false negatives
and false positives at the same time: the greater concern for false positives
translates into an increase in false negatives. This concern is summed up in the
studies’ choice of the levels of statistical significance, which reflects a decision
about which kind of error a scientist is willing to tolerate more easily. These
methodological decisions have consequences for regulation, and therefore
have social consequences. False positives lead to over-regulation, while
false negatives lead to under-regulation. In general, the social costs of
under-regulation are higher than those of over-regulation.44 Several authors
have defended the idea that scientists involved in risk assessment must bear in
mind the social consequences of their methodological decisions. In other words,
they should also pay attention to practical values.
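The trade-off can be seen in the arithmetic of a simple one-sided z-test; the effect size and sample size below are invented for the illustration:

```python
from statistics import NormalDist

norm = NormalDist()

def error_rates(alpha, effect, n):
    """One-sided z-test of 'no excess incidence'. The false-positive rate is
    alpha by construction; the false-negative rate is the probability of
    missing a real standardized effect of the given size at sample size n."""
    z_crit = norm.inv_cdf(1 - alpha)                  # rejection threshold
    false_negative = norm.cdf(z_crit - effect * n ** 0.5)
    return alpha, false_negative

# Tightening the significance level buys fewer false positives at the
# price of more false negatives (invented effect size and sample size).
for alpha in (0.10, 0.05, 0.01):
    fp, fn = error_rates(alpha, effect=0.1, n=400)
    print(f"alpha={alpha:.2f}  false positives={fp:.2f}  false negatives={fn:.2f}")
```

As alpha is lowered, the false-negative rate rises monotonically, which is the mathematical fact behind Cranor's point that guarding against false positives is not a socially neutral choice.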
Attention to practical values has been an explicit policy of regulatory
agencies in their elaboration of guidelines for choosing among different dose-
response models. In studies on non-carcinogenic substances, models with
thresholds are used, below which effects are not observable. In the case of
carcinogenic substances, however, models without thresholds are used, assuming
that even small doses can cause alterations in the DNA that could
lead to the emergence of tumors. In the debates that have taken place during the
last few years regarding the new EPA guidelines for the assessment of carcinogenic
substances, specific models of extrapolation have been explicitly advocated as
being scientifically defensible as well as protective of public health.
43
DOUGLAS, H. “Inductive Risk and Values in Science,” p. 571.
44
In general, although not always. This depends on the possible use of the product and the magnitude
of the risks. Other considerations would be relative to who benefits from the product and who suffers
the risks.
evidence required for regulation (standard of proof) and the rules of inference
(in the identification of risks as well as in the establishment of the dose-response
relationship).
It is not the aim of this work to carry out an evaluation of such changes and
proposals. My aim has been to show that such changes and proposals take place
in a learning process relative to the best ways of reaching the goals of protecting
public health and the environment from the risks introduced by products and
productive processes. This learning process has taken place on the basis of the
cumulative experience with respect to the characteristics of toxic substances
and their interactions with human beings and ecosystems, the epistemic
characteristics of scientific research on those toxins and their interactions, and
the effects of the different regulatory systems regarding the protection of health
and of the environment.
In all the cases analyzed, the dynamic has been similar to that proposed in
Laudan’s reticulated model: the learning process is the product of the interaction
among substantive elements of scientific knowledge (in this case statements
about the health and ecological consequences of products and productive
processes), methodological rules and the goals of research. This model is
appropriate if we keep in mind that the goals of risk assessment are to provide
knowledge useful for protecting public health and the environment. That is,
if we accept in some sense the modification of Laudan’s model proposed by
Shrader-Frechette, and the considerations about risk assessment spelled out by
authors like Cranor and Douglas.
Laudan, however, would not agree with these modifications.45 The only values
and goals that he considers are epistemic ones, and in that sense he does not
distinguish between applied science and academic science. Science is mainly a
matter of belief, and in this realm the relevant issue is to compare the different
theoretical alternatives and to choose the best one according to certain epistemic
values (e.g., preferring the theory best supported by empirical tests).
From a classic point of view, whenever scientific knowledge is used to advise
actions (e.g., regulations or public policies in general), it is necessary to carry out
calculations of the social costs of false positives and false negatives. Therefore, it
is necessary to distinguish between knowledge and its applications, in this case
between risk assessment and risk management. This is the classic point of view
according to which the consideration of non-epistemic values in the research
process distorts scientific knowledge.
Some authors have documented historically the influence of external factors
or non-epistemic values on risk assessment.46 But the classic point of view must
45
Cf. LAUDAN, L., The Book of Risks, pp. 2-42.
46
Cf. JASANOFF, S., The Fifth Branch. Science Advisers as Policymakers, ch. 11, pp. 229-250; and
TESH, S. N., Uncertain Hazards. Environmental Activists and Scientific Proof, ch. 2, pp. 24-39.
6. BIBLIOGRAPHY
CRANOR, C., Regulating Toxic Substances. A Philosophy of Science and the Law,
Oxford University Press, New York, 1993.
CRANOR, C., “The Social Benefits of Expedited Risk Assessment,” Risk Analysis,
v. 15, (1995), pp. 353-358.
CRANOR, C., “The Normative Nature of Risk Assessment: Features and
Possibilities,” Risk: Health, Safety and Environment, v. 8, (1997), pp. 123-136.
CRANOR, C., “Conocimiento experto y políticas públicas en las sociedades
tecnológicas. En busca del apoyo científico apropiado para la protección de la salud
47
M AYO, D. G., “Sociological versus Metascientific Views of Risk Assessment,” pp. 275-276.
Kristin Shrader-Frechette
swimming in the river). If I did not spend these 2 hours to prevent the child’s
death, when I could do so at no great sacrifice on my part, arguably I would
be morally reprehensible. Now consider a second case. Suppose I know that
the same river supplies drinking water for East Chicago; that nearby children
(who drink the water) exhibit statistically significant increases in fatal
neurological injuries; and that, by donating only 2 hours of my time to a local
nongovernmental organization (NGO) dedicated to river cleanup, I could help
prevent one child’s death from the neurotoxin. If I would be wrong for not
trying to prevent the first child’s death, wouldn’t I also be wrong for not trying
to prevent this second child’s death? Each of us faces cases like this, again and
again, in many areas of our lives. Yet most of us do nothing about it.
One reason is that reality is rarely as clear as these examples. On the one
hand, there are many uncertainties, including how to apportion collective
responsibility, how many children would die because of the contaminated river,
how many hours of work are necessary to save one child’s life, how many other
people would join the NGO, and so on. On the other hand, there are dose-
response curves for many of the 80,000 industrial contaminants to which we are
subjected, and many developed nations have Toxic Release Inventories (TRIs),
in which each industry must reveal which, and what quantities of, toxins it
releases. Only a minimal amount of research is necessary, especially for a
university-educated person, to become aware of the magnitude of any given
threat she faces. Thus, if we in democratic nations do nothing at all, either to
become informed about technological problems, to help educate others about
them, or to be active in NGOs seeking to avoid them, when we easily could do
so, we err. Prima facie, the person who does nothing to help reform the practice
and use of science and technology is as much at fault as the person who does
nothing to protect the child swimming in the contaminated river. Why
would we be at fault?
2
Cf. ERMAN, D. and PISTER, E., “Ethics and the Environmental Biologist,” Fisheries, v. 14, no. 2,
(1989), p. 7. See also, for example, TURNER, C. and SPILICH, J., “Research into Smoking or Nicotine
and Human Cognitive Performance: Does the Source of Funding Make a Difference?,” Addiction,
v. 92, (1997), pp. 1423-1426.
How to Reform Science and Technology 109
no particular profit agenda, accounts for only 25 percent of all science.3 Given the
private control of most scientific research, it is not surprising that most of us are
ignorant about many effects of science and technology, such as those of toxic
chemicals. US government reports admit that “up to 90 percent of all cancers are environmentally
induced and theoretically preventable.” 4 If so, we need not so much a cure for
cancer as its prevention. We are literally killing ourselves through our misuse
of science and technology. Among developing nations, the need for reform is even
greater. The World Health Organization (WHO) claims that pesticides annually
kill 40,000 people in developing nations and seriously injure another 450,000.
Most of these pesticides are produced in developed countries, banned for use
there, and shipped abroad instead.5 Or consider infant formula, a misused food
technology. In 2001, after the International Baby Food Action Network argued
against aggressive and misleading advertising, marketing, and labeling of infant
formula in developing nations –by companies like Nestlé– President Bush argued
against WHO safeguards. The WHO urges exclusive breast-feeding as safer,
healthier, and cheaper for at least the first 6 months. But the US argued fiercely
against WHO curbs on misleading baby-formula advertising in poor nations. It
said it was “very anxious not to inhibit commercial activity.” 6 Bush’s behavior,
even in the baby-formula case, suggests why industry and government alone
cannot do the job of reforming science and technology.7 Without consumer and
public pressure for reform, companies that are more responsible (than Nestlé,
for example) and that accept WHO norms could destroy themselves financially.
Their higher standards, as in the infant formula case, could make them unable
to compete with less scrupulous companies. To expect firms to introduce safer
science and technology, and thus risk being undercut financially by less scrupulous
3 Cf. BEDER, S., Global Spin: The Corporate Assault on Environmentalism, Chelsea Green Books, White River Junction, 2002, pp. 17-49, 63-71, 141ff and 161ff. Information on lobbying statistics from Dick Armey, House of Representatives Majority Leader, is taken from ARMEY, D., “Washington’s Lobbying Industry, Appendix: Measuring the Lobbying Industry,” obtained at www.flattax.gov, January 5, 2002. See also The Center for Responsive Politics at www.opensecrets.org/lobbyists/index.asp for information on the lobbying industry and amounts spent by various industries on lobbyists. Legislative data on the effects of lobbyists, campaign contributions, and PACs is from BOX-STEFFENSMEIER, J. and GRANT, J., “All in a Day’s Work: The Financial Rewards of Legislative Effectiveness,” Legislative Studies Quarterly, v. 25, no. 4, (1999), pp. 511-524. See also COMMON CAUSE, “Why People Who Value Families Should Care About Campaign Finance Reform,” obtained at www.commoncause.org, Jan. 2, 2003. Additional lobbying information is from ANSOLABEHERE, S., SNYDER Jr., J. and TRIPATHI, M., “Are PAC Contributions and Lobbying Linked? New Evidence from the 1995 Lobby Disclosure Act,” Business and Politics, v. 4, no. 2, (2002), pp. 131-156.
4 US OFFICE OF TECHNOLOGY ASSESSMENT, Assessment of Technologies for Determining Cancer Risks from the Environment, US OTA, Washington, DC, 1981.
5 Cf. MATHEWS, J., World Resources 1986, Basic Books, New York, 1986, pp. 48-49. See also REPETTO, R., Paying the Price: Pesticide Subsidies in Developing Countries, Research Report Number 2, World Resources Institute, Washington, DC, December 1985, p. 3.
6 YAMEK, G., “Pop Musicians Boycott Promotion,” British Medical Journal, v. 322, no. 7280, (2001), p. 191. See also EXETER, P. B., “Campaigners for Breast Feeding Claim Partial Victory,” British Medical Journal, v. 322, no. 7280, (2001), p. 191.
7 See SHUE, H., “Exporting Hazards,” in BROWN, P. and SHUE, H. (eds.), Boundaries: National Autonomy and Its Limits, Rowman and Littlefield, Totowa, 1981, pp. 130ff, for a similar argument.
110 Science, Technology and Society: A Philosophical Perspective
In omitting abiotic protection, the ICRP errs because it ignores what is most
easily, reliably, and empirically measured (air and water) and what serves as the
“early-warning signal” for high species doses. Its omitting ecosystem-level risks is
problematic because state-of-the-art ecological risk assessment (ERA) includes
two different levels of methods, the toxicological and the systems level. And its
requiring modeled, not measured, doses to reference species is scientifically
flawed because model results would be almost totally dependent on extrapolations
chosen by the modeler, a scientist usually employed by the radiological polluter.
There are no empirical checks and balances; no replication of results; and no
escape from subjective, nonempirical models because estimates will be only
those the modeler judges “likely,”13 not those based on explicit confidence levels,
with statistically measurable uncertainty bounds.
The ICRP’s basing all its environmental protections on doses to some
arbitrarily chosen “reference species” likewise is scientifically indefensible
because it gives no scientific definition of “reference species”; they are simply
species about which modelers have the most information. In using reference
species, the ICRP arguably sanctions science that amounts to the drunk looking
for his watch under the streetlight. Why does the drunk look for his watch under
the streetlight? Not because he lost his watch there, but because that is the only
place he can see. Why does the ICRP sanction use of reference species? Not
because they are species that are important for radiation protection, but because
they are species about which we know something.
Obviously the ICRP recommendations are flawed in the way they do
science, but they also err ethically in the way they use science to defend
regulations. There is a representativeness bias, because all members of the ICRP
committee were chosen, not by independent experts, but by those industrially
and governmentally responsible for radiation protection; because virtually
all members of the committee had done research only on toxicological, not
ecosystem, ERA; and because virtually all members had already written
articles, usually for their nuclear-industry employers, in support of modeled,
rather than measured dose. There also were violations of procedural justice,
because the pro-nuclear chair of the committee, from Sweden, allowed no votes
from the five committee scientists.
When the US member requested basing all recommendations on the best
science available from top refereed journals, the chair instead defended using
mainly nonrefereed “gray” literature (published by industrial and private
groups). When the American committee member asked the committee to require
uncertainty analysis of estimated doses, the chair simply removed (from the
13 Cf. INTERNATIONAL COMMISSION ON RADIOLOGICAL PROTECTION (ICRP), A Framework for Assessing the Impact of Ionising Radiation on Non-Human Species, reference 02-305-02, ICRP, Vienna, 2003, paragraph 119.
report) the written admission that no uncertainty analysis was done. When the
US member asked for peer-review of the document, the committee chair asked
for comments on the draft, posted on the ICRP website, but only the committee
chair had access to the comments. When the US member called for a vote on
the document, both the chair and the ICRP told her the ICRP did not vote. The
draft document, in essentially the same form as the original draft, was published
in 2003. It was published in a deliberately misleading way, listing all committee
names, but without acknowledging that some members did not support it.
What will happen when international scientific protections, like these, rely
merely on models, not measurements? On gray literature, not the best scientific
journals? On a largely nontransparent monitoring system controlled mainly by
those who use (and profit from) nuclear pollution?14 The most obvious effect is
that it will be easier and cheaper to pollute and yet not violate the law. US nuclear
weapons cleanup will cost a trillion dollars; throughout the world, hundreds of
reactors must be expensively decommissioned; and throughout the world, millions
of nuclear workers and atomic veterans are loudly demanding compensation. It
will be cheaper for government and industry to address these problems, if they
have the flawed, nonempirical, nontransparent ICRP norms.
omissions, we are all complicit. Many scientists are minimalists because they
are communally challenged and relationally challenged. Yet most of us would
not say, on being called at work after our child was seriously hurt at
school, “I’m too busy to go to the hospital. I’m a scientist, and I don’t have time
for those ‘outside’ activities. I make my social contribution through my science.”
Such an answer would be appalling. But such an answer also is appalling in
response to the need for each of us to engage in continually working to reform
science and technology. It also sounds like the attitude of the Prussian Academy of
Sciences, when it universally condemned Albert Einstein in 1933, for criticizing
Hitler’s violations of civil liberties. The academy said science required Einstein
to remain neutral.20 Ethics does not always dictate which side one should take, as
in Einstein’s case, but it does dictate that we all have a moral responsibility to investigate,
to be critical. People don’t have the right to enjoy benefits of membership in the
scientific or philosophical community and, at the same time, claim the right to be
apolitical when that community is misrepresented or fails to do its job.
the East LA air and water. The noxious chemicals explain why only poor Latinos,
like Gutierrez, live in East LA. The median annual income is about one-third of
the US average, and unemployment is 33 percent. Yet Gutierrez and other mothers
and grandmothers were angry that, despite the disproportionate, life-threatening
pollution in East LA, officials wanted to place an above-ground oil pipeline,
another hazardous-waste storage site, and another toxic-waste incinerator in their
neighborhood. The proposed incinerator was slated to burn 125,000 pounds of
hazardous materials per day, including used motor oil and industrial sludge.23
In 1986 Gutierrez joined Aurora Castillo to co-found MELA, Mothers of East
Los Angeles. To inform the Hispanic community about technological threats to
the neighborhood, Gutierrez used her most available network: people streaming
out of Sunday Mass. Through church leafleting, she and other Latina mothers
advertised for protest marches, held every Monday. Pushing baby strollers and
wearing white kerchiefs to symbolize nonviolence, MELA members became
a formidable force.24 Eventually the men began to help with the action. They
carried signs calling themselves the “chauffeurs” of the mothers.
From their church, MELA protestors walked, every week, more than a mile to
the gates of the $20 million incinerator project. As they marched, they chanted:
“El pueblo parará el incinerador!” (The people will stop the incinerator!) “Pueblo
que lucha triunfa!” (People who fight win.) The facility owners had sited it in
East LA because they said residents would not fight. Yet Gutierrez and MELA
fought through 6 years of agitation, 4 lawsuits, 16 hearings, and 6-mile-long
protests. Finally, in June 1991, the Mothers passed around cookies among their
400 members to celebrate cancellation of the incinerator. Soon after, MELA
began a lead-poisoning education project that now employs 10 youths. Defying
“a system that penalizes low-income communities,” Gutierrez and MELA have
dispelled the myth that poor people do not care about technological threats to
their health.25
Why was Gutierrez so successful? She recognized the importance of
collective action. Duties to reform science, like duties to clean up the air in
Gutierrez’s neighborhood, require massive cooperation and collective action.
These obligations are not mainly owed by individuals to individuals, because
individuals cannot act alone and be successful. Instead it is arguable that people
have obligations to promote institutions and policies that aim for fair relations
23 Cf. MARTINEZ, M., “Legacy of a Mother’s Dedication,” Los Angeles Times, Section B, (September 7, 1995), p. B3; see also p. B1.
24 Cf. SCHWAB, J., Deeper Shades of Green, Random House, New York, 1994, pp. 55-58. See also “Mothers’ Group Fights Back in Los Angeles,” New York Times, Section A, (December 5, 1989), p. 32.
25 Cf. SCHWAB, J., Deeper Shades of Green, pp. 44-45; MARTINEZ, M., “Legacy of a Mother’s Dedication,” p. B1; “Mothers of Prevention,” Time, v. 137, no. 23, (June 10, 1991), p. 25; QUINTANILLA, M., “The Earth Mother,” Los Angeles Times, Section E, (April 24, 1995), pp. E1, E5; and MARTINEZ, M., “Legacy of a Mother’s Dedication,” p. B3. See DELLIES, H., “Group Preaches Gospel of Water Conservation,” Chicago Tribune, Section 1, (March 20, 1995), p. 3. See also GELOBTER, M., “Have Minorities Benefited? A Forum,” EPA Journal, v. 18, no. 1, (1992), pp. 32-36.
among people.26 But people can always object that they, individually, bear little
or no personal responsibility for collective problems like the practice and use
of science. Yet as one Worldwatch researcher put it: “Everyone is aboard the
same ship. The Plimsoll line carries the same meaning for all.”27 Although
everyone contributes to planetary problems, no one –acting alone– can eliminate
the most pressing civic and environmental harms. As a result, precise individual
responsibilities are not clear.
Problems of collective responsibility are illustrated, in part, by the “tragedy
of the commons.” The tragedy is that each person enhances individual gain by
misusing common resources, like scientific knowledge.28 One individual may
profit financially by driving a heavily polluting automobile or by keeping quiet
in the face of misuse of science, but the tragedy occurs because everyone loses
when someone misuses the commons, such as polluting the air we all breathe or
allowing the misrepresentation of science. Because of the tragedy of the commons,
people have a powerful incentive to be “free riders.”29 Free riders are those who
gain benefits from everyone’s contributing to collective goods, like clean air or
scientific progress, even when they do not themselves contribute.
In the case of philosophers, scientists, and other professionals –all who
enjoy special abilities, roles, and circumstances– the duty not to be a free rider
and to practice collective responsibility is greater than that for people without
such abilities and roles. This and the next paper argue that, provided people use
the criteria for informed, inclusive, deliberative, and critical reform, outlined
subsequently, their behavior will be more ethically defensible than would their
neutrality. Also, given the power of vested interests, the world is like a giant
soccer match, with one team representing the public interest, including science,
and the other team representing private interests. Often the public-interest team
has too few players. Often even government regulators and agencies are recruited
to play on the team representing private interests. As a result, the public-interest
team often has to run uphill to make a goal. Often the contest is not fair. Because
the playing field of government, industry, and society often is tilted, and because
particular individuals, working together, can help to make it level, all citizens
share some collective responsibility to do so.
One difficulty with affirming duties to reform science, however, is that often
these obligations are collective. Ultimate moral responsibility for advocacy
26 See notes 31-42.
27 POSTEL, S., “Carrying Capacity: Earth’s Bottom Line,” in BROWN, L. ET AL. (eds.), State of the World 1994, Norton, New York, 1994, p. 21.
28 See HARDIN, G., “The Tragedy of the Commons,” Science, v. 162, (1968), pp. 1243-1248; and SWANSON, T. and SANTOPIETRO, G., “The Economics of Environmental Degradation: Tragedy for the Commons,” Journal of Economic Issues, v. 32, no. 3, (1998), pp. 878-880.
29 See SHRADER-FRECHETTE, K., Environmental Ethics, Boxwood Press, Pacific Grove (CA), 2nd ed., 1991, pp. 165 and 185. See also CONDREAN, C., “Sidney Godolphin and the Free Rider,” Business and Professional Ethics Journal, v. 17, no. 4, (1998), pp. 5-19; and STROUP, R. L., “Free Riders and Collective Action Revisited,” Independent Review, v. 4, no. 4, (2000), pp. 285-300.
How to Reform Science and Technology 117
structures,” for reforming science. This third reason is that we all are members of
communities, some of which are global, as the WTO so disastrously affirms. As
Hannah Arendt put it, “this taking upon ourselves the consequences for things
we are entirely innocent of, is the price we pay for the fact that we live our lives
not by ourselves but...[within] a human community.”33 Her rationale for collective
responsibility is that, “as citizens we must prevent wrong-doing since the world we
all share, wrong-doer, wrong-sufferer and spectator, is at stake.”34 Gandhi echoed
a similar theme. Community interconnectedness creates responsibility for other
members of the community: “Whenever I live in a situation where others are in
need... whether or not I am responsible for it, I have become a thief.”35
Emphasizing that social groups enable individuals to do more harm and more
good than they could otherwise do, Larry May says communities also create
more responsibility for those whose lives are woven into the fabric of the group
itself. Certainly, as public intellectuals, our lives are more woven into the fabric
of society than the lives of those who are not public intellectuals. May argues,
correctly, that the benefits of community membership accrue only at the cost
of increased responsibility on the part of the members: group membership is a
source of both benefits and responsibilities, and greater benefits are not possible
without greater responsibilities. Group membership creates heightened moral
duties, in part because groups often are able to transform individual values,
and individuals often are able to transform group values. Psychological studies
show, for example, that racism is in part the result of socialization and interaction
within certain kinds of social groups. But if groups influence individuals, and
individuals influence groups,36 then individuals bear some responsibility for
group actions and omissions. As Joel Feinberg warns:
“No individual person can be blamed for not being a hero or a saint...but a
whole people can be blamed for not producing a hero when the times require
it, especially when the failure can be charged to some discernible element in
the group’s “way of life” that militates against heroism.”37
Confronting the evils of Nazism, German and French philosophers, such as
Jean-Paul Sartre, Karl Jaspers, and Hannah Arendt, also grounded collective
responsibility in community and interdependence. Jaspers writes:
“There exists a solidarity among men as human beings that makes each
co-responsible for every wrong and every injustice in the world, especially
for crimes committed in his presence or with his knowledge. If I fail to do
33 Quoted in MAY, L., Sharing Responsibility, The University of Chicago Press, Chicago, 1992, p. xi.
34 Quoted in MAY, L., Sharing Responsibility, p. xi. See also HART, J., “Hannah Arendt: The Care of the World and of the Self,” in DRUMMOND, J. (ed.), Phenomenological Approaches to Moral Philosophy, Kluwer, Dordrecht, 2002.
35 Cited in GOULET, D., The Cruel Choice, Atheneum, New York, 1971, p. 133.
36 Cf. MAY, L., Sharing Responsibility, p. 152.
37 FEINBERG, J., Doing and Deserving, Princeton University Press, Princeton, 1970, p. 248.
whatever I can to prevent them, I too am guilty. If I was present at the murder
of others without risking my life to prevent it, I feel guilty in a way not
adequately conceivable either legally, politically or morally.”38
Jean-Paul Sartre echoes a similar theme. “If someone gives me this world with
its injustices, it is not so that I may coolly contemplate them but so that I may
animate them by my indignation, expose them and show their nature as injustices,
that is, as abuses to be suppressed.”39
In order to expose injustices for what they are, metaphysical guilt forces
people to reassess who they are. To avoid moral guilt for great social harms people
must sometimes change who they are. They must become more virtuous, more
authentic, and create themselves in new ways. Authenticity consists, in part, of
accepting responsibility for the harms committed by the group to which people
belong. No matter how restricted people’s options are, Sartre believes they can
choose authenticity. At least they have the choice of what stance to adopt toward
the injustices around them. Jaspers says people choose in “individual solitude”
to transform their approach to the world,40 to transform their attitude, character,
and perhaps their behavior. People cannot choose their parents or nationality, for
example, but they can choose their attitudes toward them.
According to Jaspers’ account of metaphysical guilt, people do not have
responsibility merely for their conscious intentions and deliberations. Instead,
as Aristotle noted, because they have partial control over their attitudes, virtues,
and character, they also are responsible for who they are and become. One way
for people to exercise control over their characters is to be sensitive to how their
attitudes affect others. And attitudes toward science affect others in powerful
ways, as contemporary public health and environmental problems reveal. As
Aristotle notes: “While no one blames those who are ugly by nature, we blame
those who are so owing to want of exercise and care.”41
In short, following Aristotle and May, insofar as people share in the
production of an attitudinal climate, they participate in some group
that increases or decreases harm. Collective responsibility is not an important
theme in contemporary postmodernism, which tends to be nihilistic. Nor is it
an important theme in contemporary analytic philosophy, which tends to be
38 JASPERS, K., The Question of German Guilt, trans. E. Ashton, Capricorn Books, New York, 1961, p. 36.
39 SARTRE, J. P., What Is Literature?, trans. Bernard Frechtman, Methuen, London, 1950, p. 45. See also FORREST, P., “Collective Responsibility and Restitution,” Philosophical Papers, v. 27, no. 2, (1998), pp. 79-91.
40 Cf. SARTRE, J. P., Anti-Semite and Jew, trans. George J. Becker, Schocken Books, New York, 1965, p. 90; MAY, L., Sharing Responsibility, pp. 146-151. See also JASPERS, K., The Question of German Guilt, p. 74; ADAMS, R., “Involuntary Sins,” Philosophical Review, v. 94, (1985), pp. 3-27; and MANGIN, M., “Character and Well Being,” Philosophy and Social Criticism, v. 26, no. 2, (2000), pp. 79-98.
41 ARISTOTLE, Nicomachean Ethics, Edition by Terence Irwin, Hackett, Indianapolis (IN), 1985, Book III, Chapter 5.
45 Cf. COLWELL, R., “Natural and Unnatural History,” in SHEA, W. and SITTER, B. (eds.), Scientists and Their Responsibility, Watson, Canton, 1989, p. 17; and VON HIPPEL, F. and PRIMACK, J., “Public Interest Science,” Science, v. 177, no. 4055, (1972), p. 1169.
46 YAMEK, G., “Pop Musicians Boycott Promotion,” British Medical Journal, v. 322, no. 7280, (2001), p. 191. See also EXETER, P. B., “Campaigners for Breast Feeding Claim Partial Victory,” British Medical Journal, v. 322, no. 7280, (2001), p. 191.
47 Cf. MAY, L., Sharing Responsibility, pp. 142-145.
48 Cf. BARNES, D. and BERO, L., “Why Review Articles on the Health Effects of Passive Smoking Reach Different Conclusions,” Journal of the American Medical Association, v. 279, (1998), pp. 1566-1570.
49 Cf. ROCHON, P., GURWITZ, J., SIMMS, R., FORTIN, P., FELSON, D., MINAKER, K., and CHALMERS, T., “A Study of Manufacturer-supported Trials of Nonsteroidal Anti-inflammatory Drugs in the Treatment of Arthritis,” Archives of Internal Medicine, v. 157, (1994), pp. 157-163.
public interest, such as misuse of science and technology. One of the main
reasons for professionals’ duties to society is citizens’ related rights to free
informed consent to decisions affecting their welfare. To help ensure this consent,
professionals must communicate openly with the public, especially regarding
science and technology.52 If they do not, industrial and political leaders can “get
away with” whatever they wish. Scholars and other professionals also have duties
to help reform science-related institutions, of which they are a part, because their
economic, political, and intellectual power helps control much of what happens in
society.53 Given a sophisticated, technocratic society; given professionals’ special
knowledge; and given their near-monopoly over their intellectual services,
professionals’ great power “enlarges the significance of sins of omission.”54
Virtually all codes of professional ethics also recognize a responsibility for
the common good.55 The American Institute of Biological Sciences, or AIBS
code, for example, requires biologists to expose fraud, professional misconduct,
and conflicts of interest, and to promote open exchange.56 And in most codes, public
responsibilities receive the highest priority, in part because they are enjoined by
role responsibilities.57 Just as parents, medical doctors, teachers, and so on, have
certain responsibilities in society, by virtue of their roles, so also professionals
have special responsibilities because of their roles and their corresponding
trusteeship duties to society.
Even apart from such roles, duties to protect society (by helping to reform
science and technology) are part of the Good Samaritanism required of all
citizens. In the 1800s, Portugal, the Netherlands, and Italy had laws requiring
citizens to undertake the “easy rescue” of others. After 1900, Norway, Russia,
Turkey, Denmark, Poland, Germany, Romania, France, Hungary, Czechoslovakia,
Belgium, Switzerland, and Finland also added similar statutes. In striking
contrast, says Joel Feinberg, English-speaking countries have remained apart
from the European consensus. They have not punished even harmful omissions
of an unethical kind. He believes the Europeans are right, that people ought to be
52 Cf. AMERICAN ASSOCIATION FOR THE ADVANCEMENT OF SCIENCE (AAAS), Principles of Scientific Freedom and Responsibility, Revised Draft, AAAS, Washington, DC, 1980, pp. 1 and 6. Some of this discussion of professional responsibility is based on SHRADER-FRECHETTE, K., Ethics of Scientific Research, pp. 64-67.
53 Cf. BAYLES, M., Professional Ethics, p. 4. See WUESTE, D. (ed.), Professional Ethics and Social Responsibility, Rowman and Littlefield, Lanham, 1994.
54 CAMENISCH, P., “On Being a Professional, Morally Speaking,” in BAUMRIN, B. and FREEDMAN, B. (eds.), Moral Responsibility and the Professions, Haven Press, New York, 1982, p. 43.
55 Cf. BAYLES, M., Professional Ethics, pp. 94, 109. See AMERICAN SOCIETY OF BIOLOGICAL CHEMISTRY (ASBC), Bylaws, American Society of Biological Chemistry, 1977; NATIONAL SOCIETY OF PROFESSIONAL ENGINEERS (NSPE), “Criticism of Engineering in Products, Board of Ethical Review, Case No. 67.10,” in BAUM, R. and FLORES, A. (eds.), Ethical Problems in Engineering, pp. 64-72; and OFFICE OF GOVERNMENT ETHICS, Standards of Ethical Conduct for Employees of the Executive Branch: Executive Order 12674-Principles of Ethical Conduct for Government Officers and Employees, Internal Revenue Service, Washington, DC, 1993, p. 35042.
56 Cf. SHRADER-FRECHETTE, K., Ethics of Scientific Research, pp. 42-44, 72 and 78-84.
57 Cf. SHRADER-FRECHETTE, K., Ethics of Scientific Research, pp. 63-80.
If the previous arguments are correct, one of the most urgent evils in society
is that, through our misuse of science and technology, we are both killing people
and misleading them about the causes of these fatalities. Particulates from fossil
fuels alone cause hundreds of premature deaths worldwide, mostly among the most
vulnerable, like children. Yet those of us who do not read government and medical
reports do not realize the consequences of our burning fossil fuels.
Reforming science and technology, however, does not require us to espouse
environmentalism or any other “ism,” but simply to become informed, to teach
our students to become informed, to help level the playing field of science, to
challenge those who cover up or manipulate scientific findings. Once deliberative
democracy corrects these procedural injustices in the performance and use of
science, and once people become active in creating the democracy they deserve,
democratic procedure will eliminate many substantive problems, such as pollution-induced
deaths. Ideologies, of any sort, cannot alone make them disappear.
One way to help level the scientific playing field, to help reform science and
technology, is to expose the poor science of researchers who are merely well-
funded “front groups.” The Global Climate Coalition, like the Advancement of
Sound Science Coalition, is a front group funded by the oil, automobile, chemical,
and tobacco industries to oppose signing the Kyoto Accords. Responsible Industry
for a Sound Environment is a pesticide-industry-funded group writing to discredit
right-to-know provisions in pesticide regulations. The Forest Protection Society
is funded by the logging industry to promote rainforest logging. The Wetlands
Coalition, funded by the oil and gas industry, has a logo that shows a duck flying
over a wetland, but it lobbies and writes in favor of wetlands oil and gas drilling.
In 1991, for example, Dow Chemical contributed to 10 anti-regulatory front
groups, including the American Council on Science and Health and Citizens for a
Sound Economy. Chevron, Exxon, Mobil, DuPont, Amoco, Ford, Philip Morris,
Pfizer, Monsanto, and Proctor and Gamble all contribute millions each year to
anti-regulatory, corporate front groups that pose as populist movements and that
manipulate science to make their case. If Burger King said that Whoppers were
nutritious and helped prevent heart attacks, the public might not listen. But if
the American Council on Science and Health, an industry-funded front group,
claimed its experts made this point, people might believe it, especially if they did
not know who funds the council. Using scientific-sounding names, such front
groups publish books and pamphlets arguing that pesticides do not cause cancer,
that global warming is a myth, and that saccharin is not dangerous.64
Of course, when citizens challenge biased or profit-driven science, wealthy
private interests sue them. Such harassing lawsuits have become so common that
they have a name, “SLAPPs,” “Strategic Lawsuits Against Public Participation.”
When two Londoners, Dave Morris and Helen Steel, distributed pamphlets arguing
64 See note 3.
65 Cf. BEDER, S., Global Spin: The Corporate Assault on Environmentalism, pp. 63-74.
66 Cf. BEDER, S., Ibidem, p. 215.
67 SLATER, D., “The Big Book of Bush,” Sierra, v. 87, (2002), pp. 37-47.
68 BAILEY, R., Ecoscam, St. Martin’s Press, New York, 1994.
69 Cf. MOORE, C., “Rethinking the Think Tanks,” Sierra, v. 87, (2002), pp. 56-59 and 73.
70 Cf. MOORE, C., “Rethinking the Think Tanks,” pp. 56-59 and 73.
7. CONCLUSION
If we cannot count on politicians, legislators, corporations, NGOs, and courts
to achieve balance and objectivity in doing, reporting, and using science, then
those of us who know science, logic, and ethics must do so. Reforming science
(that is, separating it from complete control by moneyed, private interests) is
difficult only because so few of us take on the task of reform. Ralph Nader
defined a real democracy as “a society where less and less courage and risk
are needed of more and more people to spread justice.”71 If everyone does the
work of democracy, then any one of us needs less courage to face the risks that
democracy demands.
71 NADER, R., “Introduction,” in ISAACS, K., Civics for Democracy, p. vi.
8. BIBLIOGRAPHY
A BBARNO, J., “Role Responsibility and Values,” Journal of Value Inquiry, v. 27,
n. 3/4, (1993), pp. 305-316.
A DAMS, R., “Involuntary Sins,” Philosophical Review, v. 94, (1985), pp. 3-27.
AMERICAN ASSOCIATION FOR THE ADVANCEMENT OF SCIENCE (AAAS), Principles of
Scientific Freedom and Responsibility, Revised Draft, AAAS, Washington, DC, 1980.
A MERICAN SOCIETY OF BIOLOGICAL CHEMISTRY (ASBC), Bylaws, American
Society of Biological Chemistry, 1977.
ANSOLABEHERE, S., SNYDER JR., J., and TRIPATHI, M., “Are PAC Contributions and
Lobbying Linked? New Evidence from the 1995 Lobby Disclosure Act,” Business
and Politics, v. 4, no. 2, (2002), pp. 131-156.
A RISTOTLE, Nicomachean Ethics, Edition by Terence Irwin, Hackett, Indianapolis
(IN), 1985.
A RLACCHI, P., Slaves: The New Traffic in Human Beings, Rizzoli, Milan, 1999.
BAILEY, R., Ecoscam, St. Martin’s Press, New York, 1994.
BARNES, D. and BERO, L., “Why Review Articles on the Health Effects of Passive
Smoking Reach Different Conclusions,” Journal of American Medicine Association,
v. 279, (1998), pp. 1566-1570.
BAYLES, M., Professional Ethics, Wadsworth, Belmont, 1981.
BEDER, S., Global Spin: The Corporate Assault on Environmentalism, Chelsea
Green Books, White River Junction, 2002.
BOX-STEFFENSMEIER, J. and GRANT, J., “All in a Day’s Work: The Financial
Rewards of Legislative Effectiveness," Legislative Studies Quarterly, v. 25, n. 4,
(1999), pp. 511-524.
CAMENISCH, P., “On Being a Professional, Morally Speaking,” in BAUMRIN, B. and
FREEDMAN, B. (eds.), Moral Responsibility and the Professions, Haven Press, New
York, 1982, pp. 42-61.
COLWELL, R., “Natural and Unnatural History,” in SHEA, W. and SITTER, B. (eds.),
Scientists and Their Responsibility, Watson, Canton, 1989, pp. 1-40.
CONDREAN, C., “Sidney Godolphin and the Free Rider,” Business and Professional
Ethics Journal, v. 17, n. 4 (1998), pp. 5-19.
DE RUYTER, D., “The Virtue of Taking Responsibility,” Educational Philosophy
and Theory, v. 34, n. 1, (2002), pp. 25-35.
DRAZEN, J., “Institutions, Contracts, and Academic Freedom,” New England
Journal of Medicine, v. 347, (2002), pp. 1362-1363.
EDMONDSON, G. ET AL., "Workers in Bondage," Business Week, n. 3709, (2000),
pp. 146-155.
ELLIOT, D., “Renewable Energy and Sustainable Futures,” Futures, v. 32, n. 3/4
(2000), pp. 261-274.
How to Reform Science and Technology 129
EPSTEIN, S., The Politics of Cancer Revisited, East Ridge Press, Fremont Center,
1998.
ERMAN, D. and PISTER, E., “Ethics and the Environmental Biologist,” Fisheries,
v. 14, n. 2, (1989), pp. 4-7.
EXETER, P. B., “Campaigners for Breast Feeding Claim Partial Victory,” British
Medical Journal, v. 322, n. 7280, (2001), p. 191.
FEINBERG, J., Doing and Deserving, Princeton University Press, Princeton, 1970,
p. 248.
FEINBERG, J., “The Moral and Legal Responsibility of the Bad Samaritan,”
Criminal Justice Ethics, v. 3, n. 1, (1984), pp. 55-66.
FORREST, P., “Collective Responsibility and Restitution,” Philosophical Papers, v.
27, n. 2, (1998), pp. 79-91.
GELOBTER, M., “Have Minorities Benefited? A Forum,” EPA Journal, v. 18, n. 1,
(1992), pp. 32-36.
GOLDBERG, K., “Efforts to Prevent Misuse of Pesticides Exported to Developing
Countries,” Ecology Law Quarterly, v. 12, n. 4, (1985), pp. 1025-1051.
GEWIRTH, A., Human Rights, The University of Chicago Press, Chicago, 1982.
GOULET, D., The Cruel Choice, Atheneum, New York, 1971.
HARDIN, G., "The Tragedy of the Commons," Science, v. 162, (1968), pp. 1243-
1248.
HART, J., "Hannah Arendt: The Care of the World and of the Self," in DRUMMOND,
J. (ed.), Phenomenological Approaches to Moral Philosophy, Kluwer, Dordrecht,
2002, pp. 15-45.
INTERNATIONAL COMMISSION ON RADIOLOGICAL PROTECTION (ICRP), A Framework
for Assessing the Impact of Ionizing Radiation on Non-Human Species, reference 02-
305-02, ICRP, Vienna, 2003.
INTERNATIONAL COMMISSION ON RADIOLOGICAL PROTECTION (ICRP),
Recommendations, ICRP publication 60, Pergamon, Oxford, 1991.
ISAACS, K., Civics for Democracy, Essential Books, Washington, DC, 1992.
JASPERS, K., The Question of German Guilt, trans. E. Ashton, Capricorn Books,
New York, 1961.
KITCHER, PH., Science, Truth, and Democracy, Oxford University Press, New
York, 2001.
LADD, J., “Philosophical Remarks on Professional Responsibility in
Organizations,” in BAUM, R. and FLORES, A. (eds.), Ethical Problems in Engineering,
Center for the Study of the Human Dimensions of Science and Technology, Troy
(NY), 1980, pp. 193-194.
LEWIS, H., “The Non-Moral Notion of Collective Responsibility,” in FRENCH, P.
(ed.), Individual and Collective Responsibility, Schenkman Books, Rochester, 1972,
pp. 119-131.
LILIENFELD, A., LEVIN, M. and KESSLER, I., Cancer in the United States, Harvard
University Press, Cambridge, 1972.
LOOMIS, T., “Indigenous Populations and Sustainable Development,” World
Development, v. 28, n. 5, (2000), pp. 893-910.
MANGIN, M., “Character and Well Being,” Philosophy and Social Criticism, v. 26,
n. 2, (2000), pp. 79-98.
MARGOLIS, J., "On the Ethical Defense of Violence and Destruction," in HELD, V.,
NIELSEN, K. and PARSONS, C. (eds.), Philosophy and Political Action, Oxford
University Press, New York, 1972, pp. 52-71.
MARTIN, M., Meaningful Work: Rethinking Professional Ethics, Oxford University
Press, New York, 2000.
MATHEWS, J., World Resources 1986, Basic Books, New York, 1986.
MAY, L., The Morality of Groups, University of Notre Dame Press, Notre Dame
(IN), 1987.
MAY, L., Sharing Responsibility, The University of Chicago Press, Chicago,
1992.
MCGINN, A., “Phasing Out Persistent Organic Pollutants,” in BROWN, L., FLAVIN,
C. and FRENCH, H. (eds.), State of the World 2000, Norton, New York, 2000, pp. 103-
133.
MELLEMA, G., Collective Responsibility, Rodopi, Amsterdam, 1997.
MILLER, S., “Collective Responsibility,” Public Affairs Quarterly, v. 15, n. 1,
(2001), pp. 65-82.
MOORE, C., “Rethinking the Think Tanks,” Sierra, v. 87, (2002), pp. 56-59.
MUSIL, R., “Political Science on Federal Advisory Panels,” Physicians for Social
Responsibility Reports, v. 24/25, (2003), p. 3.
NADER, R., “Introduction,” in ISAACS, K., Civics for Democracy, Essential Books,
Washington, DC, 1992, pp. v-vi.
NATHAN, D., and WEATHERALL, D., “Academic Freedom in Clinical Research,”
New England Journal of Medicine, v. 347, (2002), pp. 1368-1370.
NATIONAL R ESEARCH COUNCIL (NRC), Carcinogens and Anticarcinogens in the
Human Diet, National Academy Press, Washington, DC, 1996, p. 355.
NATIONAL SOCIETY OF PROFESSIONAL ENGINEERS (NSPE), “Criticism of
Engineering in Products, Board of Ethical Review, Case No. 67.10,” in BAUM, R.
and FLORES, A. (eds.), Ethical Problems in Engineering, Center for the Study of the
Human Dimensions of Science and Technology, Troy (NY), 1980, pp. 64-72.
OFFICE OF GOVERNMENT ETHICS, Standards of Ethical Conduct for Employees
of the Executive Branch: Executive Order 12674-Principles of Ethical Conduct for
Government Officers and Employees, Internal Revenue Service, Washington, DC,
1993, p. 35042.
PARFIT, D., Reasons and Persons, Oxford University Press, New York, 1984.
PAUL, E., MILLER, F. and PAUL, J., The Welfare State, Cambridge University
Press, New York, 1997.
POPPER, K., Conjectures and Refutations, Routledge and K. Paul, London, 1963
(3rd ed. revised, 1969).
POSTEL, S., “Carrying Capacity: Earth’s Bottom Line,” in BROWN, L. ET AL. (eds.),
State of the World 1994, Norton, New York, 1994, pp. 3-21.
RAIKKA, J., "On Disassociating Oneself from Collective Responsibility," Social
Theory and Practice, v. 23, n. 1, (1997), pp. 93-108.
REPETTO, R., Paying the Price: Pesticide Subsidies in Developing Countries,
Research Report Number 2, December 1985, World Resources Institute, Washington,
DC, 1985.
RESCHER, N., "Collective Responsibility," Journal of Social Philosophy, v. 29, n.
3, (1998), pp. 44-58.
ROCHON, P., GURWITZ, J., SIMMS, R., FORTIN, P., FELSON, D., MINAKER, K. and
CHALMERS, T., “A Study of Manufacturer-supported Trials of Nonsteroidal Anti-
inflammatory Drugs in the Treatment of Arthritis,” Archives of Internal Medicine, v.
157, (1994), pp. 157-163.
SARTRE, J. P., What is Literature, trans. Bernard Frechtman, Methuen, London,
1950.
SARTRE, J. P., Anti-Semite and Jew, trans. George J. Becker, Schocken Books,
New York, 1965.
SCHNIEDER, T., “Lighting the Path to Sustainability,” Forum for Applied Research
and Public Policy, v. 15, n. 2, (2000), pp. 94-100.
SCHWAB, J., Deeper Shades of Green, Random House, New York, 1994.
SHAEFFER, E., “Power Plants and Public Health,” Physicians for Social
Responsibility Reports, v. 34, (2002), pp. 3-4.
SHRADER-FRECHETTE, K., Environmental Ethics, Boxwood Press, Pacific Grove
(CA), 2nd ed., 1991.
SHRADER-FRECHETTE, K., Risk and Rationality, University of California Press,
Berkeley, 1991.
SHRADER-FRECHETTE, K., Ethics of Scientific Research, Rowman and Littlefield,
Savage (MD), 1994.
SHRADER-FRECHETTE, K., “Science versus Educated Guessing,” BioScience, v. 46,
n. 7, (1996), pp. 488-489.
SHUE, H., "Exporting Hazards," in BROWN, P. and SHUE, H. (eds.), Boundaries:
National Autonomy and Its Limits, Rowman and Littlefield, Totowa, 1981, pp. 107-145.
SINGER, P. (ed.), Applied Ethics, Oxford University Press, Oxford, 1986.
SLATER, D., “The big book of Bush,” Sierra, v. 87, (2002), pp. 37-47.
STEIN, N., “No Way Out,” Fortune, v. 147, n. 1, (2003), pp. 102-107.
PROGRESS AND SOCIAL IMPACT IN DESIGN SCIENCES
Anna Estany
1. DESIGN SCIENCE
Design sciences are the result of the “scientification” and mechanization
of the arts, in relation to skills and practical activities. Herbert Simon in The
Sciences of the Artificial (1969) points out that the traditional model of science
gives a misleading picture of fields such as engineering, medicine, business,
architecture, painting, planning, economics, education, etc., which are concerned
with "design," understood as an objective, proposal or aim to be achieved, that is
to say, not with how things "are," but with how things "ought to be" in order to
attain specific goals.
Engineers are not the only professional designers. The intellectual activity
involved in producing material artifacts is not really that different from that of
prescribing a cure for a patient, or that of drawing up a program of a new sales
plan for a company, or that of a social welfare program. Construed in this way,
design is the nucleus of professional training; it is the main characteristic which
distinguishes the professions from the sciences. The schools of engineering as
well as the schools of law, architecture, education, medicine, etc. revolve around
the process of design.
Simon maintains that not only is the science of design possible but that it
emerged in the mid-seventies. (In 1975 Carnegie Mellon University founded
the “Design Research Center,” whose name was changed to “Engineering Design
Research Center” in 1985.) As a result, a substantial body of both theoretical and
empirical knowledge, which deals with the components of the theory of design
and their interrelationship, exists today.
Ilkka Niiniluoto has taken up Simon’s idea in his analysis of the objectives
and structure of design sciences.1 He makes a distinction between descriptive
sciences (which describe how the world is), design sciences (which transform the
world) and technology (which constructs artifacts). The structure of formulations
is one of the elements which distinguishes one science from another. In the
descriptive sciences the formulations take the form of “A causes B” or, in the
case of stochastic systems, "A causes B with probability p." In the design
sciences the formulations take the form of “If you want to achieve A and you are
at B you have to perform C,” that is to say, the formulations are practical norms
also known as “praxiological statements.”
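Niiniluoto's contrast between the two kinds of formulation can be rendered as a small sketch. This is purely illustrative: the class names and the medical example are hypothetical inventions, not drawn from the text.

```python
# Illustrative sketch (hypothetical names): contrasting the form of
# descriptive-science statements ("A causes B with probability p") with
# design-science praxiological norms ("to achieve A from B, perform C").
from dataclasses import dataclass

@dataclass
class CausalClaim:
    """A descriptive-science formulation: A causes B (with probability p)."""
    cause: str
    effect: str
    probability: float  # 1.0 for deterministic systems

@dataclass
class PraxiologicalNorm:
    """A design-science formulation: a practical norm relating goal, situation, action."""
    goal: str       # A: what you want to achieve
    situation: str  # B: where you are
    action: str     # C: what you must perform

    def advise(self) -> str:
        return (f"If you want to achieve {self.goal} and you are at "
                f"{self.situation}, you have to perform {self.action}.")

claim = CausalClaim("smoking", "lung cancer", 0.2)
norm = PraxiologicalNorm("tumor remission", "early diagnosis",
                         "surgery plus chemotherapy")
print(norm.advise())
```

The point of the sketch is structural: the descriptive statement reports how the world is, while the norm is conditional on a goal and a starting situation, which is what makes it practical rather than descriptive.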
[Figure: the hypothesis-testing cycle in design. Existing knowledge and the powers of reflective observation lead to a hypothesis, which is put to proof and then confirmed, modified or rejected.]
[Figure: stages of the design process. Scientific input (basic and applied research, broad scientific concepts) and nontechnical inputs (economic, social, geopolitical) converge, through the state of the art and the recognition of need, on the powers of synthesis at the concept design stage; the revised technical concept passes through concept acceptance and market acceptance to the powers of design and the powers of development, with failures fed back to earlier stages.]
Progress and Social Impact in Design Sciences 141
A final question which should not be ignored is the rational basis of evaluation
systems. Wojick compares evaluation systems to Kuhn's paradigms and considers
them to be incommensurable, although it should be pointed out that it is not a
good idea to consider them incommensurable, at least in Kuhn's original sense of
paradigms. In fact, if we heed Wojick's comments, it does not follow that
evaluation systems are incommensurable.
Wojick makes reference to the knowledge afforded by ecology, that is, the
environmental cost, as the reason for calling into question the construction
of reservoirs. It is true that these environmental costs were not recognized by
previous generations; however, it may be said that this was so because the harmful
consequences were unknown. The same may be said of food additives. This does
not mean that there are universal spatial-temporal values; however, neither does it
mean that there are values which transcend the systems of technological evaluation
specific to a design science.
Echeverría advocates axiological pluralism and points to different kinds
of values (basic, epistemological, technological, economic, military, political,
legal, social, ecological, religious, aesthetic and moral) which play a role in
technoscientific activity and which distinguish it from science of other historical
periods. The influence exercised by all these factors is undeniable. The question
lies in the weight of these factors in the whole of the scientific practice on the one
hand, and in the newness of the phenomenon, that is, if it is really new or if it is
simply a question of degree, on the other.
team who facilitated publication, thus allowing him to publish his article with
unusual speed. It would seem that for the great majority of scientists publication
is a long and arduous process whereas for others the process might be compared
to a rally driver on an unusually quiet four-lane motorway. Moreover, we are not
just talking about favor; we are also talking about deliberate efforts to impede
publication. Not content with accelerating his own publication process, Vogelstein
played both a direct and an indirect part in delaying the publication of Perucho's
article. Whether this was due to an explicit desire to block the article so as to
gain time to compile his own results, or to a critical attitude on Vogelstein's part
towards a colleague, which in turn might be due, consciously or unconsciously, to
a competitive and territorial attitude, questionable in either case, is unclear, as
we do not have access to objective data.
The fact that such factors exist is a clear obstacle to progress within the
scientific community. In science, where publication is the only means by which
to climb the ladder, controlling access to publication and blocking other
researchers' publications makes the system inherently unjust, one in which
inequalities of nationality or institutional affiliation are only accentuated.
Above all, the existence of such impediments holds back progress in science.
If a key article for the development of cancer research such as that of Perucho is
in limbo for one year while another researcher takes all the credit then it is not just
the researcher who is compromised but also the whole scientific community. The
role of the mutant phenotype was present in the literature but nobody was working on
it. However, cancer research changed radically with the publication of Perucho’s
and Vogelstein’s articles and in just a matter of weeks new articles, developing
upon the above researchers’ ideas, started to appear. Now these ideas are applied
in all kinds of research, including practical applications of the discovery such as
how to identify genes which have a predisposition to developing certain forms of
cancer. Approximately a thousand articles have been published to date on mutant
genes and genetic instability since the groundbreaking publication in 1993. This
fact confirms the originality of the discovery, which has led to the emergence of
a new field with its own terminology (mutant genes, instability of microsatellites,
etc.), and its enormous impact on cancer research in terms of the implications for
the basic underlying mechanisms of certain tumors.
Here the idea of a deontological code of practice is fundamental, one which
regulates the internal workings of scientific communities so as to eradicate
falsification of data, plagiarism, etc. Many of the problems encountered by Perucho
could have been avoided if the code of practice had been complied with.
relationship between pure sciences and design sciences, in which external factors
are brought to light in the debate over the place of cancer research. A study carried
out by a group of German scientists and philosophers in the 1970s and 1980s on the
relationship between basic and medical research in cancer research in Germany is
of particular relevance to this debate. Hohlfeld interviewed 29 scientists working
in the fields of molecular biology (basic research), experimental cancer research,
clinical cancer research and cancer and epidemiological medicine from 1975
to 1976. The study highlights conflicts between theoretical, experimental and
practical traditions in scientific research.
Basic research scientists, in this case molecular biologists, hold that science
cannot be driven by political aims. One scientist interviewed said:
“Basic research, in particular cell biology, must generate the necessary
knowledge before any real breakthrough can be expected to occur. Impatience,
no matter how justified and understandable on the part of millions of cancer
patients, should not make either scientific organizations or politicians adopt
measures which ultimately swallow up vast amounts of money without bringing
any real success.” 10
This way of thinking reduces the problem of cancer to key events in biological
processes which must be explained by molecular theories. As a consequence,
progress here is conceived in terms of strictly epistemological values and not in
terms of the cure or otherwise of cancer sufferers.
The first step towards the application of theories is “experimental cancer
research,” which is a type of research located behind the frontiers of “true”
science and structured by still unsolved fundamental theoretical questions. It is not
determined by the internal dynamics of scientific advance but by goal orientation
or the solution of certain problems. Scientists working in this field share with basic
scientists the idea that health problems must be solved by scientific instruments
on the basis of clarification of underlying biological mechanisms and that this in
turn requires “high technology.” They also share the idea that experts are needed
in the field. The fundamental difference between them is that experimental cancer
researchers, unlike basic scientists, are motivated by the goal that their research
must benefit humanity. Experimental cancer researchers expect to find a cure for
the disease although they also explain strictly biological processes.
When intrinsic scientific motivation combines with external goal orientation
in a given field in which there are as yet unresolved theoretical questions this
results in a kind of research known as applied research. Scientists doing this type
of research try to apply the results of basic research to the clinical setting. This would be
the case of cancer researchers whose loyalties are split between molecular biology
10
HOHLFELD, R., “Two Scientific Establishments which Shape the Pattern of Cancer Research in
Germany: Basic Science and Medicine," in ELIAS, N., MARTINS, H. and WHITLEY, R. (eds.), Scientific
Establishments and Hierarchies. Sociology of Sciences, vol. VI, Reidel, Dordrecht, 1982, p. 151.
people who are already sick. Difficulties arise in this field primarily because of
the political and social consequences of the results of this type of study.
We might venture to say that the difficulties and tensions within cancer
research mainly arise from the fact that different sciences, both pure and design,
are involved. Amongst the design sciences involved in the treatment of cancer
are medicine, pharmacology and nursing. Design consists of drugs, radiotherapy,
chemotherapy and surgery. The theoretical base is made up of molecular biology,
chemistry and physics. The technical base consists of scanning techniques, that
is, the apparatus used to carry out analyses and nuclear medical processes. The
organizational base consists of health politics, organization of health systems,
new surgical methods and patient care, etc.
All of the above considerations enable us to identify the characteristics of
progress in the different disciplines involved in cancer research. In keeping with
Kotarbinski’s ideas, advance in the cure of this disease will depend on advances
in molecular biology, instruments and techniques employed such as scanners,
chemotherapy, the health system which includes prevention programs for breast
and skin cancer as well as health education in schools, etc.
At present many cancer research scientists are asking themselves the following
questions: Why do we have the feeling that there have been more advances in the
theoretical base than in the technical and organizational base? What might be
the factors that make it impossible to cure the disease despite the fact that we
know many of the mechanisms involved in the development of cancer? The rate
of survival, which in some forms of cancer has increased considerably over the
last few decades, is due more to techniques used to detect cancer in the early
stages than to a real cure once the disease has been diagnosed.
It might be said that what is lacking here is a bridge between theoretical
knowledge and the need which it is trying to satisfy. What is lacking is the design
to meet this need. This is one of the reasons why medicine, a design science, is
not just an applied science. Moreover, this confirms the idea that pure sciences,
understood as descriptive sciences, should not be confused with basic research.
The search for appropriate designs to cure cancer based on existing knowledge is
an integral part of basic research in medicine.
An example of progress in cancer research which is based on a technical
element is the development of a scalpel which is able to detect cancerous cells in a
matter of seconds. According to an article which appeared in El País on 24 March,
2000, scientists working in the Sandia National Laboratories of the Department
of Energy in the United States have developed a scalpel designed to detect the
presence of cancerous cells while the surgeon is removing the tumor obscured
by blood, muscle and fat. The instrument is called the "biological microcavity
laser" and has made it possible to distinguish in the laboratory between cell
cultures composed of normal brain cells, called "astrocytes," and their malignant
form, called "glioblastomas." This may enable surgeons to eliminate malignant growths with
greater precision and at the same time reduce to a minimum the quantity of
healthy tissue which is removed.
The third base of progress is the organizational one. How might we interpret
this base in relation to cancer research? At present, knowledge circulates on
a worldwide scale (at least understood as a possibility), and the differences in
life expectancy of cancer patients correspond to differences in health policy
shaped by economic, social and political factors. In other
words, if the theoretical and technical bases remain stable then progress in cancer
medicine will be due to changes in the organizational base and to possible action
in the area of health politics.
In an indirect way, environmental programs (politics), which are to a large
extent responsible for environmental factors in the development of cancer, would
also play a part in the progress of cancer medicine. Ultimately, progress in
cancer medicine would depend on the risk management of many of the practices
of today’s society. We may venture to say that the third point made by Kotarbinski
contains all the philosophical issues pertinent to the risk factor even though he
does not specifically refer to this.
8. CONCLUSION
The importance of science in today's society has led to many studies on the
relationship between science, technology and society, although most of these tend
to question the rationality of science. Nevertheless, when it comes to identifying
the factors which call scientific rationality into question, very different issues
tend to be cited: disasters associated with the atomic bomb, transgenic products,
bad relations between scientists in the laboratory, fraud, power structures,
economic interests, the suffering of animals used in experiments, and many more.
All of these problems are very important, and we may say that they have been
caused, in most cases, by scientific development. However, when evaluating
science at a practical level we also need to take stock of the benefits, which are
not few, both in terms of health and disease and in terms of quality of life. Even
if the evaluation were negative, which I do not believe is
practical level. Even if the evaluation were negative, which I do not believe is
the case, we would need to distinguish between what knowledge of the natural
and social world means and what we can do with this knowledge. This may
appear self-evident; however, social constructivists do not accept distinctions of
this kind. Even in the case where for ethical reasons a certain line of research
would have to be stopped, the distinction between knowledge and its use would
still have to be maintained.
9. BIBLIOGRAPHY
ASIMOV, M., "A Philosophy of Engineering Design," in RAPP, F. (ed.), Contributions
to a Philosophy of Technology, Reidel, Dordrecht, 1974, pp. 150-157.
AGASSI, J., “The Confusion between Science and Technology in the Standard
Philosophies of Science,” Technology and Culture, v. 7, n. 1, (1966), pp. 348-366.
BEN-DAVID, J., The Scientist's Role in Society: A Comparative Study, Prentice-
Hall, Englewood Cliffs (NJ), 1971.
BERENBLUM, I., Cancer Research Today, Pergamon, Oxford, 1967.
BLACK, M., "Are There Any Philosophically Interesting Questions in Technology?,"
Philosophy of Science, v. 2, (1976), pp. 185-193.
BÖHME, G., VAN DEN DAELE, W., HOHLFELD, R., KROHN, W. and SCHÄFER, W.,
Finalization in Science. The Social Orientation of Scientific Progress, Reidel,
Dordrecht, 1983.
BOLAND, M., “Four Categories of Science and Technology Policy and Who Makes
it,” Science, Technology and Society, n. 125, (2000), pp. 1-3.
BRADIE, M., ATTIG, T. W. and RESCHER, N. (eds.), The Applied Turn in Contemporary
Philosophy, Bowling Green State University, Bowling Green (OH), 1983.
BRADIE, M. and SAYRE, K., Reason and Decision, Bowling Green Studies in
Applied Philosophy, v. 3, Bowling Green (OH), 1981.
BROAD, W. and WADE, N., Betrayers of the Truth: Fraud and Deceit in the Halls
of Science, Century, London, 1983.
BROOKE, H., Emulation and Invention, New York University Press, New York,
1981.
BROOKS, H., “The Problem of Research Priorities,” Daedalus, v. 107, n. 2, (1978),
pp. 171-190.
BUNGE, M., “Technology as Applied Science,” Technology and Culture, v. 7, n. 1,
(1966), pp. 329-347.
BUNGE, M., “The Philosophical Richness of Technology,” Philosophy of Science,
v. 2, (1976), pp. 153-172.
JOHNSTON, R., “Finalization: A New Start for Science Policy,” Social Science
Information, v. 15, n. 2/3, (1976), pp. 331-336.
KITCHER, PH., The Advancement of Science. Science without Legend, Objectivity
without Illusions, Oxford University Press, Oxford, 1993.
KLAHR, D., Exploring Science: The Cognition and Development of Discovery
Processes, foreword by H. Simon, The MIT Press, Cambridge (MA), 2000.
KNORR-CETINA, K. D., The Manufacture of Knowledge: An Essay on the
Constructivist and Contextual Nature of Science, Pergamon Press, Oxford, 1981.
KOTARBINSKI, T., “Praxiological Sentences and How they are Proved,” in
NAGEL, E., SUPPES, P. and TARSKI, A. (eds.), Logic, Methodology and Philosophy.
Proceedings of the 1960 International Congress, Stanford University Press,
Stanford, 1962, pp. 211-223.
KOTARBINSKI, T., Praxiology. An Introduction to the Science of Efficient Action,
Pergamon Press, New York, 1965.
KRANZBERG, M., "The Unity of Science-Technology," American Scientist, v. 55,
n. 1, (1967), pp. 48-66.
KRANZBERG, M., "The Disunity of Science-Technology," American Scientist,
v. 56, n. 1, (1968), pp. 21-34.
KUHN, TH. S., The Structure of Scientific Revolutions, The University of Chicago
Press, Chicago, 2nd ed., 1970.
LAKATOS, I., The Methodology of Scientific Research Programmes, Cambridge
University Press, Cambridge, 1978.
LANGRISH, J., GIBBONS, M., EVANS, W. G. and JEVONS, F. R., Wealth from
Knowledge, Macmillan, London, 1972.
LAUDAN, R. (ed.), The Nature of Technological Knowledge: Are Models of Scientific
Change Relevant?, Reidel, Dordrecht, 1984.
LAYTON, E. T. Jr., “Technology as Knowledge,” Technology and Culture, v. 15,
(1974), pp. 31-41.
LENK, H. and ROPOHL, G., “Toward an Interdisciplinary and Pragmatic Philosophy
of Technology: Technology as a Focus for Interdisciplinary Reflection and Systems
Research,” Research in Philosophy and Technology, v. 2, (1979), pp. 15-52.
LESLIE, S. W., “Reestablishing a Conversation in STS: Who’s talking? Who’s
listening? Who cares?,” Bulletin of Science, Technology and Society, v. 19, n. 4,
(1999), pp. 271-280.
LOPEZ CEREZO, J. A. and LUJAN, J. L., Ciencia y Política del riesgo, Alianza
Editorial, Madrid, 2000.
MANTELL, M. I., "Scientific Method. A Triad," in RAPP, F. (ed.), Contributions to
a Philosophy of Technology, Reidel, Dordrecht, 1974, pp. 115-123.
MARQUIS, D. G. and ALLEN, T. J., "Communication Patterns in Applied
Technology," The American Psychologist, v. 21, (1966), pp. 1052-1060.
SAYRE, K. M., “Pure and Applied Reason,” Reason and Decision, v. 3, (1981), pp.
1-13.
SCHÄFER, W. (ed.), Finalization of Science: The Social Orientation of Scientific
Progress, Reidel, Dordrecht, 1983.
SHIMKIN, M. B., “History of Cancer Research: A Starter Reading List and Guide,”
Cancer Research, v. 34, (1974), pp. 1519-1520.
SIMON, H., The Sciences of the Artificial, 3rd ed., The MIT Press, Cambridge
(MA), 1996 (1st ed., 1969).
SINTONEN, M., “Basic and Applied Sciences: Can the Distinction (still) be
Drawn?,” Science Studies, v. 3, n. 2, (1990), pp. 23-31.
SKOLIMOWSKI, H., “The Structure of Thinking in Technology,” Technology and
Culture, v. 7, n. 3, (1966), pp. 371-383.
SKOLIMOWSKI, H., “On the Concept of Truth in Science and in Technology,” in
Akten des XIV Internationalen Kongresses für Philosophie, Herder, Vienna, 1968,
pp. 553-559.
SPIEGEL-RÖSING, I., “The Study of Science, Technology and Society (SSTS):
Recent Trends and Future Challenges," in SPIEGEL-RÖSING, I. and PRICE, DEREK DE
SOLLA (eds.), Science, Technology and Society. A Cross-disciplinary Perspective,
SAGE Publications, London, 1977, pp. 7-42.
STAPLETON, D. H., “Technology and Science in Tony Hillerman’s Novels,” Science,
Technology and Society, n. 115, (1998), pp. 1-3.
THAGARD, P., “Beyond Utility Theory,” Reason and Decision, v. 3, (1981), pp.
42-49.
TONDL, L., “On the Concepts of ‘Technology’ and ‘Technological Sciences’,” in
RAPP, F. (ed.), Contributions to a Philosophy of Technology, Reidel, Dordrecht, 1974,
pp. 1-18.
TOULMIN, S., “Innovation and the Problem of Utilization,” in GRUBER, W. H. and
MARQUIS, D. G. (eds.), Factors in the Transfer of Technology, The MIT Press, Boston,
1969, pp. 24-38.
TUPLIN, W. A., "The Role of Experiments in Applied Science," in RAPP, F. (ed.),
Contributions to a Philosophy of Technology, Reidel, Dordrecht, 1974, pp. 191-193.
WEINBERG, A. M., Reflections on Big Science, Pergamon, Oxford, 1967.
WEINBERG, R. A., Racing to the Beginning of the Road: The Search for the Origin
of Cancer, W. H. Freeman, New York, 1998.
WOJICK, D., “Philosophy of Technology and the Structure of Technological
Revolutions,” in BUGLIARELLO, G. and DONER, D. B. (eds.), The History and Philosophy
of Technology, University of Illinois Press, Champaign (IL), 1979, pp. 238-261.
ZIMAN, J., An Introduction to Science Studies, Cambridge University Press,
Cambridge, 1984.
EXPERIMENTS, INSTRUMENTS AND SOCIETY:
RADIOISOTOPES IN BIOMEDICAL RESEARCH
The purpose of this essay is to show an epistemic and cultural process that
played a part in the invention and use of new instruments: the use of
radioisotopes in biomedical research. Incorporating the values of the era in which
the use of radioisotopes was designed and promoted, their production on an
industrial scale and their "black-boxing" finally took place, and they have
become useful tools which are productive through the results they provide.
Studies on the invention and use of experiments show how the production of
scientific knowledge, at the bench in the laboratory and by a community of experts,
is socially embedded. While producing knowledge, experiments and instruments
contribute at the same time to the construction of scientific expertise.
A given technique is part of the much broader context in which it is handled
and of a wider culture than that of the person who becomes skilled in its design
and tuning. It usually incorporates social and political practices by analogy. Thus
it is not only that the instrument or the technique becomes part of the society in
which it is used, not only that it shares with its environment norms and values, but
that the social values of the environment in which the experiment is carried out
penetrate into the design of a given device as such, and that these values intervene
in the knowledge produced and reproduced by a given set of techniques in a so-
called experiment.1
Norton Wise suggested considering technical devices as mediating
machines, mediating agents between scientific knowledge and its cultures, to
show the mechanisms by which social and cultural values are embedded into
knowledge, and the ways in which knowledge emerges in such cultures and
values. Later methodological reconstructions have situated instruments and
experiments as part of the knowledge they produce and disseminate, considering
the instruments as scientific knowledge as such and not only as mediating machines.
This consideration leads to a blurred distinction between science and technology,
as both would be part of the same box of learning. In this approach, to regard
techniques as mere applications would be meaningless, or at least an incomplete
supposition, since the machine or the device of any size, no matter whether in design or
by its “blackboxing” use, produces and reproduces phenomena that become
1
Cf. WISE, M. N., “Mediating Machines,” Science in Context, v. 2, (1988), pp. 77-113; SHAPIN, S.
and SCHAFFER, S., Leviathan and the Air-Pump: Hobbes, Boyle, and the Experimental Life, Princeton
University Press, Princeton, 1985; LATOUR, B. and WOOLGAR, S., Laboratory Life: The Social
Construction of Scientific Facts, Sage, Beverly Hills, 1979.
blurred distinction between nature and experiments at the bench configures the
contemporary social order. This social order rests on that blurred distinction and
underlies decision-making. Decision-making, in turn, is based on the scientific
expertise of different groups of people specialised in a given field, whose scientific
authority is constructed precisely on their capacity to create and reproduce
knowledge and to make it reliable through the usual channels of dissemination (i.e.
papers published in specialized journals, through peer-review and evaluation processes).
7
Cf. LINDEE, M. S., Suffering Made Real: American Science and the Survivors at Hiroshima, The
University of Chicago Press, Chicago, 1994.
the more molecularised, the more the phenomenon is reduced to reactions between
molecules of known characteristics and known structure, the more reliable
scientific developments become toward a better, more accurate knowledge about
living things. Every form of life is reduced to a phenomenon of such small size
that a test tube provides enough room for what is considered life production.
Prompted by the disturbing success of the more recent techniques and their effects on
everyday life, this paper reviews a case from the recent past, without claiming any
certainty, in order to look for a feature which is too often overlooked and forgotten: the
provisionality of techniques.
Experiments with radioisotopes are an example of this provisionality. A
review of the events that took place from the interwar period concerning their
use in biological and biomedical investigation provides information that allows
us to go into detail about the temporary value of a technique.8 Some controversies
shed light on the topic of the role played by the public in the process of making
knowledge reliable. Social and intellectual discussion on atomic and nuclear
energy, an issue which is at the core of science and technology studies, shows how
public debates, and not only discussions between experts, construct knowledge
while challenging formal scientific expertise.9 Knowledge as well as social order
and cultural values took part in the forging of a given technique. Now almost
vanished from molecular biology laboratories, radioactive isotopes were until
the early 1980s a tool for inventing experiments and posing questions, offering a wide
capacity to answer them.
More relevant was their eventual use in medical diagnosis and therapy,
which were among the early applications of the cyclotron’s products. The
possibility of contributing to medicine was envisaged quite early by Lawrence
in his laboratory at the University of California, Berkeley. These applications
reinforced interest in cyclotrons as such and widened the possibilities of
funding for the laboratory.
From the mid-1930s on, the cyclotron and the promising use of its products
in medical research and therapy captivated biologists, medical researchers
and funding agencies. The British physiologist Archibald V. Hill compared its
promising influence with that of the microscope: if with the latter cells were
visible, with isotopes the atoms would be visible within the cells.15
The use of isotopes in biological research began in the 1930s in the US. Rudolf
Schoenheimer used heavy water (in which the hydrogen atom was substituted by its
heavy isotope deuterium) to do research on its effects on organisms and biological
macromolecules.16 During this decade cyclotrons began to be built and research
on radioactive elements was developed and began to be disseminated following
the success of the research done in Paris by Irène and Frédéric Joliot-Curie. A well-known
case is that of the laboratory of Ernest O. Lawrence in Berkeley. According to
Heilbron and Seidel, in the fall of 1933 the cyclotron began to produce so great a
flux of neutrons from beryllium through the effect of 3 MeV deuterons that they
began to worry about its “physiological effects” on the researchers themselves. At
this point Lawrence saw that this radiation could have some medical relevance in
cancer therapy. By this time x-rays were no longer considered a promise in cancer
therapy. With this idea in mind he applied to the Macy Foundation for funds with
a project for doing research on neutron rays, and succeeded in making them more intense
than x-rays or radium rays.17 In 1936 “[the] question [was] of more than theoretical
interest, for it [bore] directly on the possibility of using very fast neutrons in the
treatment of tumors.” 18
In 1935, neutron rays were investigated in the Department of Physiology at
the University of California, Berkeley, and were shown to be considerably more
biologically lethal than x-rays. The issue of the supposed danger of high-voltage
rays had begun to be discussed in 1929, but a level of tolerance was not yet
established. Nevertheless, the possibility of harm from radiation introduced
prudence in the Berkeley group, whose members took technical precautions with
the aim of minimizing risks. Precisely these risks brought Lawrence back to his
15
Cf. KOHLER, R. E., Partners in Science: Foundations and Natural Scientists, The University of
Chicago Press, Chicago, 1991, p. 371.
16
Cf. KOHLER, R. E., “Rudolf Schoenheimer, Isotopic Tracers and Biochemistry,” Historical Studies
in the Physical Sciences, v. 8, (1977), pp. 257-298.
17
Cf. HEILBRON, J. L. and SEIDEL, R. W., Lawrence and his Laboratory: A History of the Lawrence
Berkeley Laboratory, University of California Press, Berkeley, 1989, p. 357.
18
HEILBRON, J. L. and SEIDEL, R. W., Lawrence and his Laboratory: A History of the Lawrence
Berkeley Laboratory, p. 390.
Rochester, the Washington University and the University of California with two
others including an E. Lawrence 184-inch machine and a Van de Graaff machine
at MIT). In all these projects medical and biological applications were combined
with “pure physics” objectives.23
eventually leave the funding of atomic energy and its applications under
the responsibility of the US Atomic Energy Commission (AEC), created in the
aftermath of the war.27
Meanwhile, when the Manhattan Project was already underway in 1943,
there were already many successful developments in the use of radioisotopes
in the United States. The interest of the heads of the Rockefeller Foundation in
possible risks to researchers’ health when handling radioactivity contributed to
the creation of a Medical Division of the Manhattan Project. Laboratories at the
universities of Chicago, Rochester (New York), California Berkeley, Columbia,
and Washington, at Los Alamos and at Clinton Laboratories in Oak Ridge were some
of those that carried out research on health-related risks as part of the Manhattan
Project Medical Division. Among their research tasks was the establishment
of acceptable radiation doses for experimentation. Given earlier results
on leukaemia, research on the mechanisms by which radiation may affect
hematopoietic tissues became a priority. Thus, public health promotion appeared
to acquire strong links with the direct interest of the Manhattan Project managers
in radiation safety, while civil applications of atomic energy were perceived
as enormous. Concerned for researchers’ safety, the Manhattan Project opened
wide possibilities for the promotion of medicine as an experimental science, at
a time when medical practice was based more on clinical knowledge than on
instrumentation and experimentation, as it later would be.28
When the war ended, experts on research on isotopes applied to biology and
medical therapy envisaged the possibilities that could be opened in this area of
application of the Manhattan Project itself. For this purpose, it seemed necessary
to keep on supporting the scientists and technicians involved in this war-time project
through training and research programs. This required maintaining the
emergency climate that had characterized medical research related to the war effort,
developed during the war itself.29
Therapeutic uses of radioactive sources, which had been an instrumental
justification for the expenditures on cyclotron construction since the end of
the 1930s, became in the aftermath of WWII an instrumental incentive in the
promotion of scientific and technical contacts between physicists and biologists.
At the end of the war the heads of the Manhattan Project were strongly interested
in putting radioisotopes at the disposal of medical research. The US Atomic
Energy Commission assumed, among other responsibilities, and beyond those
27
See the works of A. N. H. Creager in notes 11 and 12. On the history of the US AEC, HEWLETT,
R. G. and ANDERSON, O. E., History of the United States Atomic Energy Commission. Vol I, The
New World, 1939/1946; Vol. II, Atomic Shield 1947/1952, The Pennsylvania State University Press,
Pennsylvania, 1962.
28
Cf. LENOIR, T. and HAYS, M., “The Manhattan Project of Biomedicine,” in SLOAN, PH. R. (ed.),
Controlling our Destinies. Historical, Philosophical, Ethical and Theological Perspectives on the
Human Genome Project, University of Notre Dame Press, Notre Dame (IN), 2000, pp. 29-62.
29
Cf. LENOIR, T. and HAYS, M., “The Manhattan Project of Biomedicine,” pp. 32-37.
related to the military and defence, that of research and development activities
related to the use of the fission products, including radioisotopes for biological
and medical purposes.30
7. CONCLUDING REMARKS
The growing negative public opinion regarding nuclear energy and the
greater knowledge of the risks of radiation contributed to the later substitution
of radioisotopes by other tracer techniques, based not on radioactivity but on
luminescence and fluorescence. However, radioisotopes created a way of
performing experiments and of making them visible. This performance was,
precisely, the experiment as such. These technical requirements could hardly
be separated from the design and performance of the experiment itself. Since
it can be suggested that there is no knowledge without techniques, experiments
– that is, exactly what science is considered to be about – and, to the same extent,
knowledge are about instruments.
Whether as radioisotopes themselves or as liquid scintillation counters, both
became knowledge. And they contributed to the creation of social order and
values as well, because embedded in them was the whole ideology of the post-
war era of recycling atomic energy for civilian life. The mutual benefits of this
recycling for both scientists (physicists, biologists and clinicians) and science
policy authorities were at the basis of the knowledge published in biology and
biomedicine during the long second half of the twentieth century.
This reality of policies, experiments and reliable knowledge constructed scientific
expertise as well as further policy-making (further realities). Embedded in their
post-war culture, radioisotopes, the counters and the US AEC strategies combined
to produce knowledge, techniques and policies. Sciences and techniques could
not be separated from the Cold War climate. During this period, international
cooperation meant sharing tools as well as research problems, where the tools
available were at the core of the process by which research questions were posed.
In the realm of biology, producing life in test tubes that carried radioactivity
transformed the test tube into the small-scale operator within the social order of
the radioactivity era.
8. ACKNOWLEDGEMENTS
This study belongs to a wider project on biological disciplines and instruments
supported by the Plan Nacional de Investigación Científica, Desarrollo e
Innovación Tecnológica (BFF2003-09579-C03-03, Spanish National Plan for
Research, Development and Technology Innovation). I would like to thank Anna
Estany, Wenceslao J. Gonzalez, José Luis Luján, Emilio Muñoz, Javier Ordóñez
and Kristin Shrader-Frechette for their useful comments and suggestions on an
earlier version of this essay, presented at the Jornadas sobre Ciencia, Tecnología
y Sociedad: La perspectiva filosófica, held in Ferrol (Spain) in March 2004. I also
wish to thank Lori Gerson for her kind editing of the text.
9. BIBLIOGRAPHY
APPEL, T. A., Shaping Biology: The National Science Foundation and American
Biological Research, 1945-1975, Johns Hopkins University Press, Baltimore, 2000.
BRONCANO, F., Mundos artificiales. Filosofía del cambio tecnológico, Paidós,
México D.F., 2000.
CLARKE, A. E. and FUJIMURA, J. (eds.), The Right Tools for the Job: At Work in
Twentieth-Century Life Sciences, Princeton University Press, Princeton (NJ), 1992.
CREAGER, A. N. H., “Tracing the Politics of Changing Postwar Research Practices:
the Export of ‘American’ Radioisotopes to European Biologists,” Studies in History
and Philosophy of Biological and Biomedical Sciences, v. 33, (2002), pp. 367-388.
CREAGER, A. N. H., “The Industrialization of Radioisotopes by the US Atomic
Energy Commission,” in GRANDIN, K. and WORMBS, N. (eds.), Science and Industry
in the 20th Century, Nobel Symposium 123, Watson Publishing, Sagamore Beach
(MA), forthcoming.
CREAGER, A. N. H., The Life of a Virus: Tobacco Mosaic Virus as an Experimental
Model, 1930-1965, The University of Chicago Press, Chicago, 2002.
DICKSON, D., The New Politics of Science, The University of Chicago Press,
Chicago, 1988, 2nd ed.
ECHEVERRIA, J., La revolución tecnocientífica, FCE, Madrid, 2003.
FERREIROS, J. and ORDOÑEZ, J., “Sobre la no neutralidad de los instrumentos
científicos,” in SANTESMASES, M. J., and ROMERO, A. (eds.), La Física y las Ciencias
de la Vida en el siglo XX: Radiactividad y Biología, Ediciones Universidad Autónoma
de Madrid, Madrid, 2003, pp. 13-22.
GOODING, D., PINCH, T. and SCHAFFER, S. (eds.), The Uses of Experiment,
Cambridge University Press, Cambridge, 1989.
HACKING, I., Representing and Intervening, Cambridge University Press,
Cambridge, 1983.
HEILBRON, J. L. and SEIDEL, R. W., Lawrence and His Laboratory: A History of
the Lawrence Berkeley Laboratory, University of California Press, Berkeley, 1989.
HEWLETT, R. G. and ANDERSON, O. E., History of the United States Atomic Energy
Commission. Vol I, The New World, 1939/1946; Vol. II, Atomic Shield 1947/1952, The
Pennsylvania State University Press, Pennsylvania, 1962.
JASANOFF, S., The Fifth Branch: Science Advisers as Policymakers, Harvard
University Press, Cambridge (MA), 1990.
KAY, L. E., Who Wrote the Book of Life? A History of the Genetic Code, Stanford
University Press, Stanford, 2000.
PHILOSOPHICAL PATTERNS OF RATIONALITY
AND TECHNOLOGICAL CHANGE
Ramón Queraltó
1
Cf. HEILBRONER, R. L., “Do Machines Make History?,” Technology and Culture, v. 8, (1967),
pp. 333-345; MISA, TH. J., “Theories of Technological Change: Parameters and Purposes,” Science,
Technology and Human Values, v. 17, (1992), pp. 3-12; and SMITH, M. R. and MARX, L. (eds.),
Does Technology Drive History? The Dilemma of Technological Determinism, The MIT Press,
Cambridge (MA), 1994. This book is a collective work of special interest. A historical synthesis
of the subject of technological determinism in the US can be found in M. R. Smith’s article
“Technological Determinism in American Culture.”
2
Later we will deal with the “weak technological imperative,” whose exact definition will be
proposed at that moment.
3
Think, for example, of a Western mentality or of an Islamic one. The rejection or acceptance of
many technical artifacts depends on it. Thus, there are Islamic countries in which access to the Internet
is restricted and even punished by law. Obviously, technological change will be very different
in such a case.
Philosophical Patterns of Rationality and Technological Change 181
11
For example at the level of social welfare, or at the level of other aspects that might not be included
in it, such as an assault on privacy. All this is constantly shown nowadays. See for example: LOPEZ
CEREZO, J. A. and LUJAN, J. L., Ciencia y Política del riesgo, Alianza Editorial, Madrid, 2000;
GERGEN, K., The Saturated Self. Dilemmas of Identity in Contemporary Life, Basic Books, New
York, 1991; SARTORI, G., Homo Videns, Laterza, Roma-Bari, 1997; and BUSTAMANTE, J., Sociedad
informatizada, ¿Sociedad deshumanizada?, Gaia, Madrid, 1993.
12
See QUERALTÓ, R., Ética, Tecnología y Valores en la sociedad global, second part, pp. 159-202.
13
It is the case, for example, of ELLUL, J., Le bluff technologique, Hachette, Paris, 1987; and ELLUL,
J., Le système technicien, Calmann-Lévy, Paris, 1977.
is actually possible to develop. This should be considered at two levels: (1) Which
demands are technologically practicable and (2) which ones stand out for having
a presumed greater economic benefit when facing a specific social requirement.
Therefore, technological innovation and change are not only produced by a
potential which originates more or less from Technology as a phenomenon, but
rather from the considerations that must be taken into account in technological
praxis as a technoscientific system, in which this cycle of feedback, previously
indicated, is produced.
One can see that these two levels become apparent from the very beginning.
On the one hand, there is the intrinsic tendency within technology to maximize
operative efficacy, which is a common theme throughout the whole of the
technoscientific system. And, on the other hand, there are the demands of
practicability on the part of society. So, from this point of view, i.e. from the
analysis of the impact of economic dimensions on technology, the suggestion of
a tempering force emerges in respect to the technological imperative mentioned
above. Just because something is technologically possible does not mean that it is
going to occur. Rather, certain endeavours will be undertaken, or not, as a function
of social references and requirements, which are not strictly technological.
For this reason, given the complexity of actual technological praxis, it is
perhaps convenient to propose another formulation of the initial technological
imperative, which we shall call “the weak technological imperative.” This would
state that “anything that is technologically possible to be done will tend to be
done.” Yet this does not mean that it will be done unfailingly. This nuance is of radical
importance here, because now the door to the possibility of social influence on
technological change has been opened. Such change is not solely resigned to the mercy of its
internal urgings. The weak statement, on the one hand, recognizes the previously
highlighted intrinsic tendency of technology, and on the other hand alerts us to
the possibility of influencing such a tendency through other, not exclusively
technological factors.
One reaches a similar conclusion when analyzing the political influence on
technology. This influence is directly intertwined with the current economic
situation, yet it adds some specific circumstances, especially in democratic states.
This is due to the fact that, in effect, in democratic organizations, there is an
element of social control on what the government does (even though, as everyone
knows, there is still a lot of progress to be made in this area). If it is to be true to
the pattern of specifically political rationality that operates today, then here the
sensibility of the politician and of political action in relation to social demands
must be more pronounced. This means knowing how to keep power and possession
of the government. Not heeding urgent or majority-held social demands could
provoke the loss of this power. For this reason, it would be contrary to primordial
political objectives.
Without a doubt, one could argue that such a vision of general political
objectives is much too simplistic, and that these are more likely to be the
achievement of greater social well-being in a generic sense, or of the common
good in a classical sense. But let us not fool ourselves with respect to our times. It
is true that greater and greater levels of social well-being are being achieved, and
that the government depends on this progress for its own development and therefore
obviously fosters it. Nevertheless, this is not its ultimate goal. The government’s
fundamental reason for sponsoring social well-being, plainly stated, is simply
because if it did not do so, it would lose its political power in the next elections,
or maybe even before then. And, as a function of this basic motivation, it naturally
seeks that the highest levels of social welfare be reached. The very survival of the
government’s political power depends on it, and this survival constitutes its first
and last specific goal.
This poses a clear example of the interconnectedness between social elements
and technological ones. Because on the one hand, preserving power as an ultimate
end is an obvious example of technological rationality in politics: the operative
efficacy, on the political level, demands holding onto power. If it were any other
way, this would be a failure, techno-politically speaking (if I can be permitted
to coin the term). Therefore, this obligation of techno-political efficacy requires
that these social demands be met, and among these demands, those concerning technological
action, precisely in order to satisfy the pattern of rationality proper to contemporary
political activity. Because, if it were not so, the result would be a total failure,
both technologically and politically.
Of course to many people, this way of behaving and conceiving political
power will seem morally insufficient because, in fact, the parameters of social
well-being are manipulated with the previously mentioned ulterior motive, which
is a clear feature of a technological rationality applied to politics. Yet, things are
the way they are and not the way we wish them to be. Philosophically speaking
this is the cruel divide between what is and what should be, and we have no choice
but to bear with it. What one cannot do is to be unaware of the situation.15
But such a situation is not as negative as some have believed. Because what
is clear is that, for better or for worse motives, the door to the possibility of
taking action on technological change is opened by this brief description
of the political influence on technology. And it is then a matter of extending these
possibilities to the maximum.
15
This is just the background of the book Ética, Tecnología y Valores en la sociedad global, in which
it is shown that, despite these obstacles, it is possible to introduce ethical values in scientific-
technological systems and in technological society as a whole. In this regard, cf. also QUERALTÓ,
R., “Cómo introducir vectores éticos eficaces en el sistema científico-tecnológico,” pp. 221-240.
Furthermore, in order to envisage the important ethical changes provoked by technology, see
QUERALTÓ, R., “Ética y Sociedad tecnológica: pirámide y retícula,” Argumentos de Razón técnica,
v. 5, (2002), pp. 39-83.
the system should include, arrange, and integrate into a whole, the globality
of demands which come from its elements and reciprocal relationships. If this
second condition is not satisfied, the system will simply not work. It would cease
to be a system as such.16
However, when we find systemic elements that contradict one another, or
that are at least not naturally integrable, as is usually the case in the event of
technological change, the mutual influences will curb the specific protagonism
of such elements until a certain agreement is reached that allows some advance.
It is theoretically possible that one of these elements, or a similar group that works
as one element, imposes its decisions. But it is unlikely that the influence of the
remaining elements would fall below a minimum threshold that would condemn
them to absolute inefficiency. And this is precisely because technological
action, by its own primordial nature, is a direct intervention on reality.
As a consequence, this must be considered in order justly to fulfil the internal
criteria of rationality, namely operative efficacy on this or that reality. So,
from the standpoint of systemic rationality, technological determinism is excluded,
because such a conception would be incompatible with this model of rationality, given that
technological determinism would only highlight a single sector of the network of
actors to be considered in the matter, that is, the technological sector in isolation.
But the necessary relationships with other implicated sectors, such as the political and the
social, would not be taken into account to a sufficient extent.
This last consideration leads to the second consequence which it is necessary
to examine. In order to do this, and to remain faithful to the point of view adopted
here, I will make use of a well-known philosophical aphorism, the Ortegian
principle of “I am myself and my circumstances.” After what has been expounded, it
is not hard to understand the principle’s application to our problem.
In effect, technology and technological change are not only themselves as
such, but “they are themselves and they are their circumstances”; in other words,
everything that surrounds them and influences them with greater or lesser
intensity, in the measure of the previously mentioned feedback cycle as well as
of the model of systemic rationality. So to speak, technology is not just technology but
rather, for its proper understanding, “technology is itself and its circumstances.”
Few doubts can arise in this respect if we accept the general results of the inquiry
thus far.
But the Ortegian statement includes a second part whose undoubted
reach has unfortunately not been much considered. For the Ortegian
idea, formulated in its totality, states the following: “I am myself and I am my
16
Cf. LASZLO, E., Introduction to Systems Philosophy, Harper and Row, New York, 1973; LASZLO,
E., The Relevance of General Systems Theory, Braziller, New York, 1972; AGAZZI, E., I sistemi fra
scienza e filosofia, SEI, Torino, 1978; and AGAZZI, E., Il bene, il male e la scienza. Le dimensioni
etiche della impresa scientifico-tecnologica, Rusconi, Milan, 1992. See, of course, the well-known book
by BERTALANFFY, L. VON, General System Theory, Braziller, New York, 1968.
17
ORTEGA Y GASSET, J., Meditaciones del Quijote, in ORTEGA Y GASSET, J., Obras Completas, vol. I,
Alianza Editorial, Madrid, 1983, p. 322.
today’s science and technology. (It has reached the point where some distinguished
scholars have proposed, as the main task of contemporary philosophy of science,
the elaboration of an axiology that encompasses all of the influential aspects of the
scientific-technological processes, from logical and epistemological dimensions
to ethical and social ones in general).18
The justification of such a perspective at first sight might seem too unidirectional.
The notion of value had been relegated preferentially to the philosophical realm of
ethics and social and political philosophy. So, in a first approach, if one focuses the
investigation on axiological issues, it could be argued that in reality we would be
going to the other extreme of the situation. If before the constellation of problems
was of a logical and epistemological order, with the exclusion of ethical, social and
political questions, now the situation has been reversed and, as a consequence, the
traditional analyses that have been so central to the reflection of the philosophy of
science and technology, would be excluded from the philosophical problematic.
In the end, it could be concluded that the deficient “law of the pendulum” is being
fulfilled here, swinging from one pole to the other, thereby producing harmful
exclusions on both sides.
Yet fortunately we do not believe this to be so. To be fair, the pragmatic
perspective operates in this case from the beginning with a notion of value that
is rather different from the traditional philosophical notion. From
that inherited position, without a doubt, the proposed critique would be correct; but
what happens is precisely that in the pragmatic vision this initial premise is very
notably modified. It is here where the cards are definitively played, that is to say, in
the meaning of the central axiological notion, namely, the notion of value.
For the pragmatic notion of value is no longer to be considered in the
classic sense, as something that should be done because of its intrinsic quality,
which justifies itself by being held up by a more or less definitive transcendental
level. Rather, the pragmatic conception understands value as a guideline for
solving problems.19 In other words, something has value as long as it constitutes a
pattern or rule for overcoming problematic situations of very diverse natures. This
way of understanding value is consistent with a pragmatic perspective, because
it takes root in the realm of praxis in general and in a wide
field of evaluations of actions in general.
However, from this basic premise of the pragmatic notion of value, there is no difficulty in integrating all of the previously mentioned dimensions, because in current scientific-technological activity problems of all these types are in fact considered: ethical, political, etc. So a correct epistemological rule will have
18. This is the case, for example, of ECHEVERRIA, J., Ciencia y Valores, Destino, Barcelona, 2002, ch. I, pp. 29-116.
19. On the relevance of this position, see LAUDAN, L., Progress and Its Problems: Towards a Theory of Scientific Growth, University of California Press, Berkeley, 1977. On Laudan's philosophy, cf. GONZALEZ, W. J. (ed.), El Pensamiento de L. Laudan. Relaciones entre Historia de la Ciencia y Filosofía de la Ciencia, Publicaciones Universidad de A Coruña, A Coruña, 1998.
Philosophical Patterns of Rationality and Technological Change 195
value to the extent that it solves, in determined respects, the problem of the truth or epistemic validity of a statement, a law, or a scientific theory; a research procedure will have value to the extent that it solves methodological problems; and a political decision concerning a technological action will equally have value to the extent that it solves problems of innovation, development, and direction in a determined technological task. It is precisely the generality of the pragmatic notion of value that enables it to encompass all of the dimensions implicated in technological phenomena in general, without excluding any a priori, at least at the outset.
For this reason it is not reductive to propose a general analysis of contemporary technoscience in terms of an analysis of values, provided that the practical dimension is used in all of its breadth and demands.
Now, this does raise a problem of chief importance, whose analysis will provide the answer to the question raised at the end of the last paragraph and which will be developed in the last part of this contribution. It concerns the general problem of the assessment of values in techno-scientific processes, and therefore in the direction of technological change and its possible patterns of understanding.
From this position, and for the purpose of grasping in which direction and how to carry out technological change, it is necessary to proceed with an assessment of the implicated values. In other words, it is necessary to proceed to a consideration of the implicated values which would make it reasonable to decide for one or another technological objective. If it was said before that the rationality of technological change can be analyzed by means of a systemic model, then this evaluation can be carried out using, at least initially, the features of such a model. This immediately implies various important consequences.
In the first place, values should be selected as a function of a concrete problem, with the objective of focusing the evaluative analysis. Secondly, nothing identified as significant should be initially excluded, whatever its nature may be: epistemological, ethical, political, economic, etc. In formal terms, one could write that technological change, since it is composed of a system of technological actions, conforms to a system of values:

Sv = {v1, v2, ..., vn}
In the third place, it is necessary somehow to measure the set of values, that is to say, to quantify them to a certain extent. And this is where, without a doubt, the most serious problem arises. For is such a thing possible? Can one indeed measure a value? That depends on how a value is conceived. If one adopts a notion of value according to classic axiology, it will be rather difficult, because there a justified scale of values is internally produced by the ultimate transcendental "crown" from which the corresponding architecture is established. This logically involves a certain rigidity. All of this is internally coherent with such a point of departure.
In the case of a pragmatic vision of value, the guideline for measurement will be to assess to what extent and at what costs (of all kinds, not only in economic terms) the problem or problems are solved by using one or another system of values, Sv or S'v'. For example, it can be stated that Sv highly satisfies epistemological values but perhaps falls short in technological values (in other words, in specific operative efficacy), or that these latter values, in a specific technoscientific event, suppose an important decline in the presence of socio-economic values (or vice versa). In sum, the possible situations are multiple. For this reason, it all comes down to assessing and deciding which values are to be weighed with more attention in each technological event that we consider.
In this way we can even write inequalities of the kind "more than" or "less than" with respect to the values implicated in one direction or another.
In the fourth place, there are the political and economic values, which are usually intertwined and which, as has been indicated, are going to be continually present given the growing transformation of science into technoscience and of the previous scientific-technological praxis into a new form of praxis characterized as techno-scientific enterprise. In this respect, an unavoidable value is the best cost-benefit relationship, which, as before, will be tempered by the influence of other systemic values.
And in the fifth place, last but not least, there are the social-ethical values. Globally, the value of social well-being could be considered here as representative in its multiple dimensions, without, of course, reducing it to its material aspect. On the contrary, in this group moral values would acquire a specific consideration, including civic and religious values. For without including them, not only would there be an undesirable lack of control over technological potentiality in a strict sense, but sooner or later there may be an excessive reaction against technology itself, as occurs in certain social phenomena such as technophobia, which has perhaps been provoked by the attempt to impose technological initiatives and directions while overriding social-ethical values. This is not good for any of the implicated parties. Besides, in the case of technology, this would represent precisely a certification of operative inefficacy, which, from the philosophical perspective taken here, would be the worst thing that could happen.
Nonetheless, it is worthwhile to emphasize some final points, which will make up the still-provisional conclusion of this paper.
23. For example, in QUERALTÓ, R., Ética, Tecnología y Valores en la sociedad global, pp. 73-109 and 150-158.
24. Only as an example: "Wir haben nicht die Technik, sondern wir sind sie!" ("We do not have technology; rather, we are it!"), in SACHSSE, H., Technik und Verantwortung. Probleme der Technik im technischen Zeitalter, Rombach, Freiburg i.B., 1972, p. 49. We refer to this quotation because it belongs to one of the first German authors to analyse this subject, and it comes from an early work, which adds a special significance.
25. A well-accepted distinction between "technology" and "technique" is the following: technology is technique derived from and inspired by scientific knowledge, but not vice versa, because science is a historical phenomenon beginning in the 16th century, whereas other techniques existed before then. Nevertheless, at present it is possible to assert that every technique is also technology, owing to the unavoidable interconnection between them; thus both terms are used in a similar way.
26. An inquiry into this fact can be found in QUERALTÓ, R., Mundo, Tecnología y Razón en el fin de la Modernidad. ¿Hacia el hombre "more technico"?, PPU, Barcelona, 1993.
task is already being undertaken, especially in the institutional field,27 yet there is still a long way to go. What is most important is to draw attention to the fact that technology and technological change are not only the work of scientists and technologists. These professionals are not the only ones who apply technology; rather, it is made up of a complex productive system in which agents of varied natures take action, from technoscientists to managers, social leaders, etc. For this reason it is a matter of a collective effort from which society as a whole cannot remove itself nor be removed, because otherwise the entire technosocial system would suffer and everyone would lose out, including technology in the strictest sense.
In the academic world a reaction is also being produced along this line, as can be seen in the proliferation of studies about technological risk,28 the evaluation of technology, etc.29 These propose models for appraising the risk to be undertaken with one or another technology, and thereby for assuring reasonable guidelines for decision-making. Other studies follow the same line of inspiration, that is to say, the necessary connection between technology and society with all of its consequences. In general, the social control of the technological phenomenon occupies the central place.
The indisputable difficulty of this task should not arouse misgivings, even though, as we have recognized before, the intrinsic tendency of technology to impose and expand itself currently has the advantage, marking its own direction. Let us consider this situation almost "natural". In fact, when facing a new phenomenon, which is exactly what the appearance of technological power is, there is an initial phase of expectation and social displacement in general, in which the new phenomenon advances almost unstoppably. This is why positions of pessimism or impotence have arisen, and this is where technological determinism has first seemed plausible.
However, things do not have to be this way. I have tried to demonstrate and
justify this philosophically in these pages. Things are possible when they are
dispassionately proven to be so. I have attempted to make this point evident
27. For example, science and technology policies, joint working groups in ministries and parliaments, joint research projects, etc.
28. Cf. the work by LOPEZ CEREZO, J. A. and LUJAN, J. L., Ciencia y Política del riesgo, Alianza Editorial, Madrid, 2000, as well as the following quotation and the selected references at the end of this paper.
29. Cf. WINNER, L., Autonomous Technology: Technics-out-of-Control as a Theme in Political Thought, The MIT Press, Cambridge (MA), 1977; and WINNER, L., "Do Artifacts have Politics?," Daedalus, v. 109, (1980), pp. 121-136. In addition, Prof. Winner is promoting a "Center for Cultural Design" in Technology at the Rensselaer Polytechnic Institute in New York. His aim is to search for the necessary balance between technology and society, which is precisely the main conceptual background of our text.
Cf. also SHRADER-FRECHETTE, K., Science Policy, Ethics and Economic Methodology, Reidel, Boston, 1984; SHRADER-FRECHETTE, K., Risk Analysis and Scientific Method, Reidel, Dordrecht, 1985; and SHRADER-FRECHETTE, K., Risk and Rationality: Philosophical Foundations for Populist Reforms, University of California Press, Berkeley, 1991.
from our own research field, that is to say, the philosophical field, because it constitutes, consciously or unconsciously for the collectivity, another fundamental ingredient of our culture.
6. BIBLIOGRAPHY
AGAZZI, E., I sistemi fra scienza e filosofia, SEI, Torino, 1978.
AGAZZI, E., Il bene, il male e la scienza. Le dimensioni etiche della impresa
scientifico-tecnologica, Rusconi, Milan, 1992. (Spanish edition by R. Queraltó: El
bien, el mal y la Ciencia. Las dimensiones éticas de la empresa científico-tecnológica,
Tecnos, Madrid, 1996.)
AICHHOLZER, G. and SCHIENSTOCK, G. (eds.), Technology Policy: Towards an
Integration of Social and Ecological Concerns, W. De Gruyter, New York, 1994.
BARNES, B. and BLOOR, D., “Relativism, Rationalism and the Sociology of
Knowledge,” in HOLLIS, M. and LUKES, S. (eds.), Rationality and Relativism,
Blackwell, Oxford, 1982, pp. 21-47.
BECK, U., Risk Society: Towards a New Modernity, Sage, London, 1992.
BERTALANFFY, L. von, General Systems Theory, Braziller, New York, 1967.
BIJKER, W. E., HUGHES, T. P. and PINCH, T., The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology, The MIT Press, Cambridge (MA), 1987.
BUSTAMANTE, J., Sociedad informatizada, ¿Sociedad deshumanizada?, Gaia,
Madrid, 1993.
DOSI, G., "Technological Paradigms and Technological Trajectories: A Suggested Interpretation of the Determinants and Directions of Technological Change," Research Policy, v. 11, (1982), pp. 147-162.
DOSI, G., “Perspectivas de la teoría evolucionista,” in LOPEZ CEREZO, J. A.,
GONZALEZ, M. I. and LUJAN, J. L. (eds.), Ciencia, Tecnología y Sociedad, Ariel,
Barcelona, 1997, pp. 131-146.
ECHEVERRIA, J., Los Señores del Aire. Telépolis y el tercer entorno, Destino,
Barcelona, 2000.
ECHEVERRIA, J., Ciencia y Valores, Destino, Barcelona, 2002.
ECHEVERRIA, J., La revolución tecnocientífica, FCE, Madrid, 2003.
ELLUL, J., Le système technicien, Calmann-Lévy, Paris, 1977.
ELLUL, J., Le bluff technologique, Hachette, Paris, 1987.
GALILEI, G., Dialogo sui massimi sistemi ptolemaico e copernicano, in GALILEI,
G., Opere, ed. nazionale, a cura di A. Favaro, A. Garbasso, G. Abetti; Barbera,
Firenze, 1929-39, 20 vols., vol. VII.
GERGEN, K., The Saturated Self. Dilemmas of Identity in Contemporary Life,
Basic Books, New York, 1991.
ORTEGA Y GASSET, J., Meditaciones del Quijote, in ORTEGA Y GASSET, J., Obras
Completas, vol. I, Alianza Editorial, Madrid, 1983.
PICKERING, A. (ed.), Science as Practice and Culture, The University of Chicago
Press, Chicago, 1992.
POPPER, K. R., Conjectures and Refutations. The Growth of Scientific Knowledge,
Routledge and Kegan Paul, London, 1963.
PORTER, A. L. (ed.), A Guidebook for Technology Assessment and Impact Analysis,
North Holland, New York, 1980.
QUERALTÓ, R., Mundo, Tecnología y Razón en el fin de la Modernidad. ¿Hacia el
hombre “more technico”?, PPU, Barcelona, 1993.
QUERALTÓ, R., “Cómo introducir vectores éticos eficaces en el sistema científico-
tecnológico,” Arbor, v. 162, no. 638, (1999), pp. 221-240.
QUERALTÓ, R., Razionalità tecnica e mondo futuro. Una eredità per il terzo
millennio, Franco Angeli, Milan, 2002.
QUERALTÓ, R., “Ética y Sociedad tecnológica: pirámide y retícula,” Argumentos
de Razón técnica, v. 5, (2002), pp. 39-83.
QUERALTÓ, R., Ética, Tecnología y Valores en la sociedad global. El Caballo de
Troya al revés, Tecnos, Madrid, 2003.
QUERALTÓ, R., “Science as Technoscience: Values and Their Measurement,”
Actes du Colloque de l’Académie Internationale de Philosophie des Sciences, 2003,
Lecce (Italy), forthcoming.
RESCHER, N., Risk: A Philosophical Introduction to the Theory of Risk Evaluation and Management, University Press of America, Lanham, 1983.
RIP, A., MISA, TH. and SCHOT, J. (eds.), Managing Technology in Society. The Approach of Constructive Technology Assessment, Pinter, London, 1995.
SANMARTÍN, J., Tecnología y futuro humano, Anthropos, Barcelona, 1990.
SANMARTÍN, J. (ed.), Estudios sobre Sociedad y Tecnología, Anthropos, Barcelona,
1992.
SARTORI, G., Homo Videns, Laterza, Roma-Bari, 1997. (Spanish translation: Homo
Videns. La sociedad teledirigida, Taurus, Madrid, 1998).
SACHSSE, H., Technik und Verantwortung. Probleme der Technik im technischen Zeitalter, Rombach, Freiburg i.B., 1972.
SHRADER-FRECHETTE, K., Science Policy, Ethics and Economic Methodology, Reidel, Boston, 1984.
SHRADER-FRECHETTE, K., Risk Analysis and Scientific Method, Reidel, Dordrecht,
1985.
SHRADER-FRECHETTE, K., Risk and Rationality: Philosophical Foundations for
Populist Reforms, University of California Press, Berkeley, 1991.
SMITH, M. R. and MARX, L. (eds.), Does Technology Drive History? The Dilemma
of Technological Determinism, The MIT Press, Cambridge (MA), 1994.
POPPER, K. R., x, 17, 36, 59n, 61-63n, 66n, 75, 124, 131, 196, 204
PORTER, A. L., 204
PORTER, C., 43
POSTEL, S., 116n, 131
POTTER, R., 156
PRIBRAM, K., 156
PRICE, D., 48
PRICE, D. J. DE SOLLA, 8, 10n, 46, 156-157
PRIMACK, J., 121n, 132
PROCTOR, R. N., 156
PRYOR, J., 69n, 75
PYPKE, D., 52n, 75
QUADE, E. S., 66n, 75
QUERALTÓ, R., vi, 35-37, 179, 183n, 185n, 190n, 196n, 199n-200n, 202, 204
QUINE, W. V. O., 55, 60n, 76
QUINTANILLA, M., 115n
QUINTANILLA, M. A., 46
RACHELS, J., 120
RADER, K., 160n, 175
RADNITZKY, G., 46
RAFFENSPERGER, C., 92n, 105
RAIKKA, J., 117n, 131
RAPP, F., 18n, 38-39, 44, 46-48, 138n, 153-157
RASMUSSEN, N. C., 65n, 76, 164, 170-171, 175
RAWLS, J., 69n, 76
REAGAN, M. D., 156
REALE, M., 61n, 76
REICHENBACH, H., 17, 85
REICHHARDT, T., 64n, 76
REPETTO, R., 109n, 131
RESCHER, N., 5n, 10n, 20n-21n, 28n, 32n, 46-47, 66n, 76, 94, 105, 117n, 131, 152, 204
RESNIK, D. B., 24n, 47
RHEINBERGER, H. J., 160n, 165n, 170n, 172, 175
RICCI, P., 63n, 72
RIP, A., 59n, 76, 204
ROCHON, P., 121n, 131
RODRIGUEZ ALCAZAR, J., 92n, 105
ROMEO CASABONA, C. M., 44
ROMERO, A., 161n, 165n, 174-175
ROPOHL, G., 155, 203
ROSENBERG, N., 47
ROWE, J., 65n, 76
RUBEN, D. H., 21n, 47
RUDGE, D. W., 84n, 104
RUDMAN, R., 68n, 77
RUDNER, R., 60n, 76
SACHSSE, H., 200n, 204
SAHAL, D., 47
SALMON, M., 36
SALMON, W., 36
SAMUELS, S., 66n, 76
SANCHEZ RON, J. M., 165n, 175
SANGREN, P. S., 156
SANMARTÍN, J., 204
SANTESMASES, M. J., vi, 35-36, 159, 161n, 174-175
SANTOPIETRO, G., 116n, 132
SAREWITZ, D., 156
SARO-WIWA, K., 51-52
SARTORI, G., 185n, 204
SARTRE, J. P., 118-119, 131
SASSOWER, R., 47
SAYRE, K. M., 152, 157
SCARLOTT, J., 53n, 76
SCARPITI, F., 65n, 76
SCHÄFER, W., 47, 152, 154, 157
SCHAFFER, S., 47, 159n-160n, 174, 176
SCHARFF, R. C., 4n, 17n, 19n, 22n-23n, 29n, 40, 42, 45, 47, 49
SCHEFFLER, I., 63, 76
SCHIENSTOCK, G., 202
SCHNIEDER, T., 110n, 131
SCHOENHEIMER, R., 166
SCHOT, J., 204
SCHWAB, J., 115n, 131