
Control Engineering Practice 6 (1998) 239—245

Invited Paper
Ethics on the feedback loop
Roland Schinzinger*
Electrical and Computer Engineering Department, University of California, Irvine, USA
Received August 1996; in revised form December 1997

Abstract

The design, manufacture, and supervision of complex systems usually proceeds as an iterative activity involving multiple feedback
loops. During each iteration, as physical parameters are re-examined for possible changes, the system’s societal impacts should receive
as much attention as does the system's purely physical performance. © 1998 Elsevier Science Ltd. All rights reserved.

Keywords: Design and feedback; software safety; experiment; responsibility; engineering ethics

*Corresponding author. E-mail: [email protected]
PII: S0967-0661(98)00007-0

1. Introduction

Designs of systems and devices are usually carried out in an iterative manner. Each iteration is an attempt to improve the performance of the product by modifying physical parameters, while at the same time forcing the satisfaction of imposed constraints. Usually, one cannot progress in an uninterrupted manner straight through the many stages involved in designing and manufacturing a product or system. The design phase includes conceptual design, definition of detailed goals and specifications, prototyping and preliminary testing, followed by detailed design and preparation of shop drawings. Manufacturing involves scheduling the manufacture of parts, purchasing materials and components, fabricating parts and subassemblies, and finally assembling and performance-testing the product. Selling is next (unless the product is delivered under contract), and thereafter either the manufacturer's or the customer's engineers perform maintenance, repair and geriatric service, and ultimately recycling or disposal.

Each stage presents engineers with a number of options, resulting in many possible ways of proceeding. So it is natural that engineers usually make an initial stab, stop along the way when they hit a snag or think of better solutions, and return to some earlier stage with improvements in mind. This retracing constitutes a feedback operation. Such a feedback path does not necessarily start and end at the same respective stages during subsequent passes through the design and production processes, because the retracing is governed by the latest findings from current results, tempered by the outcome of earlier iterations and experience with similar product designs.

All too often the engineer considers only the physical aspects of the product. The impacts of its use on the owner, user, community, and the natural environment are conveniently assumed to be covered by standards (often outdated in rapidly developing technologies), design specifications, or by "other specialists" somewhere in the organization. No wonder important side-effects of products are often not considered until either it is too late or the necessary changes become prohibitively expensive. Fortunately, the opportunity for improving the product's social and environmental impacts can occur during the iterations in the design and production-planning processes.

The integration of physical and societal considerations in the iterative process allows engineers to realize their professional obligations more fully. One can label this approach Ethics on the Feedback Loop. The idea of treating engineering ethics problems as if they were design problems was advanced by Caroline Whitbeck (1990) as a way to interest engineering students in ethical decision-making. Michael Rabins (Harris et al., 1995) emphasizes the role of feedback in this connection. It may be appreciated then that engineers, educated and used to applying design iterations, can readily apply this approach to address in a holistic manner not only concerns of purely
physical performance, but also concerns of an ethical nature. The process must, however, take into account that the properties of the materials as actually delivered, the shop procedures as actually carried out, and even the user's application of the final product, may not exactly coincide with what the designer had specified or expected. The need to examine such uncertainties and their effects on a product's actual performance will be examined next. It leads to the view of engineering as an experiment with human subjects (Martin and Schinzinger, 1996, Ch. 3).

2. Uncertainties and experimentation

An engineered product (device or system) results from the interaction of human activity, tools of manufacturing, and materials. Each of these is beset by uncertainties. The designer can never know all the pertinent physical laws and properties of materials, even in their ideal states. It is also difficult to foresee all the possible uses to which a customer may subject a product. Finally, the materials and parts used, as well as the manufacturing process, may have hidden defects. Testing may not detect them, just as testing of the final product does not tell how well it will withstand unforeseen stresses, particularly when a test does not anticipate a peculiar defect, or when it does not carry the product to destruction.

Moving on to the use of the product, the limitations of the product may not be clear to the original or later owners of the product. Similar gaps may exist in the knowledge of the end-user regarding suitable means of disposal. These are but a few of the many occurrences which can make a product less than useful, if not outright dangerous.

The significance of these uncertainties is enhanced when one regards engineering as an experimental activity, even though engineering usually lacks the involvement of control groups in the usual sense. In a restricted sense, however, one may even regard the design process itself as an experiment. Each iteration in the perfection of a design, whether one starts over again from an earlier design on paper, or after a simulation run or a prototype test, is an experiment which uses the preceding design as a "control" design against which one calibrates improvements in efficiency, cost, satisfaction of constraints, and achievement of goals.

The physical realization of a design through the manufacture and assembly of parts is likewise in many ways an experiment, as is the selling and commissioning of the final product. Here again, one must demand the close attention of the experimenters (the engineers). They ought to monitor the experiment (the product) throughout its life and terminate the experiment (for instance, by recalling the product) when safety can no longer be assured. As with any experiment involving human subjects, all those who could possibly be affected by the experiment should be contacted and afforded the opportunity to give or withhold their informed consent as participating subjects (Martin and Schinzinger, 1996, p. 84). The parties affected might be workers and testers on the shop floor, shareholders of producer and buyer, operators (say, pilots) and indirect users (airline passengers), or mere bystanders (those living below a flight path). And even after the product has ended its useful life, the health of people living next to recyclers, landfills, and incinerators must be considered.

In today's complex technology, few engineers could be expected to continually keep track of a product's many actual and possible uses and all the individuals affected, but the engineering-as-experiment paradigm serves as a reminder of what the engineer needs to keep in mind as the product design undergoes yet another iteration following preliminary design reviews and tests. The task of predicting all the ways in which a product may actually be misused by its owner is daunting, and no engineer can be expected to be responsible for all unforeseen applications. Section 8 will take up some special circumstances involving control engineering. The emphasis here is on at least taking the time to imagine possible system failures or misuses, and to provide generic safety and escape measures. What has been said so far will now be illustrated by means of a case study.

3. Case study: A medical electron accelerator

In the 1980s a series of tragic accidents resulted from the use of a new radiation-therapy machine, the Therac-25 medical electron accelerator. (See Jacky, 1989; Leveson and Turner, 1993; Rose, 1994; Peterson, 1995.) The Therac-25 is a dual-mode linear accelerator for therapeutic use. In mode "X" its electron beam is directed at a target at full output level (25 MeV) to produce X-rays for the treatment of deep-seated tumors. Shallow tissue treatments are carried out in the electron mode "E", where the beam is shaped and attenuated to variable energy levels of 5 to 25 MeV. In a third mode, an ordinary light beam and a mirror are used to simulate the treatment beam so the patient can be positioned properly. A turntable is employed to produce the desired treatment modes as follows: for mode X the turntable moves into the path of the electron beam a tungsten target, a cone to produce a uniform treatment field, and an ion chamber to measure the delivered X-ray dose. For mode E, it replaces the above by scanning magnets to shape the beam, and another ion chamber to measure the delivered dose of electrons. The position of the turntable is sensed by three microswitches and reported to the machine's computer.

By 1987, Atomic Energy of Canada Ltd. (AECL), the manufacturer of the Therac-25, had sold and installed six
units in Canada and five in the US. Some of the model 25s had apparently been functioning normally since 1983, with hundreds of patients treated, when the first known malfunction occurred in June 1985, resulting in a lawsuit filed five months later. By January 1987, six patients had suffered massive overdoses of radiation during treatment with Therac-25s. Three of these patients died as a consequence of overexposure. Others lingered in pain, and one underwent an otherwise avoidable hip replacement before dying of cancer.

The first incident occurred during radiation treatment following a lumpectomy to remove a malignant breast tumor. During the twelfth session, the patient felt intense heat. A dime-sized spot on her breast, along with a somewhat larger exit mark on her back, indicated penetration by an unmodified, strong electron beam. Tim Still, the radiation physicist at the hospital, queried AECL, only to be told that the Therac-25 could not possibly have produced the specified E-mode beam without attenuating and spreading it as required. The oncologist then prescribed continued treatment on the same Therac-25. When the cause of the initial burns was clearly identified as radiation burns due to one or more doses in the 15,000 to 20,000 rad range instead of the usual 200 rad, the patient initiated a lawsuit against the hospital and AECL. Eventually the patient had to have her breast removed, lost control of her shoulder and arm, and was in constant pain. Later she died in an automobile accident.

This type of machine malfunction was to occur again, but AECL was unable to replicate the events and delayed warning other Therac-25 users of the machine's apparently erratic behavior. According to Rose (1994), Dr. Still had discussed his early observations with colleagues in the profession, with the result that AECL had warned him not to spread unproven and potentially libelous information.

4. Software errors

AECL was still a crown corporation of the Canadian government when it collaborated with the French company CGR in the design and manufacture of the earlier 6-MeV linear accelerator Therac-6 and the 20-MeV model Therac-20. Both were based on older CGR machines. Later, AECL developed Therac-25 on its own. The earlier CGR machines relied on manual setups and hardware interlocks. Some computer control was added to these Theracs to make them more user-friendly, but the hardware was still capable of standing alone. Therac-25, however, was designed with computer control in mind from the ground up. As Leveson and Turner write,

"AECL took advantage of the computer's ability to control and monitor the hardware and decided not to duplicate all the existing hardware safety mechanisms and interlocks. This approach is becoming more common as companies decide that hardware interlocks and backups are not worth the expense, or they put more faith (perhaps misplaced) on software than on hardware reliability." (Leveson and Turner, 1993)

As it turned out, the malfunctions of the Therac-25 occurred because of one or more software errors. One arose because of certain race conditions that can accompany rapid input and changing of instructions for multitasking operations. Thus, when set-up data was entered via the computer terminal, the mode (X or E) had to be specified. Since X was the more common treatment mode, the Therac operators could be expected to mistakenly enter "X" somewhat by habit, even when an "E" was called for. But the operator could easily make a correction by hitting the up-arrow key and replacing the X by an E with a few keystrokes. AECL had actually introduced this correction feature in response to complaints by operators that starting all over again with data input after the error had been detected was too cumbersome.

When now a quick X-to-E correction was made by the operator, and when this was done within eight seconds after the full prescription had been entered, the X-ray target would be withdrawn properly, but the electron beam would already have been set by the computer to its maximum energy level to comply with the earlier X-ray command. Thus the patient would be subjected to an excessively powerful and concentrated electron beam. The timing was critical, the occurrences rare, and the cause only detected with difficulty by the hospital's medical physicist.

5. Lack of a mechanical interlock

Another malfunction mode based on a software error arose from the manner in which the turntable position was sensed and digitized. The error was inadvertently introduced when AECL attempted to guard against unintended turntable positions following early reports of radiation overdoses (Leveson and Turner, 1993; Rose, 1994). When a counter was introduced to determine and report to the computer the changing position of the turntable in transit, the counter would reset itself to zero upon reaching 256, apparently a value considered high by the programmer.

A similar reset problem is now said to worry many businesses and agencies whose immense accounting and planning programs unthinkingly instruct computers to reset time to zero as 1999 ends, rather than going on to year 2000. Such a mistake will not directly affect a human life, but it does point to the difficulty of spotting errors as
the software is being written and installed. Thus it is always important to provide a safety mechanism that is truly independent of the control computer and its program. The mechanical interlocks used on the Therac-20 would be an example.

6. Safe exit

Let us now look at one more report of a radiation overdose from a Therac-25:

"[A] patient's treatment was to be a 22-MeV treatment of 180 rad over a 10×17 cm field …, or a total of 6,000 rad over a period of 6½ weeks. … After-the-fact simulations of the accident revealed possible doses of 16,500 to 25,000 rad in less than 1 second over an area of about 1×1 cm². He died from [horrible] complications of the overdose five months after the accident." (Leveson and Turner, 1993)

Other aspects of this particular incident are noteworthy because the machine's operators had to rely on their own best guesses when abnormalities occurred. The natural tendency would often be to shrug off unusual machine behavior as "just another one of its quirks". But the consequences were serious. For instance, soon after hitting the proper key to begin treatment of the patient mentioned above, the machine stopped and the console displayed the messages "Malfunction 54 (dose input 2)" and "Treatment Pause". There was no explanation to be had what kind of "Malfunction" this was, not even in the manuals, though later inquiry from the manufacturer indicated that "dose input 2" meant a wrong dose, either too low or too high. The "Pause" message meant a problem of low priority. The dose monitor showed only 6 units of radiation delivered, when 202 had been specified. The operator was used to erratic behavior of the machine, and since on earlier occasions the only consequences had been inconvenience, she continued the treatment. Soon she was horrified to hear the patient pound on the door of the treatment room. He had received what felt like two severe electric shocks, apparently one after each time the start button had been pushed. This was his ninth session, so he knew something was wrong.

Why did he have to get off the table and pound on the door? Because even the simplest of emergency cords or buttons or other "safe exits" were lacking (Martin and Schinzinger, 1996, p. 179). Instead, in the case under discussion, audio and video monitors were provided. As could be expected to happen at times, the audio monitor was broken and the video monitor had been unplugged, but the radiation treatment was conducted anyway.

More to the point is the general lack throughout the industry of accurate, reliable instruments that tell an operator the actual radiation dose delivered to the patient (Cheng and Kubo, 1988; Loyd et al., 1989). An assumed dose based on calculations involving the prescription and presumably correct machine settings, or a reading derived from a dosimeter that is not even exposed to the actually delivered radiation, is not sufficient. Direct-reading, well-calibrated dosimeters are a vital element in the feedback of information from the treatment table to the operator. After all,

"[The] dose monitoring system is the last level of protection between the patient and the extremely high dose rate which all accelerators are capable of producing". (Galbraith et al., 1990)

As reported by Rose (1994), staff at a hospital in Toronto installed a dose-per-pulse monitor that could measure the radiation delivered by a beam and in a fraction of a second shut down the machine.

In case of serious mishaps, corrective action can be undertaken that much sooner when proper instrumentation is available. Equally important is the presence of experts at potentially life-threatening treatment or work sites. Three Mile Island, Bhopal, and Chernobyl have shown that good measurements and the presence of on-site experts who can evaluate the data are the sine qua non of safety and the avoidance of disaster (Martin and Schinzinger, 1996, pp. 168—170). Beyond that, a "safe exit" directly accessible to the patient (or, in general, the "ultimate subject of the experiment") must be provided.

7. The regulatory agency

It is appropriate to introduce here another line of defense, one for early prevention, even though it may appear to be remote from the engineer and the user, and that is the regulatory agency. When the early Therac-25 accidents occurred, it was (in the United States) up to the manufacturer to report malfunctions of radiotherapy equipment to the US Food and Drug Administration (FDA). The user was not obliged to do so. Those users who did call the FDA, if only to learn what experience other cancer centers might have had with the Therac-25, could not find out anything, because AECL had not yet reported. Industry engineers, fearful of giving away trade secrets or of inviting law suits, are too often reluctant to share information on malfunctions. Recall procedures are in force in the health equipment industry, but time lags often hinder effective implementation. This state of affairs should remind us that as responsible engineers we should assist the regulatory process and improve it, instead of labeling it a nuisance and trying to impede it. At the same time, it must be recognized that regulations per se often do not keep up with changes in technology and that adherence to regulations may also lead to merely minimal compliance. What regulations — and laws in
general — can do is to give responsible engineers the support they need to correct or stop projects with clear safety problems that have been left unaddressed.

8. The control engineer

Human ingenuity has created control systems of great variety for centuries. At an earlier time such systems would automatically position windmills to face the wind, cause patterns to be woven into fabrics, or govern the speed of steam engines. Now computer technology makes it possible to control the operations of very large systems such as complete electric generating stations and the networks connecting them, or chemical process plants, robotic manufacture, and jumbo airplanes.

The growth of complexity places ever greater responsibilities on all engineers for the failure-free operation of their systems. This is no small task because it can be said that as systems grow in size, they almost invite failures. Charles Perrow (1984) labeled them "normal accidents". They can happen when even one small equipment error is propagated throughout the system by the myriad interconnections among its many components, or when one of the myriad instructions of the control computer's software is the wrong one under unforeseen circumstances. Add operator error, and one can appreciate the difficulties of keeping accidents from happening. Very much to their credit, control engineers have a good track record of safety.

Nevertheless, it will still be claimed that many of the frightening disasters such as at Chernobyl or Bhopal could have been avoided by further introduction of automation and less reliance on fallible human operators. Such arguments detract from the real reasons that these accidents turned into disasters: lack of foresight to prepare for accidents and to allow for safe exits of those exposed to danger in such situations. In Bhopal even the local authorities had not seen fit to institute evacuation plans, despite the high density of population near the plant and the repeated public warnings by a concerned local journalist, Rajkumar Keswani, before the catastrophe (Tempest, 1984).

The plea for more automation also overlooks the fact that human beings are still involved in the design, manufacture, and surveillance of such complex systems. What happened with the Therac-25 is not much different from occurrences elsewhere. For example, consider the following typical occurrences in selected areas:

Instrumentation: Faulty, inaccurate, or misleading, causing operators to disregard readings.

Control computer: Software errors, data inflow rate too high, or failure of electronic circuits.

Design of equipment: Unforeseen ambient temperatures or corrosion.

Instead of adding fixes upon fixes to systems so that they supposedly cannot fail, it is better to have on hand capable operators who can handle an emergency, who are backed up by built-in overrides, and to have safe exits for those who might get hurt. It is the engineer's and plant manager's ethical obligation to see that such are in place. The Therac cases illustrate these points. It could be argued that proper training of operators would suffice, but it is often observed that after good classes for the first crew, later newcomers are given but cursory training, frequently just by members of an earlier crew. And how often are operating manuals really updated and kept in place, even at the urging of the manufacturer? Or, even if they are, how clear are the instructions? Has the damage been done by the time the right place in the manual has been found? What about the decision lag which so frequently besets operators faced with shutting down a system when doing so unnecessarily may bring blame?

Such operational questions do not usually spring to mind during the design and implementation phases, but they ought to, especially when the consequences can be life-threatening. A safe product or system is ultimately one that users and bystanders can safely jettison or escape from. Engineers responsible for safety and reliability should similarly be reminded that even if the theoretical probability of system failure is low, one must still allow for the possibility of failure that could lead to loss of life, livelihood, and investment. (The engineer's or the engineering firm's reputation is also at risk — a matter of prudent self-interest that encourages adequate insurance coverage but does not replace all personal responsibility.) Failure mitigation introduced during design usually costs much less than will later retrofits. In the implementation phase there is also the need to alert local authorities of possible malfunctions and their effects, especially when there are poisonous chemicals involved. Fire crews need to know what they will encounter and how to douse fires; the police need to be prepared with evacuation plans; nearby hospitals need to know the treatment protocols.

9. Ethics in the workplace

Engineers want to act professionally, and they know what that means: stay competent, deliver quality work, be honest. Upon giving further thought to their responsibilities, reading up on — or taking a course in — engineering ethics, or better yet, just engaging colleagues in conversation on the topic, they may add to the list: promote the public good, do not let one's product harm people or the environment. It is not easy to fulfill these obligations as an employed engineer. A supposedly efficient division of labor has led to narrow tasks, and what
has not been thought of by the task assigners falls between the cracks or is written off as something that can be postponed. Worse yet, if a safety problem originates in a different department, or if it could be readily remedied there, it is very difficult to obtain quick resolutions via official, interdepartmental channels. Direct, personal links among engineers work much better.

Given the specificity of today's engineering tasks, engineers and their managers may want to consider other attributes necessitated by the very complication of the modern work environment. Such strengths as integrity, foresightedness, autonomous ethical decision-making, and moral courage are essential to the character of the responsible engineer.

Good managers avoid fragmentation of the workplace, but where rivalries between departments or agencies are great, it persists. The story of the Challenger disaster, for instance, tells how the several engineering groups and their counterparts at NASA sites felt differently about risks in general, and about proceeding with the shuttle's launch under problematic conditions in particular (Feynman, 1988). Engineers have to muster moral courage if they want to overcome such barriers and stand up for what is the correct thing to do: for example, to protect the right of the owner or operator of a product not to be harmed by its use, or at least to issue timely warnings of potential hazards to those most directly affected (including the captain and crew of a ship or space shuttle). The ability to do that is the true mark of professionalism and a test of a professional's integrity.

The concept of integrity is important to good engineering in several contexts. The design process must exhibit integrity, in that it must recognize that no single function can be governed well without consideration of all other functions. The product itself must exhibit integrity, in that it must function well with respect to several attributes, such as efficiency, economy, safety, and reliability, while at the same time being environmentally benign and aesthetically pleasing. It would also be difficult to associate the concept of integrity with a product for killing or maiming humans, such as land mines, or a solvent to be exported knowingly for use in preparing illicit cocaine. Finally, integrity should be recognized as a main attribute of character — the character of the engineer. Integrity implies that the engineer as a matter of habit feels accountable for her or his work, and therefore exhibits the usual attributes of ethical conduct. Another important attribute of character is foresight, the ability and the effort to look ahead. It quite naturally leads to the exercise of caution, an indispensable ingredient of responsible experimentation.

Foresight should encompass the whole range of technological activities, from design to recycling and the effects of external influences. It also means that it is not right for engineers to disregard any problematic design feature or manufacturing process, even if they do not themselves control it. Hoping that there is some other engineer down the line of production who has been designated to check for errors is not sufficient, because even if there is, the error may still be overlooked again. It is necessary to personally bring such cases to the attention of colleagues and superiors. They, in turn, must be receptive to such reports of concern. If a recall is necessary — from the design office, the shop floor, or the customer's premises — it had best occur as early as possible!

An ethical corporate climate helps, of course. But it is particularly important not to fall into the trap of thinking that an organization could rely on conveniently legalistic compliance strategies. These appear to be much favored by lawyers and executives, who can then lay sole blame for organizational failures on individuals who have supposedly acted contrary to the organization's rules, whether these actually promote ethical behavior or not. Specific compliance rules are suitable only in very structured settings such as purchasing and contracting. Generally, a philosophically based ethics strategy is more effective for the many "experimentally" based, open-ended functions. Such a strategy is based on autonomous ethical decision making, not on mere observance of laws and regulations. This strategy is also the best way to impress on engineers that responsibility for success or failure is mostly not divisible, on the job nor elsewhere in life.

There is another problem that employed engineers face, and that has to do with the fact that managers and engineers may differently interpret information coming by way of the feedback loop. This was pointed out at the IFAC Congress by panelist Mike Martin (1996) who gave the Challenger case as an example. O-rings that should seal segments of the booster rockets had shown signs of erosion after prior launchings at low temperatures. A redesign of the rocket was already underway, but so far a recall had not been issued. The engineers asked that the spaceship should not be launched at the very low temperatures that were expected at the launch site. Management (also engineers, but wearing their "management hats") interpreted the experiences with prior launches differently: no launches had failed, and as long as failure could not be forecast with certainty, the planned launch should proceed. The engineers were not prepared for such a response, nor were they uniformly firm in standing up to management pressure. So the "experiment" was carried out, apparently without the commander and the crew of the Challenger being notified of the situation — an example of "experimentation without consent of the subjects involved".

10. Conclusion

The title Ethics on the Feedback Loop had the purpose of drawing attention to the possibility of acting on ethical
convictions, not only by truth-telling after a problem has arisen, but every time one examines a design, a production process, or a sales venture. Engineers constantly draw on their own, or the organization's, experience. This very activity demonstrates the existence of a feedback loop in learning. Feedback should be used not only for improvement in strictly technical matters, but also in learning about their social implications. Ethics and strength of character are critical in putting this learning to use. Several texts discuss ethical decision making in the engineering milieu, e.g., (Harris et al., 1995; Unger, 1994; Martin and Schinzinger, 1996). Here there is just enough space to provide a summary through a sampling of pithy expressions:

"Ask not only can it be done, but also should it be done" (Wujek, 1996)

"The world needs engineers with the moral courage to speak the truth"

"Character counts!" (Michael Josephson of the Joseph and Edna Josephson Institute of Ethics, Marina del Rey, CA, U.S.A.)

"Ethics is not for wimps" (Michael Josephson)

Acknowledgements

Michael J. Rabins of Texas A and M University invited the author to address the subject of engineering ethics at the 13th World Congress of the International Federation of Automatic Control in San Francisco, 1996. M.G. Rodd, editor of Control Engineering Practice, suggested a publishable version. This paper was begun while the author was still at Meiji Gakuin University (MGU) in Japan as a visiting professor and director of an Education Abroad Program of the University of California on MGU's Totsuka campus. The reviewers of the paper made helpful suggestions. The author thanks all of the above for their help.

References

Cheng, P., Kubo, H., 1988. Unexpectedly large dose rate dependent output from a linear accelerator. Med. Phys. 15(5), 766–767.

Feynman, R.P., as told to Ralph Leighton, 1988. What Do You Care What Other People Think? W. W. Norton and Co., New York.

Galbraith, D.M., Martell, E.S., Fueurstake, T., Norrlinger, B., Schwendener, H., Rawlinson, J.A., 1990. A dose-per-pulse monitor for a dual-mode medical accelerator. Med. Phys. 17(3), 470–473, May/June.

Harris, C.E., Jr., Pritchard, M.S., Rabins, M.J., 1995. Engineering Ethics. Wadsworth Publishing Co., Belmont, CA.

Jacky, J., 1989. Programmed for disaster. Sciences 29(5), 22–27, Sep/Oct.

Leveson, N.G., Turner, C., 1993. An investigation of the Therac-25 accidents. Computer (IEEE), July, pp. 18–41.

Loyd, M., Chow, H., Laxton, J., Rosen, I., Lane, R., 1989. Dose delivery error detection by a computer-controlled linear accelerator. Med. Phys. 16(1), 137–139, Jan/Feb.

Martin, M.W., 1996. Integrating engineering ethics and business ethics. Presented on panel for the topic Ethics on the Feedback Loop at the 13th IFAC World Congress, San Francisco.

Martin, M.W., Schinzinger, R., 1996. Ethics in Engineering, 3rd ed. McGraw-Hill Book Co., New York.

Perrow, C., 1984. Normal Accidents: Living with High-Risk Technologies. Basic Books, New York.

Peterson, I., 1995. Fatal Defect: Chasing Killer Computer Bugs. Times Books (Random House), New York.

Rose, B.W., 1994. Fatal dose. Saturday Night, Toronto, Canada, June, pp. 24+. Also in Social Issues Resource Series, vol. 4, #28.

Tempest, R., 1984. India Plant Safety Report Had Warned of Gas Leak. Los Angeles Times, Dec. 11, p. 1.

Unger, S.H., 1994. Controlling Technology: Ethics and the Responsible Engineer, 2nd ed. John Wiley and Sons, New York.

Whitbeck, C., 1990. Ethics as Design: Doing Justice to Moral Problems. Texas A&M Center for Biotechnology Policy and Ethics, 1692-1. See also quotes in Weil, V., 1992. Engineering Ethics in Education, p. 3. Center for the Study of Ethics in the Professions, Illinois Inst. of Tech., Chicago.

Wujek, J.H., 1996. Panelist on the topic Ethics on the Feedback Loop at the 13th IFAC World Congress, San Francisco.