Invited Paper
Ethics on the feedback loop
Roland Schinzinger*
Electrical and Computer Engineering Department, University of California, Irvine, USA
Received August 1996; in revised form December 1997
Abstract
The design, manufacture, and supervision of complex systems usually proceeds as an iterative activity involving multiple feedback
loops. During each iteration, as physical parameters are re-examined for possible changes, the system’s societal impacts should receive
as much attention as does the system's purely physical performance. © 1998 Elsevier Science Ltd. All rights reserved.
Keywords: Design and feedback; software safety; experiment; responsibility; engineering ethics
*Corresponding author. E-mail: [email protected]

1. Introduction

Designs of systems and devices are usually carried out in an iterative manner. Each iteration is an attempt to improve the performance of the product by modifying physical parameters, while at the same time forcing the satisfaction of imposed constraints. Usually, one cannot progress in an uninterrupted manner straight through the many stages involved in designing and manufacturing a product or system. The design phase includes conceptual design, definition of detailed goals and specifications, prototyping and preliminary testing, followed by detailed design and preparation of shop drawings. Manufacturing involves scheduling the manufacture of parts, purchasing materials and components, fabricating parts and subassemblies, and finally assembling and performance-testing the product. Selling is next (unless the product is delivered under contract), and thereafter either the manufacturer's or the customer's engineers perform maintenance, repair and geriatric service, and ultimately recycling or disposal.

Each stage presents engineers with a number of options, resulting in many possible ways of proceeding. So it is natural that engineers usually make an initial stab, stop along the way when they hit a snag or think of better solutions, and return to some earlier stage with improvements in mind. This retracing constitutes a feedback operation. Such a feedback path does not necessarily start and end at the same respective stages during subsequent passes through the design and production processes, because the retracing is governed by the latest findings from current results, tempered by the outcome of earlier iterations and experience with similar product designs.

All too often the engineer considers only the physical aspects of the product. The impacts of its use on the owner, user, community, and the natural environment are conveniently assumed to be covered by standards (often outdated in rapidly developing technologies), design specifications, or by "other specialists" somewhere in the organization. No wonder important side-effects of products are often not considered until either it is too late or the necessary changes become prohibitively expensive. Fortunately, the opportunity for improving the product's social and environmental impacts can occur during the iterations in the design and production-planning processes.

The integration of physical and societal considerations in the iterative process allows engineers to realize their professional obligations more fully. One can label this approach Ethics on the Feedback Loop. The idea of treating engineering ethics problems as if they were design problems was advanced by Caroline Whitbeck (1990) as a way to interest engineering students in ethical decision-making. Michael Rabins (Harris et al., 1995) emphasizes the role of feedback in this connection. It may be appreciated then that engineers, educated and used to applying design iterations, can readily apply this approach to address in a holistic manner not only concerns of purely
physical performance, but also concerns of an ethical nature. The process must, however, take into account that the properties of the materials as actually delivered, the shop procedures as actually carried out, and even the user's application of the final product, may not exactly coincide with what the designer had specified or expected. The need to examine such uncertainties and their effects on a product's actual performance will be examined next. It leads to the view of engineering as an experiment with human subjects (Martin and Schinzinger, 1996, Ch. 3).

2. Uncertainties and experimentation

An engineered product (device or system) results from the interaction of human activity, tools of manufacturing, and materials. Each of these is beset by uncertainties. The designer can never know all the pertinent physical laws and properties of materials, even in their ideal states. It is also difficult to foresee all the possible uses to which a customer may subject a product. Finally, the materials and parts used, as well as the manufacturing process, may have hidden defects. Testing may not detect them, just as testing of the final product does not tell how well it will withstand unforeseen stresses, particularly when a test does not anticipate a peculiar defect, or when it does not carry the product to destruction.

Moving on to the use of the product, the limitations of the product may not be clear to the original or later owners of the product. Similar gaps may exist in the knowledge of the end-user regarding suitable means of disposal. These are but a few of the many occurrences which can make a product less than useful, if not outright dangerous.

The significance of these uncertainties is enhanced when one regards engineering as an experimental activity, even though engineering usually lacks the involvement of control groups in the usual sense. In a restricted sense, however, one may even regard the design process itself as an experiment. Each iteration in the perfection of a design, whether one starts over again from an earlier design on paper, or after a simulation run or a prototype test, is an experiment which uses the preceding design as a "control" design against which one calibrates improvements in efficiency, cost, satisfaction of constraints, and achievement of goals.

The physical realization of a design through the manufacture and assembly of parts is likewise in many ways an experiment, as is the selling and commissioning of the final product. Here again, one must demand the close attention of the experimenters (the engineers). They ought to monitor the experiment (the product) throughout its life and terminate the experiment (for instance, by recalling the product) when safety can no longer be assured. As with any experiment involving human subjects, all those who could possibly be affected by the experiment should be contacted and afforded the opportunity to give or withhold their informed consent as participating subjects (Martin and Schinzinger, 1996, p. 84). The parties affected might be workers and testers on the shop floor, shareholders of producer and buyer, operators (say, pilots) and indirect users (airline passengers), or mere bystanders (those living below a flight path). And even after the product has ended its useful life, the health of people living next to recyclers, landfills, and incinerators must be considered.

In today's complex technology, few engineers could be expected to continually keep track of a product's many actual and possible uses and all the individuals affected, but the engineering-as-experiment paradigm serves as a reminder of what the engineer needs to keep in mind as the product design undergoes yet another iteration following preliminary design reviews and tests. The task of predicting all the ways in which a product may actually be misused by its owner is daunting, and no engineer can be expected to be responsible for all unforeseen applications. Section 8 will take up some special circumstances involving control engineering. The emphasis here is on at least taking the time to imagine possible system failures or misuses, and to provide generic safety and escape measures. What has been said so far will now be illustrated by means of a case study.

3. Case study: A medical electron accelerator

In the 1980s a series of tragic accidents resulted from the use of a new radiation-therapy machine, the Therac-25 medical electron accelerator. (See Jacky, 1989; Leveson and Turner, 1993; Rose, 1994; Peterson, 1995.) The Therac-25 is a dual-mode linear accelerator for therapeutic use. In mode "X" its electron beam is directed at a target at full output level (25 MeV) to produce X-rays for the treatment of deep-seated tumors. Shallow tissue treatments are carried out in the electron mode "E", where the beam is shaped and attenuated to variable energy levels of 5 to 25 MeV. In a third mode, an ordinary light beam and a mirror are used to simulate the treatment beam so the patient can be positioned properly. A turntable is employed to produce the desired treatment modes as follows: for mode X the turntable moves into the path of the electron beam a tungsten target, a cone to produce a uniform treatment field, and an ion chamber to measure the delivered X-ray dose. For mode E, it replaces the above by scanning magnets to shape the beam, and another ion chamber to measure the delivered dose of electrons. The position of the turntable is sensed by three microswitches and reported to the machine's computer.

By 1987, Atomic Energy of Canada Ltd. (AECL), the manufacturer of the Therac-25, had sold and installed six
R. Schinzinger/Control Engineering Practice 6 (1998) 239—245 241
units in Canada and five in the US. Some of the model 25s had apparently been functioning normally since 1983, with hundreds of patients treated, when the first known malfunction occurred in June 1985, resulting in a lawsuit filed five months later. By January 1987, six patients had suffered massive overdoses of radiation during treatment with Therac-25s. Three of these patients died as a consequence of overexposure. Others lingered in pain, and one underwent an otherwise avoidable hip replacement before dying of cancer.

The first incident occurred during radiation treatment following a lumpectomy to remove a malignant breast tumor. During the twelfth session, the patient felt intense heat. A dime-sized spot on her breast, along with a somewhat larger exit mark on her back, indicated penetration by an unmodified, strong electron beam. Tim Still, the radiation physicist at the hospital, queried AECL, only to be told that the Therac-25 could not possibly have produced the specified E-mode beam without attenuating and spreading it as required. The oncologist then prescribed continued treatment on the same Therac-25. When the cause of the initial burns was clearly identified as radiation burns due to one or more doses in the 15,000 to 20,000 rad range instead of the usual 200 rad, the patient initiated a lawsuit against the hospital and AECL. Eventually the patient had to have her breast removed, lost control of her shoulder and arm, and was in constant pain. Later she died in an automobile accident.

This type of machine malfunction was to occur again, but AECL was unable to replicate the events and delayed warning other Therac-25 users of the machine's apparently erratic behavior. According to Rose (1994), Dr. Still had discussed his early observations with colleagues in the profession, with the result that AECL had warned him not to spread unproven and potentially libelous information.

"…duplicate all the existing hardware safety mechanisms and interlocks. This approach is becoming more common as companies decide that hardware interlocks and backups are not worth the expense, or they put more faith (perhaps misplaced) on software than on hardware reliability."
(Leveson and Turner, 1993)

As it turned out, the malfunctions of the Therac-25 occurred because of one or more software errors. One arose because of certain race conditions that can accompany rapid input and changing of instructions for multitasking operations. Thus, when set-up data was entered via the computer terminal, the mode (X or E) had to be specified. Since X was the more common treatment mode, the Therac operators could be expected to mistakenly enter "X" somewhat by habit, even when an "E" was called for. But the operator could easily make a correction by hitting the up-arrow key and replacing the X by an E with a few keystrokes. AECL had actually introduced this correction feature in response to complaints by operators that starting all over again with data input after the error had been detected was too cumbersome.

When now a quick X-to-E correction was made by the operator, and when this was done within eight seconds after the full prescription had been entered, the X-ray target would be withdrawn properly, but the electron beam would already have been set by the computer to its maximum energy level to comply with the earlier X-ray command. Thus the patient would be subjected to an excessively powerful and concentrated electron beam. The timing was critical, the occurrences rare, and the cause only detected with difficulty by the hospital's medical physicist.
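The hazard just described can be sketched in a few lines of code. What follows is an illustrative sketch only, not AECL's actual software: the class, method names, and energy values are invented for the purpose of the example. It models a set-up task that latches the beam energy once, when the prescription is entered, while a quick operator edit overwrites the shared mode variable without causing the energy to be latched again.

```python
# Illustrative sketch (not the actual Therac-25 code) of the kind of
# shared-state hazard described above: an edit made after set-up has
# latched the beam energy changes the turntable but not the energy.

FULL_POWER = 25.0    # MeV, X-ray mode (value from the case description)
E_MODE_POWER = 10.0  # MeV, a hypothetical electron-mode setting

class TherapyConsole:
    def __init__(self):
        self.mode = None            # 'X' or 'E', as typed by the operator
        self.latched_energy = None  # set once by the set-up task
        self.turntable = None

    def enter_prescription(self, mode):
        # Set-up task: runs after data entry and latches the beam energy
        # from whatever mode is on screen at that moment.
        self.mode = mode
        self.latched_energy = FULL_POWER if mode == 'X' else E_MODE_POWER

    def quick_edit(self, mode):
        # Models the up-arrow correction: the shared mode variable is
        # overwritten, but the set-up task is NOT re-run (the bug).
        self.mode = mode

    def fire(self):
        # The turntable follows the current mode; the energy is whatever
        # was latched earlier.
        self.turntable = 'scanning magnets' if self.mode == 'E' else 'tungsten target'
        return self.turntable, self.latched_energy

console = TherapyConsole()
console.enter_prescription('X')  # habitual entry of the wrong mode
console.quick_edit('E')          # correction made within the unsafe window
table, energy = console.fire()
# Hazardous state: electron-mode turntable, X-ray-level energy still latched.
assert (table, energy) == ('scanning magnets', 25.0)
```

In this sketch the obvious software remedy is to re-run the set-up task on every edit; the deeper lesson drawn later in the paper is that a safety mechanism independent of the control computer and its program is needed as well.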
…the software is being written and installed. Thus it is always important to provide a safety mechanism that is truly independent of the control computer and its program. The mechanical interlocks used on the Therac-20 would be an example.

6. Safe exit

Let us now look at one more report of a radiation overdose from a Therac-25:

"[A] patient's treatment was to be a 22-MeV treatment of 180 rad over a 10×17 cm field, or a total of 6,000 rad over a period of 6½ weeks. After-the-fact simulations of the accident revealed possible doses of 16,500 to 25,000 rad in less than 1 second over an area of about 1×1 cm². He died from [horrible] complications of the overdose five months after the accident."
(Leveson and Turner, 1993)

Other aspects of this particular incident are noteworthy because the machine's operators had to rely on their own best guesses when abnormalities occurred. The natural tendency would often be to shrug off unusual machine behavior as "just another one of its quirks". But the consequences were serious. For instance, soon after hitting the proper key to begin treatment of the patient mentioned above, the machine stopped and the console displayed the messages "Malfunction 54 (dose input 2)" and "Treatment Pause". There was no explanation to be had of what kind of "Malfunction" this was, not even in the manuals, though later inquiry from the manufacturer indicated that "dose input 2" meant a wrong dose, either too low or too high. The "Pause" message meant a problem of low priority. The dose monitor showed only 6 units of radiation delivered, when 202 had been specified. The operator was used to erratic behavior of the machine, and since on earlier occasions the only consequences had been inconvenience, she continued the treatment. Soon she was horrified to hear the patient pound on the door of the treatment room. He had received what felt like two severe electric shocks, apparently one after each time the start button had been pushed. This was his ninth session, so he knew something was wrong.

Why did he have to get off the table and pound on the door? Because even the simplest of emergency cords or buttons or other "safe exits" were lacking (Martin and Schinzinger, 1996, p. 179). Instead, in the case under discussion, audio and video monitors were provided. As could be expected to happen at times, the audio monitor was broken and the video monitor had been unplugged, but the radiation treatment was conducted anyway.

More to the point is the general lack throughout the industry of accurate, reliable instruments that tell an operator the actual radiation dose delivered to the patient (Cheng and Kubo, 1988; Loyd et al., 1989). An assumed dose based on calculations involving the prescription and presumably correct machine settings, or a reading derived from a dosimeter that is not even exposed to the actually delivered radiation, is not sufficient. Direct-reading, well-calibrated dosimeters are a vital element in the feedback of information from the treatment table to the operator. After all,

"[The] dose monitoring system is the last level of protection between the patient and the extremely high dose rate which all accelerators are capable of producing".
(Galbraith et al., 1990)

As reported by Rose (1994), staff at a hospital in Toronto installed a dose-per-pulse monitor that could measure the radiation delivered by a beam and in a fraction of a second shut down the machine. In case of serious mishaps, corrective action can be undertaken that much sooner when proper instrumentation is available. Equally important is the presence of experts at potentially life-threatening treatment or work sites. Three Mile Island, Bhopal, and Chernobyl have shown that good measurements and the presence of on-site experts who can evaluate the data are the sine qua non of safety and the avoidance of disaster (Martin and Schinzinger, 1996, pp. 168–170). Beyond that, a "safe exit" directly accessible to the patient (or, in general, the "ultimate subject of the experiment") must be provided.

7. The regulatory agency

It is appropriate to introduce here another line of defense, one for early prevention, even though it may appear to be remote from the engineer and the user, and that is the regulatory agency. When the early Therac-25 accidents occurred, it was (in the United States) up to the manufacturer to report malfunctions of radiotherapy equipment to the US Food and Drug Administration (FDA). The user was not obliged to do so. Those users who did call the FDA, if only to learn what experience other cancer centers might have had with the Therac-25, could not find out anything, because AECL had not yet reported. Industry engineers, fearful of giving away trade secrets or of inviting law suits, are too often reluctant to share information on malfunctions. Recall procedures are in force in the health equipment industry, but time lags often hinder effective implementation. This state of affairs should remind us that as responsible engineers we should assist the regulatory process and improve it, instead of labeling it a nuisance and trying to impede it. At the same time, it must be recognized that regulations per se often do not keep up with changes in technology and that adherence to regulations may also lead to merely minimal compliance. What regulations — and laws in
…has not been thought of by the task assigners falls between the cracks or is written off as something that can be postponed. Worse yet, if a safety problem originates in a different department, or if it could be readily remedied there, it is very difficult to obtain quick resolutions via official, interdepartmental channels. Direct, personal links among engineers work much better.

Given the specificity of today's engineering tasks, engineers and their managers may want to consider other attributes necessitated by the very complication of the modern work environment. Such strengths as integrity, foresightedness, autonomous ethical decision-making, and moral courage are essential to the character of the responsible engineer.

Good managers avoid fragmentation of the workplace, but where rivalries between departments or agencies are great, it persists. The story of the Challenger disaster, for instance, tells how the several engineering groups and their counterparts at NASA sites felt differently about risks in general, and about proceeding with the shuttle's launch under problematic conditions in particular (Feynman, 1988). Engineers have to muster moral courage if they want to overcome such barriers and stand up for what is the correct thing to do: for example, to protect the right of the owner or operator of a product not to be harmed by its use, or at least to issue timely warnings of potential hazards to those most directly affected (including the captain and crew of a ship or space shuttle). The ability to do that is the true mark of professionalism and a test of a professional's integrity.

The concept of integrity is important to good engineering in several contexts. The design process must exhibit integrity, in that it must recognize that no single function can be governed well without consideration of all other functions. The product itself must exhibit integrity, in that it must function well with respect to several attributes, such as efficiency, economy, safety, and reliability, while at the same time being environmentally benign and aesthetically pleasing. It would also be difficult to associate the concept of integrity with a product for killing or maiming humans, such as land mines, or a solvent to be exported knowingly for use in preparing illicit cocaine. Finally, integrity should be recognized as a main attribute of character, the character of the engineer. Integrity implies that the engineer as a matter of habit feels accountable for her or his work, and therefore exhibits the usual attributes of ethical conduct. Another important attribute of character is foresight, the ability and the effort to look ahead. It quite naturally leads to the exercise of caution, an indispensable ingredient of responsible experimentation.

Foresight should encompass the whole range of technological activities, from design to recycling and the effects of external influences. It also means that it is not right for engineers to disregard any problematic design feature or manufacturing process, even if they do not themselves control it. Hoping that there is some other engineer down the line of production who has been designated to check for errors is not sufficient, because even if there is, the error may still be overlooked again. It is necessary to personally bring such cases to the attention of colleagues and superiors. They, in turn, must be receptive to such reports of concern. If a recall is necessary, whether from the design office, the shop floor, or the customer's premises, it had best occur as early as possible!

An ethical corporate climate helps, of course. But it is particularly important not to fall into the trap of thinking that an organization could rely on conveniently legalistic compliance strategies. These appear to be much favored by lawyers and executives, who can then lay sole blame for organizational failures on individuals who have supposedly acted contrary to the organization's rules, whether these actually promote ethical behavior or not. Specific compliance rules are suitable only in very structured settings such as purchasing and contracting. Generally, a philosophically based ethics strategy is more effective for the many "experimentally" based, open-ended functions. Such a strategy is based on autonomous ethical decision making, not on mere observance of laws and regulations. This strategy is also the best way to impress on engineers that responsibility for success or failure is mostly not divisible, on the job or elsewhere in life.

There is another problem that employed engineers face, and that has to do with the fact that managers and engineers may differently interpret information coming by way of the feedback loop. This was pointed out at the IFAC Congress by panelist Mike Martin (1996), who gave the Challenger case as an example. O-rings that should seal segments of the booster rockets had shown signs of erosion after prior launchings at low temperatures. A redesign of the rocket was already underway, but so far a recall had not been issued. The engineers asked that the spaceship should not be launched at the very low temperatures that were expected at the launch site. Management (also engineers, but wearing their "management hats") interpreted the experiences with prior launches differently: no launches had failed, and as long as failure could not be forecast with certainty, the planned launch should proceed. The engineers were not prepared for such a response, nor were they uniformly firm in standing up to management pressure. So the "experiment" was carried out, apparently without the commander and the crew of the Challenger being notified of the situation: an example of "experimentation without consent of the subjects involved".

10. Conclusion

The title Ethics on the Feedback Loop had the purpose of drawing attention to the possibility of acting on ethical