
Professional, Ethical, Legal and
Societal Implications of
Computer Systems
Errors, Failures & Risks
Week 6
Sources: A Gift of Fire, 3rd edition, Sara Baase, Chapter 8
Computer Ethics, 3rd edition, Deborah Johnson, Chapter 7
What We Will Cover
 Failures and Errors in Computer Systems
 Case Study: The Therac-25
 Increasing Reliability and Safety
 Accountability and ICT
 Key Concepts:
 Accountability
 Types of Responsibility
 Negligence
 Software: Product or Service?
 Dependence, Risk, and Progress
Failures and Errors in
Computer Systems
 Most computer applications are so complex it is
virtually impossible to produce programs with no
errors
 A failure is often the result of more than one factor
 Computer professionals must study failures to learn
how to avoid them
 Computer professionals must study failures to
understand the impact of poor-quality work
Failures and Errors in
Computer Systems (cont.)
Individual Problems:
 Billing errors
 Inaccurate and misinterpreted data in databases
 In a large population, many people may share the same name
 Automated processing may not be able to recognize special
cases
 Overconfidence in the accuracy of data
 Errors in data entry
 Lack of accountability for errors
Failures and Errors in
Computer Systems (cont.)
System Failures:
 AT&T, Amtrak, NASDAQ
 Businesses have gone bankrupt after spending
huge amounts on computer systems that failed
 Voting systems in the 2000 presidential election
 Denver Airport baggage system
 Ariane 5 rocket
Failures and Errors in
Computer Systems (cont.)
Denver Airport:
 The baggage system failed due to real-world problems,
problems in other systems, and software errors
 Main causes:
 Time allowed for development was insufficient
 Denver made significant changes in specifications after the
project began
Failures and Errors in
Computer Systems (cont.)

High-level Causes of Computer-System Failures:
 Lack of clear, well-thought-out goals and specifications
 Poor management and poor communication among
customers, designers, programmers, etc.
 Pressures that encourage unrealistically low bids, low budget
requests, and underestimates of time requirements
 Use of very new technology, with unknown reliability and
problems
 Refusal to recognize or admit a project is in trouble
Failures and Errors in
Computer Systems (cont.)
Safety-Critical Applications:
 A-320: "fly-by-wire" airplanes (many systems are controlled
by computers rather than directly by the pilots)
 Between 1988 and 1992, four of these planes crashed
 Air traffic control is extremely complex, and includes computers
on the ground at airports, devices in thousands of airplanes,
radar, databases, communications, and so on - all of which must
work in real time, tracking airplanes that move very fast
 In spite of problems, computers and other technologies have
made air travel safer
Case Study: The Therac-25
Therac-25 Radiation Overdoses:
 Massive overdoses of radiation were given; the
machine said no dose had been administered at all
 Caused severe and painful injuries and the death of
three patients
 Important to study to avoid repeating errors
 Manufacturer, computer programmer, and
hospitals/clinics all have some responsibility
Case Study: The Therac-25
(cont.)
Software and Design problems:
 Re-used software from older systems, unaware of
bugs in previous software
 Weaknesses in design of operator interface
 Inadequate test plan
 Bugs in software
 Allowed the beam to turn on when the turntable was not in the
proper position (see the sketch after this list)
 Ignored changes and corrections operators made at
console
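
To make the turntable interlock failure above concrete, here is a minimal sketch in Python. It is illustrative only, not the actual Therac-25 code (which was written in PDP-11 assembly), and the mode and position names are invented; the point is that safe software re-reads and verifies the physical turntable position immediately before enabling the beam, rather than trusting an earlier, possibly stale, reading.

# Hypothetical illustration only -- not the actual Therac-25 software.
# The beam may be enabled only after the turntable position is re-read
# and checked against the selected treatment mode.

X_RAY_MODE = "x-ray"        # high-energy mode: the target must sit in the beam path
ELECTRON_MODE = "electron"  # low-energy mode: the scanning magnets must be in place

def beam_permitted(selected_mode, turntable_position):
    """Return True only if the turntable matches the selected mode."""
    required_position = {
        X_RAY_MODE: "target",
        ELECTRON_MODE: "scan_magnets",
    }
    return turntable_position == required_position.get(selected_mode)

def fire_beam(selected_mode, read_turntable_position):
    # Re-check the physical position immediately before firing,
    # rather than trusting an earlier (possibly stale) reading.
    position = read_turntable_position()
    if not beam_permitted(selected_mode, position):
        raise RuntimeError("Interlock: turntable not in position for "
                           + selected_mode + " mode; beam not fired")
    print("Beam enabled in " + selected_mode + " mode")

# A stale or incorrect reading is rejected by the software interlock.
try:
    fire_beam(X_RAY_MODE, read_turntable_position=lambda: "scan_magnets")
except RuntimeError as err:
    print(err)

In the real machine, a race condition between the operator's keyboard editing and the machine set-up, together with an overflowing one-byte counter, allowed this kind of mismatch to go undetected.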
Case Study: The Therac-25
(cont.)
Why So Many Incidents?
 Hospitals had never seen such massive overdoses before and were
unsure of the cause
 The manufacturer said the machine could not have caused the
overdoses and that no other incidents had been reported (which was
untrue)
 After the second accident, the manufacturer made changes to the
turntable and claimed they had improved safety; the changes did
not correct any of the causes identified later
Case Study: The Therac-25
(cont.)
Why So Many Incidents? (cont.)
 Recommendations were made for further changes
to enhance safety; the manufacturer did not
implement them
 The FDA declared the machine defective after the
fifth accident
 The sixth accident occurred while the FDA was
negotiating with the manufacturer on what changes
were needed
Case Study: The Therac-25
(cont.)
Observations and Perspective:
 Minor design and implementation errors usually occur in complex
systems; they are to be expected
 The problems in the Therac-25 case were not minor and suggest
irresponsibility
 Accidents occurred on other radiation treatment equipment
without computer controls when the technicians:
 Left a patient after treatment started to attend a party
 Did not properly measure the radioactive drugs
 Confused microcuries and millicuries
Case Study: The Therac-25
Discussion Question
 If you were a judge who had to assign responsibility
in this case, how much responsibility would you
assign to the programmer, the manufacturer, and the
hospital or clinic using the machine?
Increasing Reliability and Safety
What goes Wrong?
 Design and development problems
 Management and use problems
 Misrepresentation, hiding problems and inadequate
response to reported problems
 Insufficient market or legal incentives to do a better
job
 Re-use of software without sufficiently
understanding the code and testing it
 Failure to update or maintain a database
Increasing Reliability and
Safety (cont.)
Professional techniques:
 Importance of good software engineering and professional
responsibility
 User interfaces and human factors
 Feedback
 Should behave as an experienced user expects
 Workload that is too low can lead to mistakes
 Redundancy and self-checking (sketched below)
 Testing
 Include real-world testing with real users
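
To make "redundancy and self-checking" concrete, here is a minimal Python sketch. The dose limit, tolerance, and function names are invented for illustration and not taken from any real system: the same result is computed in two independent ways, cross-checked, and range-checked against a hard safety limit before it is accepted.

# Minimal illustration of redundancy and self-checking (invented example).
# A dose is computed by two independent routines; it is accepted only if
# both agree and the result stays within a hard safety limit.

MAX_SAFE_DOSE = 200.0   # invented hard limit, arbitrary units
TOLERANCE = 1e-6        # maximum allowed disagreement between the two paths

def dose_primary(prescribed, calibration):
    return prescribed * calibration

def dose_redundant(prescribed, calibration):
    # Deliberately computed a different way (repeated addition), so a bug
    # in one path is unlikely to produce the same wrong answer in the other.
    total = 0.0
    for _ in range(int(prescribed)):
        total += calibration
    total += (prescribed - int(prescribed)) * calibration
    return total

def checked_dose(prescribed, calibration):
    a = dose_primary(prescribed, calibration)
    b = dose_redundant(prescribed, calibration)
    if abs(a - b) > TOLERANCE:
        raise RuntimeError("Self-check failed: redundant calculations disagree")
    if a < 0 or a > MAX_SAFE_DOSE:
        raise RuntimeError("Self-check failed: dose outside the safe range")
    return a

print(checked_dose(prescribed=50, calibration=1.5))   # prints 75.0

The design intent is that a single coding error is unlikely to corrupt both computation paths in the same way, and the hard range check catches gross errors even if it does.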
Increasing Reliability and
Safety (cont.)
Law, Regulation and Markets:
 Criminal and civil penalties
 Provide incentives to produce good systems, but should not
inhibit innovation
 Warranties for consumer software
 Most are sold ‘as-is’
 Regulation for safety-critical applications
 Professional licensing
 Arguments for and against
 Taking responsibility
Accountability and ICT
(Computer Ethics, Johnson)
 Accountability – identifies the person or organization that is
the appropriate agent to respond (i.e., to give
an account) in a situation
 4 types of responsibility:
 Role-responsibility
 Causal responsibility
 Blame-worthiness
 Liability
Accountability and ICT
 Role-responsibility: interchangeable with duty –
refers to what an individual is expected to do in a
social role, e.g. as an employee, parent, citizen,
friend, or professional
 Some role responsibilities are codified, e.g. paying
tax as a citizen. Most role responsibilities are
assumed based on a shared social-cultural
understanding. They are often not explicitly articulated
until someone fails to fulfill a responsibility.
 Roles entail responsibilities that may not come into
play until a special situation arises, e.g. a parent
becomes ill, a brother goes bankrupt
Accountability and ICT
 Causal responsibility – a person is responsible
because that individual did something (or failed to
do something)
 Attributed to nonhuman events as well as
persons, e.g. a hurricane blew the roof off a house
 In most situations, the event is the result of a
multitude of factors, but a single factor is picked out
as 'the cause'
 N.B. software disasters are often due to more than one
party (individuals and/or organizations)
Accountability and ICT
 Blameworthiness – doing something wrong
or being at fault
 A person who caused a situation is not
necessarily blameworthy
 A financial advisor who is causally responsible for
a pension fund losing its value in the context of a
stock market crash may not be considered
blameworthy
 Madoff, who defrauded investors, is blameworthy
Accountability and ICT
 Liability – the legal treatment of certain events
 To be legally liable means a person must pay
damages or compensation when certain situations
occur
 If blameworthy, usually liable, e.g. car accidents
 One can be liable without being blameworthy or
causally responsible
 E.g. liability associated with home ownership: if
someone is injured at your house, you could be
liable to pay for treatment and any loss of wages due
to the injury
Accountability and ICT
 Legal liability is often, though not always, tied to one
of the other senses of responsibility
 Strict liability is liability 'without fault' – a person
or organization may be required to pay damages
even though they did nothing wrong
 Controversial because it runs counter to a common moral
intuition that individuals shouldn't be held responsible if
they could not prevent or control what happens
 Used in law to encourage individuals to take precautions to
make things safe
 An intriguing concept when applied to software
Accountability and ICT
 Computer case – illustrating the 4 types of responsibility:
 A vendor sells a software package to customer X and gives
inaccurate information about what it can do. Customer X has employees
install the package; they use it for a while and then it crashes. The crash
permanently destroys data and disrupts customer X's business
for some weeks, leading to a considerable loss of income.
 The software vendor may be liable to refund the price of the
software and compensate for losses, depending on the three
other types of responsibility:
 If the vendor gave incorrect information about the software package, it failed
to fulfill its role-responsibility
 The vendor is blameworthy if it knew the truth and withheld it (dishonesty)
 The vendor is causally responsible if the disruption of business was not
caused by something else, e.g. employees not knowing how to use the
software or not paying attention to the vendor's instructions
Software: Product or Service?
 The creation of software brings legal challenges
 Software challenged patent and copyright law
 Software challenged liability law

 In many legal systems, selling products is
treated differently from the provision of services,
so a conceptual puzzle arises as to whether
software should be treated as a product or a
service.
Software: Product or Service?
3 different ways that software can be sold:
 Mass-marketed (like buying a ready-to-wear suit)
 Designed for a specific user (like a tailor-made suit)
 Buying a software system and modifying it to meet the
company's requirements (like altering a ready-to-wear
suit)
 Prince (1980) recommends:
 Treat mass-marketed software as a product
 Treat customized software as a service
 Treat the third case as mixed (product and service)
Software: Product or Service?
 If software is treated as product, strict liability can be
imposed (liability without fault). Is this reasonable?
In what scenarios?
 From a utilitarian perspective, there are 3
factors that justify strict liability:
 The item is placed in the stream of commerce to earn a profit, and as
such the producer should bear the risk of loss or injury
 The producer is in a better position than anyone else to anticipate
and control the risks
 The producer is in the best position to spread the cost of injury over
all buyers by building it into the price of the product. As such, the
burden of injury is spread across all those who buy
Software: Product or Service?
 Strict liability does not make sense for customized
software because:
 It is not placed in the stream of commerce, since it is designed
for a particular client by request
 The producer of customized software may not be in the best
position to determine the risks of the software or to distribute
the costs of paying damages
 Strict liability makes sense for mass-marketed
software because all 3 factors apply:
 Placed in the stream of commerce
 Producer is in a good position to assess risks
 Producer is able to distribute costs to buyers of the software
Software: Product or Service?
 This analysis is adequate for civil law
 Holding individuals or companies strictly liable can have
good effects – knowing that they could pay damages
when software is faulty, they are more likely to take
precautions to release safe and reliable software
 A different analysis is needed for criminal cases
 If people were placed in jail for releasing faulty
products, the quality of software would likely
increase, but it might have the negative effect of reducing the
number of people willing to put software into the stream of
commerce
Accountability and ICT
 Negligence – failure to do something that a prudent person
would have done
 Presumes a standard of behavior for individuals engaged in a
particular activity
 Describes blameworthy behavior, judged against the standards of the group
 E.g., software engineers should design software that does not
fail under routine conditions
 Negligence is difficult to determine when role-responsibility is unclear,
e.g. what counts as testing according to current standards
 How up-to-date should a software engineer be to be considered
competent?
 To prosecute for negligence, prevailing standards must be
identifiable, and members of the professional group are then called
to testify in a court of law about these standards in the field
Accountability and ICT
 Parallels between reliable software, automobile
safety and adequate health care
 Often an expert knows how to do things above and
beyond the standards, but the knowledge is not put into use
because of cost or risk
 Experts are not considered derelict when they opt not to
do everything possible to make software more
reliable, cars safer, or health care more extensive.
In all three cases, trade-offs are involved.
 Yet diligence in meeting standards is important.
 Legal and moral responsibility are not the same.
Diffusion of Accountability
 3 factors contribute to the diffusion of accountability:
 The scale and complexity of computer systems
 The "many hands" involved in developing, distributing and
using computer systems
 The way in which computers sometimes mediate human
decision-making
 Who is responsible when systems fail?
 Computers (inanimate objects) cannot be moral agents
 Human beings can be assigned responsibility using the
different senses of responsibility:
 role-responsibility
 causal responsibility
 blameworthiness
 liability
Diffusion of Accountability
 Moral responsibility – humans are responsible for
the actions they choose because of autonomy (the ability to
think, decide, and act based on one's own decisions)
 With large computer systems, since no single individual
built the system (and thus no one person is likely to
understand in detail what is going on in the system),
more than one person or organization is likely to be
responsible
 Software disasters/failures may be due to errors in many roles:
 Design errors (which may or may not be compounded by)
 Coding errors
 Documentation errors
 Advertising errors
Dependence, Risk, and
Progress
 Are We Too Dependent on Computers?
 Computers are tools
 They are not our only dependence, e.g. we also
depend on electricity
 Risk and Progress
 Many new technologies were not very safe when they were first
developed
 We develop and improve new technologies in response to
accidents and disasters
 We should compare the risks of using computers with the risks of
other methods, and with the benefits to be gained
Dependence, Risk, and Progress
Discussion Questions

 Do you believe we are too dependent on
computers? Why or why not?
 In what ways are we safer due to new
technologies?
