Cyberethics is the study of moral, legal, and social issues arising from cybertechnology, which encompasses various computing and communication devices. This chapter outlines the evolution of cybertechnology through four phases, highlighting associated ethical concerns such as privacy, intellectual property, and the impact of the Internet. It also discusses differing perspectives on cyberethics, including professional, philosophical, and sociological approaches to understanding and addressing these issues.
What Is Cyberethics?
◼ Cyberethics is the study of moral, legal, and social issues involving cybertechnology.
◼ As a field of applied ethics, it:
➢ examines the impact that cybertechnology has for our social, legal, and moral systems;
➢ evaluates the social policies and laws that we frame in response to issues generated by the development and use of cybertechnology.

What Is Cybertechnology?
◼ Cybertechnology refers to a wide range of computing and communications devices – from standalone computers, to "connected" or networked computing and communications technologies, to the Internet itself.
◼ Cybertechnologies include:
➢ digital electronic devices;
➢ networked computers (including servers, desktops, laptops, etc.);
➢ stand-alone computers.

Cybertechnology (Continued)
◼ Networked devices can be connected directly to the Internet.
◼ They can also be connected to other devices through one or more privately owned computer networks.
◼ Privately owned networks include both:
➢ Local Area Networks (LANs);
➢ Wide Area Networks (WANs).

Why the Term Cyberethics?
◼ Cyberethics is a more accurate label than computer ethics, which can suggest the study of ethical issues limited either to:
a) computing machines;
b) computing professionals.
◼ Cyberethics is also more accurate than Internet ethics, which is limited to ethical issues affecting only networked computers and devices.

The Evolution of Cybertechnology and Cyberethics: Four Phases
◼ Computer technology emerged in the late 1940s, when some analysts confidently predicted that no more than six computers would ever need to be built.
◼ The first phase of computing technology (1950s and 1960s) consisted mainly of huge mainframe computers that were unconnected (i.e., stand-alone machines).
◼ One ethical/social question that arose during Phase 1 dealt with the impact of computing machines as "giant brains" and what that meant for being human.
◼ Another question raised during this phase concerned privacy threats and the fear of Big Brother.

The Evolution of Cybertechnology and Cyberethics (Continued)
◼ In Phase 2 (1970s and 1980s), computing machines and communications devices began to converge.
◼ Mainframe computers and personal computers could be linked together via privately owned networks, which generated three kinds of ethical/social issues:
1) privacy concerns (introduced in Phase 1) were exacerbated because confidential information could easily be exchanged between networked databases;
2) intellectual property issues emerged because personal computers could easily be used to duplicate and exchange proprietary software programs;
3) computer crime emerged because "hackers" could break into the computers of large organizations.

The Evolution of Cybertechnology and Cyberethics (Continued)
◼ During Phase 3 (1990-present), the availability of Internet access to the general public has increased significantly.
◼ This has been facilitated by the phenomenal growth of the World Wide Web.
◼ The proliferation of Internet- and Web-based technologies in this phase has raised ethical and social concerns affecting:
➢ free speech,
➢ anonymity,
➢ jurisdiction.

The Evolution of Cybertechnology and Cyberethics (Continued)
◼ In Phase 4 (present to near future), "Web 2.0" has made possible the proliferation of social networking sites (SNSs), such as Facebook and Twitter.
◼ As cybertechnology continues to evolve in Phase 4, computers will likely become more and more a part of who or what we are as human beings.
➢ For example, Moor (2005) notes that computing devices will soon be a part of our clothing, and even our bodies.
◼ Computers are already becoming ubiquitous, and are beginning to "pervade" both our work and recreational environments.
◼ Objects in these environments already exhibit what Brey (2005) calls "ambient intelligence," which enables "smart objects" to be connected via wireless technology.

The Evolution of Cybertechnology and Cyberethics (Continued)
◼ In Phase 4, computers are becoming less visible as distinct entities, as they:
a) continue to be miniaturized and integrated into ordinary objects;
b) blend unobtrusively into our surroundings.
◼ Cybertechnology is also becoming less distinguishable from other technologies as the boundaries that previously separated them begin to blur because of convergence.

The Evolution of Cybertechnology and Cyberethics (Continued)
◼ Additional ethical/social concerns associated with Phase 4 include controversies made possible by the following kinds of technologies:
➢ autonomous machines and sophisticated robots (used in warfare, transportation, care for the elderly, etc.);
➢ nanocomputing and nano-scale devices;
➢ artificial agents (including "soft bots") that act on behalf of humans and corporations;
➢ AI-induced bionic chip implants (which can cause us to question what it means to be human vs. cyborg).

Table 1-1: Summary of Four Phases of Cyberethics
Phase 1 (1950s-1960s). Technological features: stand-alone mainframe computers. Associated issues: the impact of computing machines as "giant brains," privacy threats, and the fear of Big Brother.
Phase 2 (1970s-1980s). Technological features: minicomputers and PCs interconnected via privately owned networks. Associated issues: issues from Phase 1 plus concerns involving intellectual property and software piracy, computer crime, privacy, and the exchange of records.
Phase 3 (1990s-present). Technological features: Internet and World Wide Web. Associated issues: issues from Phases 1 and 2 plus concerns about free speech, anonymity, legal jurisdiction, virtual communities, etc.
Phase 4 (present to near future). Technological features: convergence of information and communication technologies with nanotechnology research, bioinformatics research, etc. Associated issues: issues from Phases 1-3 plus concerns about artificial agents ("bots") with decision-making capabilities, AI-induced bionic chip implants, nanocomputing, pervasive computing, etc.

Are Any Cyberethics Issues Unique Ethical Issues?
◼ Review the mobile-phone hacking incident (Scenario 1-1 in the textbook).
◼ Is there anything new or unique, from an ethical point of view, about the issues that emerge in this scenario?
◼ On the one hand, some high-profile celebrities were harassed in ways that were not possible in the pre-Internet era.
◼ But are any new or unique ethical issues generated in this scenario?

Debate about the Uniqueness of Cyberethics Issues
◼ There are two points of view on whether cybertechnology has generated any new or unique ethical issues:
1. Traditionalists argue that nothing is new – crime is crime, and murder is murder.
2. Uniqueness proponents argue that cybertechnology has introduced (at least some) new and unique ethical issues that could not have existed before computers.

The Uniqueness Debate (Continued)
◼ Both sides seem correct on some claims, and both seem to be wrong on others.
◼ Traditionalists underestimate the role that issues of scale and scope play in the impact of computer technology.
➢ For example, cyberbullies can bully multiple victims simultaneously (scale) and globally (because of the scope or reach of the Internet).
➢ Cyberbullies can also operate without ever having to leave the comfort of their homes.

The Uniqueness Debate (Continued)
◼ Those who defend the uniqueness thesis tend to overstate the effect that cybertechnology has on ethics per se.
◼ Maner (2004) correctly points out that computers are uniquely fast, uniquely malleable, etc.
◼ So, there may indeed be some unique aspects of computer technology.

The Uniqueness Debate (Continued)
◼ Proponents of the uniqueness thesis tend to confuse unique features of computer technology with unique ethical issues.
◼ Their argument is based on a logical fallacy (schematized in the logic sketch below):
Premise 1. Cybertechnology has some unique technological features.
Premise 2. Cybertechnology generates some ethical issues.
Conclusion. (At least some of the) ethical issues generated by cybertechnology must be unique.

The Uniqueness Debate (Continued)
◼ Traditionalists and uniqueness advocates are each partly correct.
◼ Traditionalists correctly point out that no new ethical issues have been introduced by computers.
◼ Uniqueness proponents are correct in that cybertechnology has complicated our analysis of traditional ethical issues.

The Uniqueness Debate (Continued)
◼ So, in analyzing the issues involved in this debate, it is useful to distinguish between any:
➢ unique technological features;
➢ (alleged) unique ethical issues.
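One way to make the fallacy noted above explicit is to schematize the argument in predicate-logic form. The rendering below is an illustrative reconstruction rather than the textbook's own notation; the predicate names (Feature, Issue, Unique, GeneratedBy) are chosen here only for exposition. Uniqueness is asserted of technological features in the premises but of ethical issues in the conclusion, so the conclusion does not follow from the premises.

\[
\begin{array}{ll}
\text{Premise 1:} & \exists f\,\big(\mathrm{Feature}(f,\textit{cybertech}) \wedge \mathrm{Unique}(f)\big)\\
\text{Premise 2:} & \exists e\,\big(\mathrm{Issue}(e) \wedge \mathrm{GeneratedBy}(e,\textit{cybertech})\big)\\
\hline
\text{Conclusion:} & \exists e\,\big(\mathrm{Issue}(e) \wedge \mathrm{GeneratedBy}(e,\textit{cybertech}) \wedge \mathrm{Unique}(e)\big)
\end{array}
\]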
The Uniqueness Debate (Continued)
◼ Consider Scenarios 1-2 and 1-3 (in the textbook), which involve:
a) computer professionals responsible for designing the software code for a controversial computer system;
b) ordinary users making unauthorized copies of proprietary software.
◼ Are any of the ethical issues that arise in either scenario unique ethical issues?

Alternative Strategy for Analyzing the Uniqueness Issue
◼ Moor (2000) argues that computer technology generates "new possibilities for human action" because computers are logically malleable.
◼ Logical malleability in computers means that they can be molded in ways that allow for many different kinds of uses.
◼ Some of the unanticipated uses of computers have introduced policy vacuums.

Policy Vacuums and Conceptual Muddles
◼ Policy vacuums are "voids" or gaps in our laws and policies.
◼ One solution might seem simply to fill the voids with new or revised policies.
◼ But some policy vacuums cannot easily be filled because of conceptual muddles.
◼ In these cases, conceptual muddles first need to be elucidated before clear policies can be formulated and justified.

A Policy Vacuum in Duplicating Software
◼ Consider again Scenario 1-3 (in the textbook) involving the duplication of software.
◼ In the early 1980s, there were still no clear laws regarding the duplication of software programs, which had been made easy because of the availability of personal computers.
◼ Because there were no clear rules for copying programs, a policy vacuum arose.
◼ Before the policy vacuum could be filled, a conceptual muddle had to be elucidated: What, exactly, is software?

Cyberethics as a Branch of Applied Ethics
◼ Applied ethics, unlike theoretical ethics, examines "practical" ethical issues.
◼ It analyzes moral issues from the vantage point of one or more ethical theories.
◼ Ethicists working in fields of applied ethics are more interested in applying ethical theories to the analysis of specific moral problems than in debating the ethical theories themselves.

Cyberethics as a Branch of Applied Ethics (Continued)
◼ Three distinct perspectives of applied ethics (as applied to cyberethics):
➢ Professional Ethics;
➢ Philosophical Ethics;
➢ Sociological/Descriptive Ethics.

Perspective #1: Cyberethics as a Branch of Professional Ethics
◼ According to this view, the purpose of cyberethics is to identify and analyze issues of ethical responsibility for computer/information technology (IT) professionals.
◼ Consider a computer professional's role in designing, developing, and maintaining computer hardware and software systems.
◼ Suppose a programmer discovers that a software product she has been working on is about to be released for sale to the public, even though it is unreliable because it contains "buggy" software.
◼ Should she "blow the whistle"?

Professional Ethics
◼ Gotterbarn (1995) has suggested that computer ethics issues are professional ethics issues.
◼ Computer ethics, for Gotterbarn, is similar to medical ethics and legal ethics, which are tied to issues involving specific professions.
◼ He notes that computer ethics issues aren't, strictly speaking, about technology per se.
➢ For example, he points out that we don't have automobile ethics, airplane ethics, etc.

Some Criticisms of the Professional Ethics Perspective
◼ Is Gotterbarn's model for computer ethics too narrow for cyberethics?
◼ Consider that cyberethics issues affect not only computer professionals; they affect virtually everyone.
◼ Before the widespread use of the Internet, Gotterbarn's professional-ethics model may have been adequate.

Perspective #2: Philosophical Ethics
◼ From this perspective, cyberethics is a field of philosophical analysis and inquiry that goes beyond professional ethics.
◼ Moor (2000) defines computer ethics as:
"...the analysis of the nature and social impact of computer technology and the corresponding formulation and justification of policies for the ethical use of such technology." [Italics added.]

Philosophical Ethics Perspective (Continued)
◼ Moor argues that automobile and airplane technologies did not affect our social policies and norms in the same kinds of fundamental ways that computer technology has.
◼ Automobile and airplane technologies have revolutionized transportation, resulting in our ability to travel faster and farther than was possible in previous eras.
◼ But they did not have the same impact on our legal and moral systems as cybertechnology.

Philosophical Ethics: Standard Model of Applied Ethics
◼ Brey (2004) describes the "standard methodology" used by philosophers in applied ethics research as having three stages:
1) Identify a particular controversial practice as a moral problem.
2) Describe and analyze the problem by clarifying concepts and examining the factual data associated with that problem.
3) Apply moral theories and principles to reach a position about the particular moral issue.

Perspective #3: Cyberethics as a Field of Sociological/Descriptive Ethics
◼ The professional and philosophical perspectives both illustrate normative inquiries into applied ethics issues.
◼ Normative inquiries or studies are contrasted with descriptive studies.
◼ Descriptive (and sociological) investigations report about "what is the case."
◼ Normative inquiries evaluate situations from the vantage point of the question: "What ought to be the case?"

Sociological/Descriptive Ethics Perspective (Continued)
◼ Review Scenario 1-4 (in the textbook) involving the impact of the introduction of a new technology on a community's workforce.
◼ Suppose that a new technology, Technology X, displaces 8,000 workers in Community Y.
◼ If we analyze the issues solely in terms of their sociological dimension, including the number of jobs that were gained or lost in that community, our investigation would be essentially descriptive in nature.

Some Benefits of Using the Sociological/Descriptive Approach
◼ Huff and Finholt (1994) claim that when we understand the descriptive aspect of the social effects of technology, the normative ethical issues become clearer.
◼ The descriptive/sociological perspective can prepare us for our subsequent (normative) analysis of the ethical issues that affect our system of policies and laws.

Table 1-2: Summary of Cyberethics Perspectives
Professional perspective. Associated disciplines: Computer Science, Engineering, Library/Information Science. Issues examined: professional responsibility, system reliability/safety, codes of conduct.
Philosophical perspective. Associated disciplines: Philosophy, Law. Issues examined: privacy and anonymity, intellectual property, free speech.
Sociological/Descriptive perspective. Associated disciplines: Sociology, Behavioral Sciences. Issues examined: the impact of cybertechnology on governmental, financial, and educational institutions and on socio-demographic groups.

A "Disclosive" Method for Cyberethics
◼ Brey (2004) believes that because of embedded biases in cybertechnology, the standard applied-ethics methodology is not adequate for identifying cyberethics issues.
➢ For example, Brey notes that we might fail to notice certain features embedded in the design of cybertechnology.
➢ Using the standard model, we might also fail to recognize that certain practices involving cybertechnology can have moral implications.

Disclosive Method (Continued)
◼ Brey points out that one weakness of the "standard method of applied ethics" is that it tends to focus on known moral controversies.
◼ So that model fails to identify practices involving cybertechnology that have moral implications but are not yet known.
◼ Brey refers to these practices as having morally opaque (or morally non-transparent) features, which he contrasts with "morally transparent" features.

Figure 1-2: Embedded Technological Features Having Moral Implications
◼ Transparent features.
◼ Morally opaque features, which divide into:
➢ Known features: users are aware of these features but do not realize they have moral implications. Examples can include Web forms and search-engine tools.
➢ Unknown features: users are not even aware of the technological features that have moral implications. Examples might include data-mining technology and Internet cookies.

A Multi-Disciplinary and Multi-Level Method for Cyberethics
◼ Brey's disclosive method is multidisciplinary because it requires the collaboration of:
➢ computer scientists,
➢ philosophers,
➢ social scientists.

A Multi-Disciplinary & Multi-Level Method for Cyberethics (Continued)
◼ Brey's scheme is also multi-level because the method for conducting computer ethics research requires three levels of analysis, i.e., a:
➢ disclosure level,
➢ theoretical level,
➢ application level.

Table 1-3: Three Levels in Brey's Model of Computer Ethics
Disclosive level. Disciplines involved: Computer Science, Social Science (optional). Task/function: disclose embedded features in computer technology that have moral import.
Theoretical level. Disciplines involved: Philosophy. Task/function: test newly disclosed features against standard ethical theories.
Application level. Disciplines involved: Computer Science, Philosophy, Social Science. Task/function: apply standard or newly revised/formulated ethical theories to the issues.

A Three-step Strategy for Approaching Cyberethics Issues
Step 1. Identify a practice involving cybertechnology, or a feature in that technology, that is controversial from a moral perspective.
1a. Disclose any hidden (or opaque) features or issues that have moral implications.
1b. If the ethical issue is descriptive, assess the sociological implications for relevant social institutions and socio-demographic groups.
1c. If the ethical issue is also normative, determine whether there are any specific guidelines (that is, professional codes) that can help you resolve the issue (see Appendixes A-E).
1d. If the normative ethical issues remain, go to Step 2.
Step 2. Analyze the ethical issue by clarifying concepts and situating it in a context.
2a. If a policy vacuum exists, go to Step 2b; otherwise go to Step 3.
2b. Clear up any conceptual muddles involving the policy vacuum and go to Step 3.
Step 3. Deliberate on the ethical issue. The deliberation process requires two stages:
3a. Apply one or more ethical theories (see Chapter 2) to the analysis of the moral issue, and then go to Step 3b.
3b. Justify the position you reached by evaluating it against the rules for logic/critical thinking (see Chapter 3).
(The branching in Steps 1-3 is summarized in the illustrative sketch below.)
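For readers who find it helpful to see the control flow of the strategy laid out explicitly, here is a minimal Python sketch of Steps 1-3. It is only an illustration: the Issue fields and the recorded step descriptions are hypothetical stand-ins for the deliberative work an analyst would actually do, and nothing in it is part of the textbook's own apparatus.

from dataclasses import dataclass

@dataclass
class Issue:
    # Hypothetical flags standing in for the judgments made in Steps 1a-2a.
    description: str
    is_descriptive: bool = False
    is_normative: bool = True
    resolved_by_professional_codes: bool = False
    has_policy_vacuum: bool = False

def three_step_strategy(issue: Issue) -> list:
    """Record, in order, the steps of the strategy that apply to an issue."""
    steps = ["Step 1/1a: identify the practice; disclose opaque features"]
    if issue.is_descriptive:
        steps.append("Step 1b: assess sociological implications")
    if not issue.is_normative:
        return steps
    if issue.resolved_by_professional_codes:
        steps.append("Step 1c: resolved by professional codes")
        return steps
    steps.append("Step 1d: normative issues remain; proceed to Step 2")
    steps.append("Step 2: clarify concepts and situate the issue in context")
    if issue.has_policy_vacuum:
        steps.append("Step 2a/2b: clear up the conceptual muddle behind the policy vacuum")
    steps.append("Step 3a: apply one or more ethical theories")
    steps.append("Step 3b: justify the position via logic/critical thinking")
    return steps

# Example: the software-duplication case (Scenario 1-3) as a rough input.
if __name__ == "__main__":
    scenario_1_3 = Issue("unauthorized duplication of proprietary software",
                         has_policy_vacuum=True)
    for step in three_step_strategy(scenario_1_3):
        print(step)

Tracing Scenario 1-3 through this sketch surfaces the policy vacuum and its conceptual muddle ("What, exactly, is software?") before any ethical theory is applied, mirroring the order of Steps 2 and 3 above.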