
Research-Technology Management

ISSN: (Print) (Online) Journal homepage: https://www.tandfonline.com/loi/urtm20

Ethics for Technologists

A Conversation with Marc Steen

Marc Steen & Jim Euchner

To cite this article: Marc Steen & Jim Euchner (2023) Ethics for Technologists, Research-Technology Management, 66:6, 15-20, DOI: 10.1080/08956308.2023.2253121

To link to this article: https://doi.org/10.1080/08956308.2023.2253121

Published online: 30 Oct 2023.

Conversations

Ethics for Technologists


A Conversation with Marc Steen
Jim Euchner talks with Marc Steen about how technology developers can incorporate ethical issues into their work.

James Euchner [JE]: You have written a book about ethics for technology practitioners. It's a topic of some interest to me. I spent a career developing technology, and I struggled with how to encourage the design of good technology—what I referred to as "technology as if people mattered." As you know, there are many competing challenges for a technology manager: delivering on time, on budget, and with quality are at the forefront. Unfortunately, often there is not a lot of room for incorporating ethical issues into design—or even for discussing them. What motivated you to write about these issues?

Marc Steen [MS]: As you noted, tech managers have many things already on their minds. Even when they would like to, they don't know how to take into account the ethical aspects of technology design. Many people I speak with are interested, but no one finds it easy to do so. I wrote this book to make ethics more accessible, doable, and practical for tech leaders.

JE: I think that you achieved all of those things. Are you yourself a developer?

MS: I work in research and innovation more than in direct development. I work at TNO, which is a semi-governmental organization in the Netherlands. We have about 3,000 people doing all kinds of research and applied research projects—doing research, experimenting with new concepts, building prototypes.

JE: Were there any projects that inspired the book? Were there any particular situations that created a conflict that you were trying to resolve?

MS: There was not a particular project or incident that inspired me. I just developed a growing interest in the subject over time. Especially in the context of AI, I noticed that our customers were increasingly interested in ethics and asking how to do ethics. My interest grew, and I did a PhD mid-career focused on ethics and practical philosophy. The initial concerns were with privacy and bias and transparency. I think it is obvious that these subjects raise legal, ethical, and societal concerns that need to be taken into account. In the Netherlands, a program on Ethical, Legal, and Societal aspects of AI systems (ELSA) has emerged.

JE: What are the most serious issues that you see at the forefront of technology ethics right now?

MS: Ethics plays out on multiple levels. Let's start by distinguishing two levels. One is for those working on an algorithm, doing the development and doing the implementation. For these people, there are very practical questions concerning data collection and the privacy of the people from whom you collect the data. As has been dramatically demonstrated, the data used by machine learning can also create unintended bias issues. We do not want to reproduce existing inequalities and unfairness with our systems, because that's what machine learning will do without intervention: it will take what has happened in history and project that into the future. These are key issues that data scientists or software engineers need to consider.

For those in management who want to deploy a system, the issues are different. A very important approach for management for keeping ethics in front of people is to insist on a feedback loop when the systems are used. A couple of months after starting to use an algorithm, you want to step back and reconsider its effects in the world. Is the system behaving the way that you wanted it to behave, or has it shifted, is it derailing? Transparency is also an important value for managers, so that they can identify when things might not be going as planned.

Marc Steen works as a senior research scientist at TNO, a leading research and technology organization in the Netherlands. He earned MSc, PDEng, and PhD degrees in industrial design engineering at Delft University of Technology. He is an expert in human-centered design, value-sensitive design, responsible innovation, and applied ethics. He is especially interested in the ethics involved in designing and applying algorithms and AI systems. His mission is to promote the design and application of technologies in ways that help to create a just society in which people can live well together. [email protected]

Jim Euchner is editor-in-chief of Research-Technology Management and Honorary Professor at Aston University (UK). He previously held senior management positions in innovation leadership at Goodyear Tire and Rubber Company, Pitney Bowes, and Bell Atlantic. He holds BS and MS degrees in mechanical and aerospace engineering from Cornell and Princeton Universities, respectively, and an MBA from Southern Methodist University. He is the author of Lean Startup in Large Organizations. [email protected]

DOI: 10.1080/08956308.2023.2253121
Copyright © 2023, Innovation Research Interchange. Published by Taylor & Francis. All rights reserved.

Research-Technology Management • November—December 2023 | 15
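The feedback loop Steen recommends for deployed algorithms can be made concrete with a small monitoring sketch. This is an illustrative assumption, not a procedure from the interview: the function names, the group labels, and the 5% tolerance are all invented for the example.

```python
# Sketch of a post-deployment feedback loop: some months after an algorithm
# goes live, step back and check whether its behavior has shifted.
# All names, group labels, and thresholds here are hypothetical.
from collections import defaultdict

def flag_rates_by_group(decisions):
    """decisions: iterable of (group, was_flagged) pairs from the live system."""
    flagged = defaultdict(int)
    total = defaultdict(int)
    for group, was_flagged in decisions:
        total[group] += 1
        if was_flagged:
            flagged[group] += 1
    return {g: flagged[g] / total[g] for g in total}

def drift_report(baseline, current, tolerance=0.05):
    """Compare current flag rates against the rates accepted at launch."""
    report = {}
    for group, base_rate in baseline.items():
        delta = current.get(group, 0.0) - base_rate
        report[group] = "review" if abs(delta) > tolerance else "ok"
    return report
```

In practice the baseline would come from rates reviewed and accepted at launch, and a "review" entry would trigger exactly the step-back conversation described above, rather than any automatic action.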


JE: There are some ethical polarities or paradoxes, as well. Systems in the workplace can be disengaging, for example, while social media systems can be over-engaging—they're almost addictive.

MS: That's another important dimension. Privacy in data collection, fairness, justice, bias in the data, and the transparency or explainability of a system's results are all issues that play out on the level of development and deployment. But I'd like to address issues like the one you raised by zooming out and taking the discussion one step meta from the development perspective. From a distance, we can ask additional questions. Do we need this technology? Do we want this technology? These are questions that can be addressed using Immanuel Kant's Categorical Imperative: if everybody, all the time, everywhere, used this technology, would it make the world a better place? Social media apps like Facebook and Twitter are great arenas to ask these questions. We can envision the ubiquitous use of these apps and ask how they might change society. They change how we talk to each other and the ways we organize our free time.

My education is in industrial design engineering, which emphasized moving back and forth between problem-setting and solution-finding. Design thinking enables you to better grasp the problem and question the implications of the solution. But how can we make room for asking such questions in corporate settings? As a technologist, your focus is on building a system. It's your job; you've got to pay your mortgage. So why would you ask such questions? How do you deal with your own responsibility in such cases? How much agency do you really have within a structure of planning, budgeting, or project management? It can be challenging to make room for such questions in a corporate environment.

JE: It may seem like that when you are in the thick of things. But I think that project managers in particular have more agency. A project manager has a real opportunity to influence how systems are designed at the ground level—though people rarely take advantage of this opportunity. There is often the space to ask, "Is this the right way to design the system in this case?" You may not be able to control the whole definition of the project, but you can cause conversations to happen, and that's a valuable thing in and of itself.

In your book, you lay out a basis for framing conversations. You provide an introduction, in the context of technology development, to four of the most influential ethical traditions. I found your exposition useful and accessible. Given that these traditions form the basis for your recommendations about how to do good technology, I think it would be useful to go through the four traditions. Could you start with the consequentialist tradition?

MS: The first ethical perspective that I discuss is consequentialism. This tradition is rather accessible for people with engineering backgrounds because it deals in a sort of calculus—pluses and minuses of a proposed action. Let's take ChatGPT as an example. On the one hand, it can greatly increase efficiency, for example, of journalists or programmers; for them, it's a plus. On the other hand, this may mean that other journalists or programmers lose their jobs; for them, it's a minus.

I can use ChatGPT to improve my grammar and vocabulary. But on the other hand, it enables those with less benign motives to generate tons of credible fake news and misinformation and flood the Internet with that.

In consequentialist analyses, one can zoom out. ChatGPT is built on data, and those data were acquired somewhere—somewhere between scraped and stolen from the Internet. Very likely, it violates copyrights. Also, it was built with the cheap labor of people in Kenya, with little concern for their working conditions or well-being. They were involved in the tedious job of cleaning the data and flagging content—sometimes content that is horrible to look at.
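The "sort of calculus" described here can be rendered as a toy tally. The stakeholder entries echo the examples in the conversation, but the scores are invented for illustration; the value of the exercise is the inventory of impacts it forces you to write down, not the number at the end.

```python
# Toy consequentialist tally: enumerate stakeholder impacts as pluses and
# minuses. The scores are invented illustrations, not a real measurement.
impacts = [
    ("journalists gaining drafting speed", +1),
    ("journalists losing work to automation", -1),
    ("readers flooded with credible fake news", -2),
    ("authors whose text was scraped without consent", -1),
    ("data workers exposed to harmful content", -2),
]

total = sum(score for _, score in impacts)
for description, score in impacts:
    print(f"{score:+d}  {description}")
print("net:", total)
```

As the conversation notes, such an analysis rarely yields a crisp answer; each line of the inventory is better read as a question for further inquiry and deliberation.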



Overall, there are pluses and minuses. That is often the case. Many ethical analyses do not immediately give crisp answers, but they help to raise more precise questions for further inquiry and deliberation.

I've branded the process of using these four different ethical frameworks Rapid Ethical Deliberation. With a couple of people around the table and in two hours, you can begin to consider critical ethical issues.

JE: Two hours may not sound like much, but those two hours are a place where thinking about ethical issues starts. Once you start the thinking, it continues. It is critical simply to start.

Some people say that there is a certain inevitability about any technology. That's what many people say about ChatGPT, despite dire predictions about its potential impact. You can think about the consequences of large language models, but there's probably no stopping their advance. That's true of ChatGPT, of artificial intelligence in general, of robotics—and perhaps of any technology.

MS: Possibly. But there is also legislation, which helps to shape the use of technologies. I am from the Netherlands, and the EU seems to move more quickly in developing legislation around new technologies than the US does. An upcoming AI law of the European Union will have categories of use that are forbidden by law. That, by the way, leads us into the second ethical tradition I discuss in the book: duty ethics.

Duty ethics deals with duties and rights. On the one hand, companies that develop ChatGPT-like applications, or companies that deploy these, have obligations. On the other hand, the people who use ChatGPT have rights—for example, human rights like privacy, dignity, and autonomy, which need to be respected.

There are also other rights—for example, socio-economic or cultural rights, like the right to have meaningful work, which can be corroded by an app like ChatGPT. Evan Selinger and Brett Frischmann wrote about a Reverse Turing Test. Instead of the normal Turing Test, where the computer imitates a person, they discussed how, in some applications of technology, people are required to behave increasingly machine-like, often unconsciously, often unintentionally.

JE: Interesting! Duty ethics gets back to the categorical imperative. Another way of stating that is, "Would you want to live in the world you're creating?" If not, what is your duty to prevent that world from happening?

MS: Exactly.

JE: If you're automating, say, an operator services workstation, and you know that the monitoring, surveillance, and time compression that the technology permits may make the lives of hundreds or thousands of people just a little more miserable, what is your duty to do something different?

MS: Yes, indeed. You can consider these duties and rights in terms of obligations, legislation, or human rights. But I think the point you're raising also makes sense. We also have duties regarding human dignity and human autonomy. Imagine your system being used widely. Does it promote the dignity and autonomy of the people who use it, or of the people at the receiving end—for example, of the people who are flagged by an algorithm? Especially if they are incorrectly flagged, for example, as committing fraud.

JE: I've been in situations like this. I spent about a third of my career developing systems to support operations in a telco, and I have two observations about that context. On the one hand, nobody cares or is even aware of whether or not a given system is fragmenting or down-skilling work. The focus is on economics. But on the other hand, there can be surprising flexibility to design the systems in a way that you think best, as long as you deliver the project's objectives. So a project leader has a lot of agency. How do you spread the word with that group of people? If you want to have an impact on the world by bringing these ethical issues to the fore, you have to influence the guy working in the telco or the automotive company responsible for systems design.

MS: You are right. Just anecdotally, I would guess that the US has a tradition in which it's more accepted to focus on short-term profits, whereas in Europe there is more consideration of workers' rights, environmental law, and social responsibility.

JE: I have noticed that, as well. Let's talk about the third ethical tradition, which you label relational ethics.

MS: I use the term relational ethics as an umbrella term. It includes care ethics and feminist ethics. It is a reaction to the ethos of the European Enlightenment, which purports to be rational, abstract, and objective. Relational ethics challenges such claims or ambitions. It is not always right to be totally objective; we need to take into account the relationships that matter in any situation.

Here is an example from the American philosopher Virginia Held. Imagine a judge in the courtroom; this judge focuses on justice, but if they have no care for the situation of the offender or of the victim, they may not achieve justice. Now imagine family members living together; they need to take care of each other, but they also need to be concerned with justice, for example, with equal rights for different people. Justice and care need to go hand-in-hand.

I turn to relational ethics to focus on how technology can modify, either for the good or for the bad, how we interact with each other. You and I are currently in a Zoom call, which is great. But if all I did were to talk to people via Zoom, it would corrode human conversation. Relational ethics also questions power, power imbalances, and the distribution of power. These are also very much of concern in the design of technology. For example, if AI ends up being owned by five large US companies and five large Chinese companies, that will not be desirable because they will have too much power; that will lead to neither care nor justice.

JE: If you look at ChatGPT through the relational lens, what do you see?

MS: Do you remember ELIZA, the Natural Language Processing program written by Joseph Weizenbaum? It was only a couple of hundred lines of code as compared to the billions of parameters in ChatGPT, so it was relatively simple, but people believed that it was able to understand and empathize with them. That effect is even more prevalent today. People easily believe ChatGPT's output. It sounds great, it has good grammar and an excellent vocabulary. It's always fluent. But it can be totally untrue.

Today, a lot of communication and interaction between people is via text. If one side of that conversation is replaced by a lying app, it creates ethical issues that are relational. In his book Computer Power and Human Reason: From Judgment to Calculation, Joseph Weizenbaum proposes to delegate to computers tasks that have to do with computation, because that is what they are good at, and not to delegate to computers tasks that require value judgments. Leave those to people.

This would translate into advice not to leave the task of evaluating whether somebody is committing fraud to a computer. Many factors and concerns need to be taken into account in assessing whether somebody committed fraud and in deciding on appropriate action.

Let's cover the fourth ethical tradition that I discuss in the book: virtue ethics. I'm very much inspired by the work of Shannon Vallor. She pioneered the use of virtue ethics in the domain of technology, and her argument is that consequentialism and duty ethics don't work for emerging technologies: you don't have clarity yet about the pluses and minuses, and there are no clear ethical rules about our duties and responsibilities. She proposes in these circumstances to turn to virtue ethics because it encourages thinking about the good life, about organizing a society in which people can flourish and can live well together. It asks what virtues are at stake and whether the emerging technology will help or hinder us in cultivating relevant virtues, like justice, self-control, and honesty.

JE: In many ways, that is Judeo-Christian ethics, right? It's based on the cardinal virtues and the good, the true, and the beautiful.

MS: Yes, it is possibly similar to Judeo-Christian thinking.

Consider, for example, social media. The business model is typically advertising, which means that the goal is to grab people's attention, hold their attention as long as possible, and monetize that attention. Everything about the app is constructed in a way that enables this—the appearance, the affordances, how you use it, the "like" button, the "retweet" button, etc. This tends to both undermine self-control and amplify fake news.

Imagine a different social media app, one that seeks to avoid addiction. That app might ask me how many minutes a day I want to spend in a given activity. After two minutes it might beep and remind me that my time is up. It would force me to make a conscious decision before spending more time. That way, I can cultivate self-control. Before I retweeted something, it might ask, "Have you read this? Do you want to read it before you retweet it?" That would help to cultivate honesty.

JE: It can make a big difference if you just ask someone whether they are sure that they want to send something, that it represents who they want to be.

MS: Yes. Another way of saying this is that virtue ethics asks whether a technology enables a user to be a good person, to become the best version of themselves. It applies on two levels. The first is how the technology will be used by people—whether it will help them or hinder them in cultivating virtues. The second is at the level of the developer or project manager. Does it help people in those roles to become better versions of themselves, in their professional capacity?

JE: Let's turn now to practice. In your book, you write about "doing ethics," and you recommend a three-step process. The first step is reflection and identifying potential issues; the second is dialogue and inquiry; and the third is action and evaluation.

MS: This is the Rapid Ethical Deliberation that I mentioned earlier. You bring together your project team, possibly also a client, a supplier, or some stakeholder from society. In step one, they discuss potential issues and identify topics that they then need to be concerned with. In step two, they talk about these issues and topics, using the four ethical perspectives that we discussed. From experience, I noticed that people need three or four questions for each of these four perspectives to get the conversation going. If they are given a structure and a vocabulary, people can do this. In step three, people articulate tentative conclusions and apply these in the project; this way, they can keep the project moving and evaluate in practice the measures they take.

JE: Let's take an example. Years ago, I helped develop an expert system to replicate the expertise of maintenance advisors in a telco. Our system automated someone's job, but it saved the company millions of dollars a year and improved customer service. How would you approach something like that?

MS: That's a big project, and the ethics get more difficult the larger the stakes.

I would start by following the Rapid Ethical Deliberation process. Raise the issues; discuss them using the four frameworks; and then act. If you have your concerns on the table, if you've looked at the pluses and minuses and the other perspectives, and you've given it a bit of time to make sure you didn't miss anything, then it is important to act. Remember that this is iterative, so you can assess your intervention in the system and pivot if necessary.

Of course, if the project manager thinks their focus is only on budget and timely delivery, then there is little room for such reflection.

JE: It's hard to find the language to introduce this. I have found, for example, that developing systems using sociotechnical principles results in better systems. You actually get more efficiency, happier employees, and less turnover. It's more cost effective. But most people don't believe this is true because they have a mental model about how people will behave in the world and how technology behaves in the world that is not realistic. It doesn't deal with the realities of the way work happens in the real world. But I haven't found the language or the economic framework that breaks through the current mindset.

MS: I totally agree, but there's a continuum. Interest is growing, and I have a hope that it will become increasingly normal to take ethical and societal aspects into account when building something new. Things like the General Data Protection Regulation, or GDPR, which was introduced to protect people's privacy, may legitimize a broader discussion and make some of these issues more discussable.

JE: Let's turn to the checklists at the end of your book. The questions on the checklist are the way that you would facilitate Rapid Ethical Deliberation. Is that right? Can a team do this by itself, or is it helpful to have someone facilitating who's asking the questions, listening, and directing?

MS: My experience gives me real hope and reassurance that you don't need a lot of training to do this, and you certainly don't need to be a philosopher. What you need is people with good intentions and a bit of time. If the project manager is on board and makes this part of the work, people can do this.

JE: That's the crux: the perspective and courage of the project manager.

I want to make sure that we cover some of the specific methods that you recommend to practitioners. The first is human-centered design, which makes a whole lot of sense for a variety of reasons beyond ethics. What's different about human-centered design with this ethical lens and human-centered design as it's normally practiced?

MS: I don't think that they're the same thing, but there are common elements. If you're doing human-centered design, you're already doing iteration, you're already practicing participative design, you already have a deep interest in people's activities and their experiences. It is relatively easy to add on top of that a couple of questions about ethics.

JE: Can you say more about how this plays out in practice?

MS: Well, my background is in human-centered design, so my ideas about how to "do ethics" were informed by the ideas of iteration and participation that are inherent in human-centered design. The method arose from my experience with human-centered design.

There is actually one step before the three steps. First, I ask the people involved to describe the system they work on in very practical terms: how will people use it, whom will it affect? A fraud detection system will be used by a civil servant. Citizens may encounter the system's outputs. And what happens when something goes awry? Typically, one person from the project team prepares a short presentation. Sometimes, they will say that they don't know yet because the system hasn't yet been developed.

From the perspective of human-centered design, that's not a problem. In fact, it's a benefit. They can use what they learn in further development. They can make a drawing, a sketch. As they used to say at design agency IDEO: "Never Go to a Meeting Without a Prototype." So, to make the three steps productive, we start with putting something on the table to discuss.

JE: Very good point. The second approach that you discuss, after human-centered design, is value-sensitive design. People come at the world with different sets of values. How do you do value-sensitive design when people sitting at the table may come from different cultural heritages, political orientations, professions, and so on? Some will value growth over fairness. Some will value inclusion over rigor. Their values will be different. How do you manage that?

MS: It's a great question. On the one hand, it can be hard to find common ground if people are different; on the other hand, it would cause all sorts of problems if they were all the same.

I'd like to say two things about value-sensitive design. One is that people who have heard about it often assume it's about value tensions, tensions between opposing values. And it can be. But we can engage with values differently than in terms of opposites. Take cameras in public places. They're meant to promote security, but they can also infringe upon people's privacy. There is a tension between security and privacy. This may lead one to believe that they are opposites: we can either have security or privacy.

But creativity can resolve such a tension. We can turn to data minimization, for example. You can have rules for how to store the data and who can look at the data. You can blur people's faces. We can find creative ways to combine seemingly conflicting values.
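Data minimization, the creative resolution mentioned for the security-versus-privacy tension, can be sketched in a few lines. The record fields, the purpose set, and the pseudonymization scheme here are hypothetical illustrations, not anything prescribed in the conversation.

```python
# Sketch of data minimization: keep only what the stated purpose needs,
# and pseudonymize or drop the rest. Field names are hypothetical.
import hashlib

def minimize(record, purpose_fields):
    """Keep only the fields the purpose requires; pseudonymize the identifier."""
    kept = {k: v for k, v in record.items() if k in purpose_fields}
    if "person_id" in record:
        # One-way hash so records can be linked without storing the raw ID.
        kept["person_ref"] = hashlib.sha256(record["person_id"].encode()).hexdigest()[:12]
    return kept

record = {
    "person_id": "NL-123",
    "face_image": b"...",
    "timestamp": "2023-10-30T12:00",
    "location": "station-west",
}
stored = minimize(record, purpose_fields={"timestamp", "location"})
# The face image and the raw identifier never reach storage.
```

Storage rules and access controls would layer on top of this; the point is that serving the stated purpose does not require retaining the raw identifier or the image at all.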


There are methods in creative problem-solving and in group facilitation to create safe spaces, encourage minority voices, etc.

JE: I experienced this in developing systems for secure voting by mail. One group values access to voting and another values security. How do we reconcile those two? If you don't go to the level of values, then everybody's just at each other's throats, assuming the worst about the other person's motives. But if you step back, you can have a constructive conversation.

MS: That's also a technique in conflict resolution. You start by finding the level at which the two parties might agree and go from there. Both those in favor of vaccinations for COVID-19 and those opposed to vaccinations value health, for example. That concern for health is where the conversation can start.

Putting diverse people around the table, having the topic in the middle of the table, and having the people talk about the topic is valuable in any case—but it's not easy. There are issues of power, issues of exclusion and inclusion—who's invited, who's excluded intentionally, who's excluded unintentionally. There are also issues about how different people voice their concerns. Is this a place where we are respectful, or does the conversation go quickly into conflict?

This is one of the hardest bits in doing ethics. I've done sessions like this with people who already trusted each other, but it can be difficult if people from different organizations or with different interests are around the table.

JE: Are there communities of engineers and computer scientists who are thinking about this and trying to put it into practice?

MS: It's different in different places. I am happy to be living and working in the Netherlands. Our universities of technology—Delft, Eindhoven, Twente, Wageningen—include ethics in their curriculum. They use, for example, the book by Ibo van de Poel and Lambèr Royakkers.

There are also communities in the US, like All Tech is Human, and professional societies like IEEE and ACM, the Association for Computing Machinery.

JE: More work on the ground is certainly needed. The world is spinning very fast, and it's hard for technologists to keep up with the new issues that are created. I think your book provides a grounding that will be helpful to many.

Disclosure statement
No potential conflict of interest was reported by the author(s).

References
All Tech is Human. 2023. https://alltechishuman.org
Held, V. 2005. The Ethics of Care: Personal, Political, and Global. Oxford: Oxford University Press.
Kelley, T., and Kelley, D. 2013. Why Designers Should Never Go to a Meeting Without a Prototype. Slate. https://slate.com/human-interest/2013/10/the-importance-of-prototyping-creative-confidence-by-tom-and-david-kelley.html
Selinger, E., and Frischmann, B. 2018. Re-Engineering Humanity. Cambridge: Cambridge University Press.
Steen, M. 2022. Ethics for People Who Work in Tech. Boca Raton, FL: CRC Press.
Vallor, S. 2016. Technology and the Virtues: A Philosophical Guide to a Future Worth Wanting. Oxford: Oxford University Press.
Van de Poel, I., and Royakkers, L. 2011. Ethics, Technology, and Engineering: An Introduction. Chichester, UK: Wiley.
Weizenbaum, J. 1976. Computer Power and Human Reason: From Judgment to Calculation. New York: W. H. Freeman & Co.
