TRIAL - TWIML (Artificial Intelligence) Interview With Abeba Birhane - 202309181109
2
00:00:12,360 --> 00:00:14,740
Thank you so much for having me, Sam.
3
00:00:15,400 --> 00:01:13,140
I'm really excited about this conversation. We had an opportunity to meet in person,
after a long while interacting on Twitter, at the most recent NeurIPS conference, in
particular the Black in AI workshop, where you not only presented your paper
Algorithmic Injustices: Towards a Relational Ethics, but you won best paper there, and
so I'm looking forward to digging into that and some other topics. But before we do
that, I would love to hear you share a little bit about your background. And I
will mention, for folks that are hearing the sirens in the background: while I
mentioned that you are from University College Dublin, you happen to be in New York
now at the AIES conference, in association with AAAI, and as folks might know,
it's hard to avoid
4
00:01:13,440 --> 00:01:24,780
sirens and construction in New York City. So just consider that mood ambiance,
background sounds, for your background.
5
00:01:24,880 --> 00:01:25,440
Yes,
6
00:01:25,440 --> 00:01:28,300
yes. How did you get started working in AI ethics?
7
00:01:28,440 --> 00:02:27,740
So my background is in cognitive science, and particularly a part of
cognitive science called embodied cognitive science, which has its roots,
you know, within cybernetics and systems thinking. The idea is to focus on
the social, on the cultural, on the historical, and kind of to view
cognition in continuity with the world, with historical backgrounds and all
that, as opposed to, you know, your traditional approach to cognition,
which treats cognition as something located in the brain, or something
formalizable, something that can be computed. So yeah, that's my background.
Even during my masters I leaned towards, you know, the AI side of cognitive science,
8
00:02:28,360 --> 00:02:49,180
but the more I dove into it, the more I was attracted to the ethics side, to, you
know, injustices, to the social issues. And so the more the PhD goes on, the more I
find myself on the ethics side.
9
00:02:49,800 --> 00:02:58,740
Was there a particular point that you realized that you were really excited about
the ethics part in particular? Or did it just evolve for you?
10
00:02:59,160 --> 00:03:58,960
I think it just evolved. So when I started out, at the end of my masters and at
the start of the PhD, my idea was that, you know, we have this relatively new
school, a way of thinking, which is embodied cognitive science, which I like very
much because it emphasizes, you know, ambiguities and messiness and contingencies,
as opposed to, you know, drawing clean boundaries. And so the idea is, yes, I like
the idea of redefining cognition as something relational, something inherently
social, and something that is continually impacted and influenced by other people
and the technologies we use. So the technology aspect, the technology, was my
interest. So initially the idea is, yes,
11
00:03:58,960 --> 00:04:15,780
technology constitutes an aspect of our cognition. You have the
famous 1998 thesis by Andy Clark and David Chalmers, the extended mind, where they
claimed, you know, the iPhone is
12
00:04:18,279 --> 00:05:15,420
so you can think of it that way. And I was kind of advancing the same line of
thought, but the more I delved into it, the more I saw: yes, digital technology, whether
it's, you know, ubiquitous computing such as face recognition systems on the street,
or your phone, whatever, yes, it does impact and it does continually shape and
reshape our cognition and what it means to exist in the world. But what became more
and more clear to me is that not everybody is impacted equally. The more privileged
you are, the more in control you are as to, you know, what can influence
you and what you can avoid. So that's where I became
13
00:05:15,680 --> 00:05:23,140
more and more involved with the ethics of computation and its impact on
cognition.
14
00:05:23,440 --> 00:05:45,060
The notion of privilege is something that flows throughout the work that you
presented at Black in AI, the Algorithmic Injustices paper, and this idea, this
construct of relational ethics. What is relational ethics, and what are you getting
at with it?
15
00:05:45,120 --> 00:06:44,760
Yeah, so relational ethics is actually not a new thing. A lot of people have
theorized about it and have written about it, but the way I'm approaching it,
the way I'm using it, I guess it kind of springs from this frustration that for
many folks who talk about AI ethics or fairness or justice, most of it comes down to,
you know, constructing this neat formulation of fairness, or a mathematical
calculation of who should be included and who should be excluded, what kind of
data do we need, that sort of stuff. So for me, relational ethics is kind of:
let's leave that for a little bit, and let's zoom out and see the bigger picture.
And instead of using technology to
16
00:06:44,760 --> 00:07:43,500
solve the problems that emerge from technology itself, which means
centering technology, let's instead center the people, especially
people that are disproportionately impacted by, you know, the limitations or the
problems that arise with the development and implementation of technology. So there
is robust research in, you can call it AI fairness or algorithmic injustice, and
the pattern is that the more you are at the bottom of the intersectional
level, that means the further away you are from, you know, your stereotypical
white cisgendered male, the bigger the negative impacts are on you,
whether it's classification or categorization, or whether it's being
17
00:07:43,680 --> 00:08:15,358
scaled and scored by hiring algorithms, or looking for housing, or anything like
that. The more you move away from that stereotypical category, you know, the
status quo, the heavier the impact is on you. So the idea of relational
ethics is kind of to think from that perspective, to take that as a
starting point. So these are the groups, or these are the