Redefining Society Podcast

Book | Ethics for People Who Work in Tech | A Conversation with Author Marc Steen | Redefining Society with Marco Ciappelli

Episode Summary

Welcome to "Redefining Society Podcast," where today we're exploring the intersection of ethics and technology with Marc Steen, author of "Ethics for People Who Work in Tech.”

Episode Notes

Guest: Marc Steen, Senior Research Scientist at TNO [@TNO_nieuws]

On LinkedIn | https://www.linkedin.com/in/marcsteen/

On Twitter | https://twitter.com/marcsteen

On Mastodon | https://mastodon.social/@marcsteen

Website | https://marcsteen.nl/index.html

____________________________

Host: Marco Ciappelli, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining Society Podcast

On ITSPmagazine | https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/marco-ciappelli
_____________________________

This Episode’s Sponsors

BlackCloak 👉 https://itspm.ag/itspbcweb

Bugcrowd 👉 https://itspm.ag/itspbgcweb

Devo 👉 https://itspm.ag/itspdvweb

_____________________________

Episode Introduction

Welcome to another episode of "Redefining Society Podcast," where we muse on society and technology at their intersecting point, exploring how one influences and changes the other. Today, we are diving into a topic that is not just crucial but also ethically challenging—how those who work in the tech industry can and should approach ethics in their daily practice.

Our guide in this intellectual journey is Marc Steen, a senior research scientist at TNO in The Netherlands. Marc is an expert in Human-Centred Design and Value-Sensitive Design, with a strong focus on the responsible innovation and applied ethics of technology. He asks the questions that many of us perhaps hesitate to ask—especially when it comes to the ethics behind algorithms and AI systems. With an extensive background in both scholarly and popular writing, Marc aims to push the boundaries of ethical discourse in technology.

Now, why is this important? As we engage more and more with technology, its repercussions echo louder and louder through the hallways of our societal norms, affecting our collective values, freedoms, and even democracy itself. So, when we talk about tech, we're not just talking about a tool or a service; we're discussing a societal force that shapes our daily lives.

Marc's recent book, "Ethics for People Who Work in Tech," serves as a roadmap for professionals in the tech industry. Whether you are a computer scientist, a software developer, or even someone involved in policy-making, this book aims to empower you to think ethically. It provides a practical, three-step iterative approach for integrating ethics into projects and uses four distinct ethical perspectives to evaluate outcomes. The ultimate goal? To design and use technologies that contribute to a just society—a society where people can truly live well together.

Today, we are going to explore these ideas deeply. We'll question the uneasy aspects of technology that most shy away from. We'll explore how we can achieve a state where innovation isn't just about creating the new and the powerful, but also about ensuring fairness, freedom, and communal well-being.

So sit back, and let's embark on this philosophical inquiry into the ethics of tech. Because, if technology is shaping our society, it's our moral imperative to ensure that it shapes it for the better. Stay with us.

_____________________________

About the Book

This book is for people who work in the tech industry—computer and data scientists, software developers and engineers, designers, and people in business, marketing, or management roles. It is also for people who are involved in the procurement and deployment of advanced applications, algorithms, and AI systems, and in policy making. Together, they create the digital products, services, and systems that shape our societies and daily lives. The book’s aim is to empower people to take responsibility, to ‘upgrade’ their skills for ethical reflection, inquiry, and deliberation. It introduces ethics in an accessible manner with practical examples, outlines of different ethical traditions, and practice-oriented methods.

_____________________________

Resources

Ethics for People Who Work in Tech (Book): https://www.routledge.com/Ethics-for-People-Who-Work-in-Tech/Steen/p/book/9780367542436

Ethics for People Who Work in Tech (Website that accompanies book): https://ethicsforpeoplewhoworkintech.com/

Ethics as a Participatory and Iterative Process: https://cacm.acm.org/magazines/2023/5/272289-ethics-as-a-participatory-and-iterative-process/fulltext

____________________________

To see and hear more Redefining Society stories on ITSPmagazine, visit:
https://www.itspmagazine.com/redefining-society-podcast

Watch the webcast version on-demand on YouTube: https://www.youtube.com/playlist?list=PLnYu0psdcllTUoWMGGQHlGVZA575VtGr9

Are you interested in sponsoring an ITSPmagazine Channel?
👉 https://www.itspmagazine.com/advertise-on-itspmagazine-podcast

Episode Transcription

Please note that this transcript was created using AI technology and may contain inaccuracies or deviations from the original audio file. The transcript is provided for informational purposes only and should not be relied upon as a substitute for the original recording as errors may exist. At this time we provide it “as it is” and we hope it can be useful for our audience.

_________________________________________

[00:00:00] Marco Ciappelli: Hello, this is Marco Ciappelli. Welcome to another episode of the Redefining Society Podcast. This is where, as you know, we muse and reflect on technology and the way that we interact with it. You can think about society and technology, or technology and society; I don't care, it's the same thing, we meet in the middle. But I think that if I have to start with one side, I will start with society.
 

I'll give it a little bit more of the relevance when we have this conversation, and eventually, you know, we meet in the middle. Today I'm actually really excited, because we're going to talk about a book called Ethics for People Who Work in Tech. And we're going to do that with the author, Marc Steen, all the way from Europe.
 

And I'm all the way in LA. This is just an audio podcast, so in order to prove that Marc is real and is here with me, we're going to have him introduce himself and say hello to everybody. And I'm sure it's going to be a fantastic conversation, in line with Redefining Society.
 

Welcome, Marc.
 

[00:01:12] Marc Steen: Thanks for the invitation, Marco. Yeah, my name is Marc Steen. I live and work in the Netherlands. I work as a Senior Research Scientist at TNO, which is a large, independent, government-funded research and technology organization in the Netherlands. And for the last 10 years or so, I've increasingly specialized in asking questions about the ethics involved in the creation, the development, and the deployment of technologies, with a focus on digital technologies, big data, algorithms, and what they currently tend to call Artificial Intelligence, AI.
 

[00:01:58] Marco Ciappelli: Do we agree on the "intelligence"? We want to start with that.
 

[00:02:02] Marc Steen: No, yeah. Non-verbally, I was hinting that it's a silly term. I would rather say machines doing, yeah, calculations. And then we can use them as tools, and then they can do useful stuff, or less useful stuff, for us.
 

So I regard technology always as tools. And even more, I often look at them from the perspective of ethics, where you can think of technology as tools that can either help people to live well together or hinder people from living well together. And it's not as black and white as that, but it enables a conversation on looking at technology always as tools, potentially for enhancing human capabilities.
 

And yeah, AI as a white plastic robot that does things on its own? No, no. It's nice for science fiction movies, but it's not very realistic.
 

[00:02:59] Marco Ciappelli: Okay, now we're going to hear the voice, the metallic voice in the back, saying, "Dave, I can't do that." So, I like to start with the idea of intelligence, because I feel like you set up this expectation.
 

It's so high when you talk about intelligence. I think a lot of people don't even know how to define intelligence in general when it applies to humans and/or animals, right? You know, are they intelligent? Are they intelligent in a different way? So, to attribute that to a machine...
 

Yeah, great, but it seems a little too much in terms of expectation, and then it's easy to say, well, not really that intelligent. So I would like to know, first of all, let's start with: why did you feel the need to write this book? I'm assuming it's based on your experience working in technology for a long time.
 

[00:03:57] Marc Steen: Yeah, exactly. I've worked for 25 years by now in various roles, first as a designer, then as a researcher, as a project manager, always in, as I mentioned, digital technologies, virtual communities, applications, algorithms, really a host of very various projects. And increasingly I was focusing on ethical aspects.
 

You can think of privacy, but obviously also other topics, like fairness or human autonomy. And then my co-workers and partners and clients, once they knew that I was like the ethics person, or an ethics person, would often ask: Hey Marc, can you help us with this project? Can you see whether it is ethical or not?
 

And then I quickly found out that's not a very fruitful path for me, me judging whether they're ethical or not. It's a silly task; I don't do that. So instead I presented ethics as a process that they go through, either implicitly or, as I would prefer, explicitly. So I help them with a vocabulary, a framework, for taking into account ethical aspects in the development and deployment of technologies.
 

So the need was really there. Many, many professionals, if I give lectures or talks, I ask: who of you thinks that ethics is important in your projects? Well, all the hands go up. And a second question: who of you thinks that it is easy to take into account, to integrate, ethics in your projects? All the hands go down. So yeah, this is the gap where my book steps in. There are other books, more academic ones.
 

Mine, I think, and this is also from the reviews, is rather accessible. I tried to explain four major ethical perspectives in a practical manner, with lots of examples. So yeah, it's helpful for people who work in tech who want to, but don't know exactly how to. It's a very practical book.
 

[00:06:17] Marco Ciappelli: Yeah, and as we keep the conversation going, I definitely want to get into some examples, the steps, you know, the tips that you give in order to follow this method.
 

But I want to hold on a little bit to this point about people who work in tech: they want to be ethical, but most of the time they don't know how to apply it. And I'm not saying that they're not ethical people; I'm hoping we're all somehow and somewhat ethical. But do you think that what they call the liberal arts, or the social sciences, like sociology, philosophy, political science, and so forth, should maybe at this point be built into the education of technologists, of people who work in tech, so that they have some fundamental base to go do their job in this world, where technology is not just technology in a silo?
 

[00:07:27] Marc Steen: Yeah, I would certainly recommend it for people who are being educated to become computer scientists, data scientists, software engineers, programmers, project managers, this whole group of people who help to create this digital world, these digital applications.
 

If they learn more about ethics and also the other topics you mentioned, like sociology, even political science, I mean, it's helpful if you just have a broader view on it. And, as I mentioned, I live in the Netherlands, and we have four large universities of technology: Delft, Eindhoven, Twente, and Wageningen.
 

And all of them have in their second year a course in applied ethics that is obligatory for all. So where I come from, it's normal to have that as part of the education, but I've heard that it's not so in all other countries.
 

[00:08:28] Marco Ciappelli: Yeah, that's why I asked you, because I'm sure different countries probably prepare their academic curricula in different ways.
 

But I also feel that nowadays, with all these changes, many education systems are, at least I hope, adapting. And, you know, it's definitely needed. I always joke about people who have studied philosophy: in the past, even when I graduated many years ago, it was kind of like, why do you study that?
 

Right. And I would be like, yeah, well, maybe I'll go teach. Maybe I'll use it in communicating with people, in advertising, like I did. But nobody thought, oh, I'm going to go and use it in computer science. And now we do.
 

[00:09:23] Marc Steen: Yeah. It's increasingly necessary. Technologies like social media are so ubiquitous and they have such enormous impacts. They help to, no, "help": they can enable people to skew and rig elections. So yeah, these implications are huge, and real. Social science is needed to understand those processes. Ethics is needed, political science.
 

And to come back to your question: I'm not aware of all the other countries and their curricula and their universities of technology, but I would guess that the Netherlands is not alone in this. There must be many, many other countries increasingly implementing ethics in their core courses.
 

[00:10:11] Marco Ciappelli: Right. But let's talk about your book. You say you have many examples, a system that, I am sure, will help even people who do have a certain background in social science, because, you know, it's never enough; it's certainly not an exact science, like maybe other kinds of science are. How did you approach that?
 

So, how do you even introduce the topic of ethics to people?
 

[00:10:41] Marc Steen: Yeah, that's a nice question. I do two things. The first thing is, I describe ethics, I present ethics, as a process. So, the anecdote that I just told: Hey Marc, can you help us make our project more ethical?
 

That's like seeing ethics as a checkbox, or a checklist, or a rubber stamp that you need in order to proceed, or a roadblock that you must cross so the innovation can continue. Instead of that, I present the metaphor of the steering wheel. So imagine your project as a vehicle.
 

You want your vehicle, your project, to go safely from A to B: not hit the side of the road, not hit trees, stay on your side of the road, no accidents, take the correct off-ramp. That's how you can use ethics as a steering wheel: to keep your project on track and not have it derail, or have a collision, or crash.
 

And secondly, and that's a bit more abstract, it's not a real metaphor: ethics as a process. This also has to do with my background in industrial design engineering in Delft, which approaches design and innovation very much as a process that you can, to some extent, manage, and certainly coordinate and facilitate.
 

And it's an iterative process that I'm proposing. First, you put your project on the table, you look at it, you imagine, you envision the outputs, what you want to achieve in the world. And then you identify issues. That's step one: what could go wrong, or what could go horribly well? I mean, that's also something; I gave the example of social media.
 

Now everybody, all the time, everywhere, is using social media. So yeah, that can also go wrong, or have implications. So that's step one: identify issues, or risks, or things that you want to look at. Secondly, organize conversations, dialogues, about these: first with the project team, but also with a potential user, or somebody who will experience the effects of it.
 

If it is an algorithm that aims to detect fraud, you can invite a person who has been pointed at as committing fraud, but where it was, for example, a false positive, so that person in fact didn't commit fraud. It's very useful to hear such a person's experiences. Were you able to complain?
 

Were you able to correct it? Were you able to look into how the algorithm came to this wrong conclusion? Thirdly, and that's also an important step, and it's the opposite of the philosopher sitting in their armchair: do something. Keep your project going, make a decision, either design option A or design option B, this feature or that feature. And then, importantly, do some kind of experiment or monitoring that enables you to evaluate. So: issues, conversations, and action. That, I think, is an iterative process that you can repeat every three months or three weeks, depending on the way that you organize your projects, right?
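For readers who think in code, here is one way to picture those three steps as a loop. This is a minimal, illustrative sketch in Python; the class and field names are our own invention, not something from the book or from TNO.

```python
# A minimal, illustrative sketch of the iterative ethics process Marc describes:
# identify issues, organize conversations, take action and evaluate, then repeat.
from dataclasses import dataclass, field

@dataclass
class EthicsCycle:
    project: str
    issues: list = field(default_factory=list)         # step 1: what could go wrong, or horribly well?
    conversations: list = field(default_factory=list)  # step 2: dialogues with the team and affected people
    actions: list = field(default_factory=list)        # step 3: decide, then experiment or monitor to evaluate

    def report(self) -> None:
        """Print one pass of the cycle; Marc suggests repeating it every few weeks or months."""
        print(f"Ethics review for: {self.project}")
        for step, items in [("Issues", self.issues),
                            ("Conversations", self.conversations),
                            ("Actions", self.actions)]:
            print(f"  {step}:")
            for item in items:
                print(f"    - {item}")

# One pass of the cycle, using the fraud-detection example from the conversation.
cycle = EthicsCycle(
    project="Fraud-detection algorithm",
    issues=["False positives harm people who did nothing wrong"],
    conversations=["Invite someone who was wrongly flagged to describe the experience"],
    actions=["Choose design option B; monitor appeal rates to evaluate the choice"],
)
cycle.report()
```

The point of the sketch is the shape of the loop: each pass produces decisions and monitoring data that feed the next pass's list of issues.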
 

[00:14:11] Marco Ciappelli: I like the idea of keeping it alive, because things change, right? Something you may have thought was ethical 20, 30, 50 years ago may certainly not be considered ethical today. Society evolves. Sometimes change goes a little bit backward.
 

You know, we see that sometimes; it keeps moving forward. And I think that what we really need to pay attention to, and I like to use this word when we have these conversations, is the unexpected consequences, right? Because you were using the example of: Marc, can you come here and tell me if this is ethical?
 

And you're like, well, that's a big responsibility to put on one person. It's more of a collective responsibility, I think: keep the mind open and say, look, we do have biases, we do think in a certain way, and what may be ethical for you may be unethical in another culture.
 

So even the definition of ethics is not an easy one. To keep it open like this, I think, is a great concept. But you said something before we started recording that I really liked, the other approach of zooming in and zooming out. I would love for you to talk about that approach.
 

[00:15:36] Marc Steen: Yeah. Systems thinking underlies the book that I wrote.
 

There are various ways to do systems thinking. The way that I do it is rather practical. Say I find myself in a meeting, they've invited me to think with them, or I'm a project team member, and at the moment all of us, all of them, are talking about some detail in the algorithm, in the user interface, in the product. I can then ask a question that zooms out: so, let's imagine this algorithm being implemented, having effects in the real world, societal effects, effects, in this example of fraud detection, on a person's daily life. So then I zoom out. If everybody zoomed in, I zoom out.
 

But I can also do it the other way around. Imagine the same project, but a while later, a different meeting, and we're talking, or they're talking, about fairness and equality in rather abstract terms. I will then ask a question like: Hey, what was this user interface again? This operator who gets a list of names, and there's a red flag next to this person's name because the algorithm thinks that person was fraudulent: does this operator have the ability to push back on the algorithm? To not follow, well, the orders of the algorithm, so to say, to not just follow up, but to question it, to push back?
 

So then everybody zoomed out and I zoom in. And I think that's a nice way of showing the connections, systems thinking, you may say, between what you create on a micro scale and the macro effects, and the other way around: how macro effects hang together with seeming details. Fairness, justice, equality, bias: these are terms that people often use, but, like you were saying, values are not static. They change. Yeah, sure. And also, if people talk about fairness or bias or equality, they may think of it as a snapshot in time, but there's always a time dimension. So what happens, time-wise, if I get a person calling me up, inquiring whether I did something fraudulent?
 

How can I then, time-wise, react to that? Do I have the ability to respond, to correct, to question the algorithm? So that's procedural justice, not only material justice. And yeah, that's something that I often do: imagine how things happen over time, whether people have actual, effective abilities to correct, to push back if the algorithm doesn't go well. Yeah, I'm just realizing that I often give examples of when the algorithm fails, because, well, those are more interesting examples than when the algorithm just does its job correctly.
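To make that zoom-in concrete, here is a minimal sketch of the kind of operator push-back Marc describes, where the algorithm's red flag is a contestable recommendation rather than an order. All names and fields are hypothetical, invented for illustration; no real fraud-detection system is being described.

```python
# Hypothetical sketch: an algorithmic fraud flag that a human operator can
# question and overrule, with a recorded reason, supporting procedural justice.
from dataclasses import dataclass
from typing import Optional

@dataclass
class FraudFlag:
    case_id: str
    algorithm_score: float                   # the algorithm's confidence that this case is fraud
    operator_decision: Optional[str] = None  # "confirm" or "reject"; None while pending
    reason: Optional[str] = None             # a recorded reason makes the decision auditable

def review(flag: FraudFlag, decision: str, reason: str) -> FraudFlag:
    """The operator, not the algorithm, makes the final call, and must say why."""
    if decision not in ("confirm", "reject"):
        raise ValueError("Decision must be 'confirm' or 'reject'.")
    if not reason:
        raise ValueError("A human decision needs a recorded reason.")
    flag.operator_decision = decision
    flag.reason = reason
    return flag

flag = FraudFlag(case_id="2023-0417", algorithm_score=0.91)
# The operator zooms in on the case and pushes back on the red flag:
review(flag, "reject", "Pattern explained by a duplicate payment; no fraud on file.")
print(flag)
```

The design choice the sketch illustrates is the one Marc's question probes: the interface leaves room for the human to disagree, and keeps a trace of why, so the flagged person can later complain, correct, and question the outcome.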
 

[00:18:50] Marco Ciappelli: So I want to run something by you. We started with artificial intelligence and the word intelligence; "artificial" is kind of funny to me too, but let's not go there. Generative AI is made by ingesting, harvesting, all this information. And, you know, there are ethics in that process as well, from copyright to many other things. But in general, the way I think about it is that AI is ultimately human technology.
 

It's human; the algorithm is human, because it's not just technology that comes from nowhere. It's the consequence of our thinking. And I want to be positive: only a few people, maybe, wake up in the morning and say, I'm going to be evil, I'm going to do something really bad today. Sure, maybe there are villains in the world, but in general it happens by mistake, by not thinking, because it's hard to think by zooming out and zooming in.
 

Putting yourself in one perspective, then in another. So I think a system is necessary. How solid is that system? I don't know. I'm just throwing this out there.
 

[00:20:21] Marc Steen: Let me pick up on your proposition that most of us are not evil. I totally agree with that. In the first chapter or so of the book, I write that the assumption underlying the book is that most of us, all of us, have the ability, even the motivation, to do good: to do good to others, to do good to the world.
 

And then I've had reviews of and responses to my book where people, and I think they had a good argument, said: suppose I read your book and I understood most of it. Now I have a vocabulary, I have a framework, I have four perspectives. I can do ethics in this process manner: identify issues, talk about them with the appropriate people, and then action and iteration.
 

But then there's also the context in which I work. If I work for a stock-owned company, then of course the short-term quarterly financial results are the main concern.
 

And then you will also typically have, again the example of social media, a business model based on grabbing people's attention, holding people's attention as long as possible. A bit of polarization, a bit of, yeah, fueling arguments, having people "engaged", as they call it, and then selling advertisements.
 

If that is the context in which I earn my living, in which I earn my salary, then it's more difficult to bring into practice this better nature of mine, this ethical impulse of mine. So in sociology, I think, there's this typical separation, which is not a real separation, between agency and structure.
 

So my book is very much about agency, to equip the moral agent, the individual, the software developer, the engineer, the project manager, to integrate ethics into the project. But there's also structure: the company in which you work, the way that projects are managed, the way you're expected to deliver results that help to create short-term financial benefits, for example.
 

And in the middle of the book are four chapters with the four main ethical perspectives, and each of these chapters starts with a bit of fiction, which I had lots of fun writing, like a theatre play or a film script: four people in a room, typically, doing a project management review meeting or something, and they're talking about deadlines, talking about quality, talking about different functions, the legal person, the user interface person, the client. And in that way, I try to pull in some of the reality of doing your work in a project, in a company, and also to show how sometimes, hopefully often, this inclination to do good stays alive.
 

[00:23:42] Marco Ciappelli: Yeah, it's tough. I mean, I'm thinking of the example you brought up, when you are in an environment like that. It's the same thing with the definition of "better": when some people say, well, I want to do better, I'm like, better than what? What's your parameter? What are you comparing better with?
 

There is no "better" in general. Again, if your goal as a company is to make money, sell more things, sell advertising, that could be your better. But from a societal perspective: yes, sure, you create jobs, you create economy, you create, you sell, but are you really making society better? So I know that's one of your goals, to look at technology as a tool that can help humanity to be more fair.
 

[00:24:41] Marc Steen: Yeah, yeah. And I like very much the way that you put it. Indeed, even simpler: better in what sense? Better for whom? I have a chapter in the book that deals with value. And there I play with value in the sense of economic value, business-model-wise, but also value in the sense of helping people to live the good life, in terms of well-being.
 

And I've also noticed that governments are increasingly going beyond GDP. Gross domestic product as a measure for well-being is not as relevant anymore, because now we need to pay attention to the limits to growth, to sustainable growth or sustainable development.
 

So indeed, like you're saying: better, create value, but economic value or well-being value, in what sense and for whom? Those are always good questions to ask.
 

[00:25:48] Marco Ciappelli: Yeah. And I always like to look back and then, you know, look forward.
 

And I think when technology, at least computer technology, digital technology, wasn't so pervasive, we didn't really worry too much. Because, I don't know, maybe it was more of an entertainment technology, or something that you used in a specific environment, like manufacturing. Or maybe, I'm thinking, technology as a cassette tape, you know; you're from the Netherlands.
 

So I know it was invented there. Or the radio, you know, that was technology, but it wasn't so... what kind of harm could it have done? The moment that we put that on the internet, and it became everybody's phone, social media, the game became a lot more complex to play.
 

[00:26:49] Marc Steen: Yeah, totally agree. Historians of technology will possibly be able to argue that the radio was big: it helped to spread the news, it enabled people to mobilize people, for good and for bad. And television was big because it was, I think, an increase in hours spent per day at a screen.
 

Yeah, but definitely the internet as well, and all things digital, all things online. It's nice sometimes to have a historical perspective. Like, who would have guessed that we're now all carrying these things all the time, in our pockets, in our hands, often with our necks curled to watch the screen, even on the bike.
 

And from the Netherlands, of course. In a sense, it's crazy. We have these powerful tools, and what are we doing with them? Scrolling through Facebook? Really? Explain that to me. Why?
 

[00:27:55] Marco Ciappelli: That's my question too. You have such huge power there, and we're using so little of it, if any; it's more of an entertainment than a power.
 

I mean, look again at generative AI. I like to play with it; I use ChatGPT, I use Midjourney to create images. It's liberating from a creative perspective. I understand it comes with consequences, but, you know, experimenting with it, you see if you can do a better job, you get inspired.
 

That's one thing it will do. So, what are your thoughts on the future? Let's get there before we end. I'm curious: what's your vision, after having written this book?
 

[00:28:44] Marc Steen: My vision for the future is that people involved in the creation and deployment of technologies become better at integrating ethics into their projects, which will then hopefully lead to products and services and apps that better enable people to live well together. And that's an allusion to the book I plan to write next.
 

[00:29:15] Marco Ciappelli: Yeah. Give us a preview.
 

[00:29:19] Marc Steen: That was the preview. No, what I can maybe do is briefly discuss the four ethical perspectives that are at the center of the book, and then I can answer this question more fully.
 

So I go back to several ethical perspectives that are commonly used, also in the context of technology. The first one, which I think is very accessible to people with technology or economics backgrounds, is consequentialism. It seemingly simply looks at the potential pluses and minuses of the technology that you're working on.
 

So let's take ChatGPT as an example. On the plus side, it helps people with creating better text, better vocabulary, better grammar. I can become a better, more creative writer if I use it correctly: not slavishly, but as a tool. On the downside, maybe people lose their jobs, because you need fewer workers if they can use a powerful tool like ChatGPT. Consequentialism also enables you to look at the pluses and minuses as they sit in a larger system. The creation of ChatGPT required lots of workers in Kenya to clean up texts, to moderate, to do this and that.
 

And that was obviously not always pleasant work, labor conditions not optimal, et cetera. So lots of the costs are, yeah, transferred to a country like Kenya, to the workers over there. So it also matters how large you draw your pluses-and-minuses analysis, and how the pluses and minuses are distributed. The pluses and minuses seem simple, but they're not as simple once you do this systems analysis and this distribution analysis. The second one is duty ethics, and people with a background in law will recognize that it has lots to do with obligations and also with rights, often also with human rights.
 

That perspective puts human dignity and human autonomy center stage. And it looks at: hey, this company that is creating something like ChatGPT, what obligations does it have? And strangely, the digital industry is a bit lacking in obligations. Suppose you were a pharmaceuticals company or a food or drinks company.
 

You have lots of rules to comply with. You must do this test, this test, this test, and only then can you put it on the market. But algorithms, you just put them online and they're good. And that's silly, in a way. So luckily the European Union is making lots of legislation that also has an impact outside of the EU.
 

GDPR, the upcoming AI Act. So that's an important second ethical perspective: looking at duties and also rights. If we look again at ChatGPT, there are now authors suing OpenAI, the creators of ChatGPT, for infringing upon their copyrights, because, as you mentioned before, they harvest the whole of Wikipedia, lots of books, lots of the internet, and also copyrighted books. And sometimes you can really recognize it, if you ask for text output in the style of an author, or ask your image-producing generative AI for something in the style of some illustrator or photographer. It is almost, or for sure, infringing upon their copyrights. And other rights, privacy as well. The third one, to keep it shorter, is relational ethics.
 

I use that as an umbrella term for several ethical approaches that are a reaction to the first and the second ethical perspectives, because the consequentialism of Jeremy Bentham, for example, and the duty ethics of Immanuel Kant, for example, were very much products of the European Enlightenment, and their assumptions and ambitions had to do with rationality, independence, autonomy, often taken to the extreme, as if everything needs to be objective and rational and autonomous.
 

Relational ethics is a critique of that. It says: we're all interdependent. There needs to be justice, but there also needs to be care, and justice and care need to go hand in hand. And you can use that perspective of relational ethics very much for all these digital technologies that have an impact on how people relate to each other, how they interact with each other.
 

The example that I've given a couple of times: the fraud detection algorithm. Does it reduce the person to a number, as if that person is a person without a face, only a number that you send a letter to, like: you're a fraud? Or does relational ethics enable you to think of a process that looks much more to the human on the other side of the algorithm, so to say? Relational ethics also borrows from feminist critiques and from feminist ethics, because it enables you to question power and the distribution of power. It would, for example, if you take ChatGPT again, look at: hey, is it really so that five US-based companies own all the large language models? That's weird, as if other languages, other cultures, other continents don't matter. And that's, I think, also a plea for other cultures, other continents, other languages also working on large language models. The fourth one, Marco, and that's my favorite one, is virtue ethics.
 

It ties into what I said before, looking at technology as tools. I borrow very much from, I was very much inspired by, Shannon Vallor, her work, her book Technology and the Virtues. She understands technologies as tools that can either help people or hinder people in cultivating relevant virtues. And then there are several traditions that you can turn to.
 

I turn to Aristotle's virtue ethics, and he would talk about courage and self-control and justice as key virtues. There are other virtues as well, like creativity, curiosity, civility, honesty. I think I have 15 or so in the book that I go into in some detail. And then, again, look at ChatGPT. Does it empower me as a user to become more creative?
 

Yeah, if I use it correctly, it does. And then virtue ethics has to do with finding an appropriate way of using this technology: not too slavishly, but to find the appropriate mean, as Aristotle would say. And finally, virtue ethics is, I think, very much suitable for professionals and for professional ethics, because you can also turn the virtue ethics question around and ask yourself, as a programmer, as a software developer, as a project manager: what virtues do I need if I work on this project, with this algorithm?
 

I need to cultivate within myself justice and fairness and honesty if I'm working on an algorithm that is hopefully also contributing to justice and honesty. And now, to come back to your question: my next book will put relational ethics and virtue ethics center stage, how we can use technologies to better relate to each other and to live the good life.
 

So how we can live well together.  
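As a compact recap, the four perspectives Marc walks through can be condensed into review prompts a project team might use. The sketch below paraphrases this conversation; the wording of the questions is ours, not the book's.

```python
# A sketch condensing the four ethical perspectives from this conversation into
# review prompts. The phrasing paraphrases the episode, not the book itself.
PERSPECTIVES = {
    "Consequentialism": [
        "What are the pluses and minuses, and across how large a system?",
        "How are those pluses and minuses distributed, and to whom?",
    ],
    "Duty ethics": [
        "What obligations does the creating organization have?",
        "Whose rights (dignity, autonomy, privacy, copyright) are at stake?",
    ],
    "Relational ethics": [
        "How does this change how people relate to and care for each other?",
        "Who holds power here, and can affected people question it?",
    ],
    "Virtue ethics": [
        "Does this tool help or hinder users in cultivating relevant virtues?",
        "What virtues do we, the makers, need to cultivate on this project?",
    ],
}

for perspective, questions in PERSPECTIVES.items():
    print(perspective)
    for question in questions:
        print(f"  - {question}")
```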
 

[00:37:54] Marco Ciappelli: Wow. I enjoyed that very much.
 

[00:37:57] Marc Steen: That was my 10-minute lecture. Sorry for that, Marco.
 

[00:38:00] Marco Ciappelli: No, no, no. I'm sure the audience has been enjoying it. I did. I love all these things that you mentioned; I'm relatively familiar with some of the philosophers, having studied them in the past. And I love how, for a moment, I had to remind myself that we're talking about technology and not philosophy and virtues, because everything you said does apply, yes, in technology, but it applies to everything else that we do, right?
 

So, you know, are we doing the right thing? And what is the right thing? And I have to say that with everything you were mentioning just now, I'm in that mindset of zooming in and zooming out. It really works in my head, almost like the zoom of a camera: you can look at this in the detail, but you can look at this in the big picture, and you need both.
 

That's amazing.
 

[00:38:56] Marc Steen: Yeah, exactly. Take the social media app example again, but I can think of other examples as well later on. The business model, on a macro scale, is often holding and grabbing and monetizing people's attention. How does it do that in the user interface? Well, with beeps and with flashes and with your endless scroll through the timeline. So they hang together.
 

The details in the user interface and the larger societal aspects, the implications. Yeah.
 

[00:39:27] Marco Ciappelli: And the mind manipulation, the mental addiction that you create by doing that. Maybe in the big picture you don't see it, because you're not getting that close, but when you go there, you're like: yeah, that's affecting my everyday life.
 

All these beeps, it's like the Pavlov bell, right? You become part of that tool, and that's definitely not the good way to use it. But anyway, I really enjoyed this conversation. I think the next book you write will be an interesting one as well. But for now, we were talking about this one.
 

I'm sure you gave a lot of examples, but there's much more in the book. So I want to remind people that the link to the book and the links to your social media, your website, and the book's website will be in the notes for this podcast. I invite people to get in touch with you, to read the book, of course, and to subscribe to the Redefining Society Podcast.
 

And stay tuned for many more conversations like this. So Marc, thank you so much for your participation. I really enjoyed this conversation. Come back when you have the next book. Anytime.
 

[00:40:44] Marc Steen: Thank you. My pleasure.  
 

[00:40:47] Marco Ciappelli: All right. Thank you very much. Bye Marc. Bye everybody. Stay tuned for the next episode.