In Episode 2 we speak with Minter Dial about the future of AI and Ethics.
In each bi-weekly episode, we provide a look at what’s new, what’s next and what you need to do next week to survive in this digital world, as told through the eyes of global experts.
You’ll find every episode full of practical ideas and answers to the question, “What’s the future of … ?” with voices and opinions that need to be heard.
But beware, I’m no ordinary futurist, and along with my guests, we’ll give you things you can use in your business next week, not next year.
You can listen to the podcast below, or via your favourite podcast app. Links to the most popular ones are below or simply search for “Actionable Futurist Podcast” on your favourite platform. If you can’t find it where you listen to Podcasts, please contact me, and we can get it listed there.
In this show we covered a range of topics:
- What is “Heartificial empathy”?
- The need for empathy in business
- Can we create empathy in a machine?
- Feeling vs cognitive empathy
- Ethics in AI
- Avoiding conscious bias
- How can we learn to be more empathetic?
- The case for reading great novels
- Empathy as a key competitive advantage
- The rise of the “Empathy Index”
- 3 Practical tips for next week
02:40 Heartificial empathy – an introduction
03:06 The need for empathy in business
04:36 Can we create empathy in a machine?
06:05 Feeling vs cognitive empathy
09:19 Ethics in AI
11:27 Avoiding conscious bias
13:25 How can we learn to be more empathetic?
14:29 The case for reading great novels
15:19 Empathy as a key competitive advantage
18:02 The rise of the “Empathy Index”
19:46 Key learnings from the book
20:59 3 Practical tips for next week
22:30 Where can people find out more about Minter?
More about Minter
Andrew Grill: 00:03 Welcome to the Actionable Futurist podcast, a show all about the near term future with practical tips and tricks from a range of global experts. I’m your host Andrew Grill. You’ll find every episode full of practical ideas and answers to the question what’s the future of with voices and opinions that need to be heard, but beware, I’m no ordinary futurist and my guests will give you things you can use in your business next week, not next year.
Andrew Grill: 00:35 So let’s jump into it. In this episode: what’s the future of AI and ethics? I launched this podcast a few weeks ago and the feedback’s been extremely positive. Thanks everyone for listening, or welcome if it’s your first time here. Before I introduce my next guest, I wanted to outline my own point of view about this very interesting area of empathy and ethics when it comes to artificial intelligence. With the rise of artificial intelligence across all industries, commentators and business leaders are now questioning the ethics around these AI systems. While existing AI systems are a long way from being able to simulate human behaviour, or “general AI” as it’s being called, many are worried about how we will program these machines to work for us instead of against us. In almost every one of my keynotes, I’m asked about AI, specifically: will we lose our jobs, and can we trust these systems?
Andrew Grill: 01:28 In each case, I explain that AI systems need to be trained by humans initially and how we train these systems will direct how empathetic they might be.
Minter Dial: 01:36 At the end of the day, I do believe that a machine will be able to perceive other human beings, well, sometimes better than us.
Andrew Grill: 01:46 That’s the voice of today’s guest, a long-term friend of mine, Minter Dial, who has just written a new book called “Heartificial Empathy” where he tackles this very topic. He argues that as humans we need to become more empathetic before we can hope to train these new AI systems, and that empathy is the superglue for high-performing teams. So who is coding our AI, and do they have real empathy and ethics in their approach? We also need to have more empathy to be better managers and learn to listen better. How can we create empathy in machines? Minter argues that empathy and ethics are linked.
Andrew Grill: 02:24 Welcome to the Actionable Futurist podcast, episode number two, where I’m joined by bestselling author, storyteller, filmmaker, blogger, keynote speaker, brand strategist, podcaster, and also my friend Minter Dial. Minter, welcome.
Minter Dial: 02:39 Andrew. Thank you so much for having me on the show.
Andrew Grill: 02:41 Heartificial Empathy: putting heart into business and artificial intelligence. Amazing title, amazing book. The book is an in-depth look at empathy: how it’s created, why and how to increase empathy in people, organizations and machines, and the flaws to be avoided. What drove you to write it?
Minter Dial: 02:57 Let’s say the topical answer is that I think empathy has long been an interesting topic for business. It’s not something that we’ve regularly talked about. It’s certainly not something that you practice or teach in business schools, and yet it’s fundamental to so many parts of the business, starting with the way we manage our people. It’s just startling how, the moment you put on a tie and go to the office, you treat people differently. You don’t have time to listen; it’s rush, rush, got to get everything done. And by doing everything so fast, we forget to listen. We forget to understand that people have personal motivations and personal issues. My experience said that being empathic can be a really magical skill. Let’s say it’s a topic that I really felt I wanted to put on the page and make it not just a touchy-feely thing or a soft skill.
Minter Dial: 03:54 Something that will really, materially change the course of your business if you learn the power of empathy. So that was, let’s say, the topical answer. The underlying answer, actually, was that I started to look in the mirror at how empathic I truly was and asked, well, could I do more? Could I be better at being empathic? And the irony of the story is that once I really learned about it, I did understand that I wasn’t always being empathic.
Andrew Grill: 04:21 You didn’t like yourself …
Minter Dial: 04:22 Well, I certainly recognise that I could be more empathic. And now that I’ve written a book, the challenge is holding myself up to that standard.
Andrew Grill: 04:30 Read your book, Minter! So do you think artificial empathy is an oxymoron? How can we create empathy in a machine?
Minter Dial: 04:38 Right? So there’s an oxymoronic element to it.
Minter Dial: 04:43 But the reality is, empathy is about perception, and machines are increasingly, tremendously capable of perceiving. Whether it’s vocal, visual or aural, we can now perceive emotions; we can perceive what’s happening a lot better. And so at the end of the day, I do believe that a machine will be able to perceive other human beings, well, sometimes better than us. Take the case of a doctor and their ability to detect depression in an individual. As individuals, we tend sometimes to say what the other person wants to hear, and the same thing actually happens with doctors. Doctors have their own filters, and they’re not always able to pick up the signs of depression. For example, does a depressed person laugh? The answer is yes, and so you might mistake laughter for happiness. But it turns out that when a depressed person laughs, the length of their laughter is shorter. If you can cue a machine to detect the difference between a hearty laugh and a depressed laugh, that kind of sensitivity is something that machines can do. That’s not empathy per se, but it does show the detection, this ability to perceive. And where people get confused is that you don’t need to show the empathy per se; at one level, it’s really just about understanding the other person’s context, at least in a cognitive manner.
Andrew Grill: 06:19 Almost as an aid: the machine could say this person is less happy than normal, so the human then goes, okay, I’ve got to treat them differently; I wasn’t aware they looked less happy than normal. So that’s an aid that then has me, as a human, turn on more empathy.
Minter Dial: 06:32 Well, it is about helping, prompting the human to act. I tend to conscript the idea of empathy to the perception component. There are two elements, let’s say two definitions broadly speaking, of empathy. One is feeling, affective empathy, and the other one is cognitive empathy, thinking empathy if you will. The feeling one is not something that I think machines are going to get, where if you start crying, I cry, or I feel your sadness. That’s not the domain of machines. But in the cognitive space, the ability to say, “Andrew, you look sad”, the ability to perceive your sadness, that the machine is able to do. Then the question is what you’re going to do with it, and that totally depends on the context of Andrew, because Andrew may not be looking for sympathy. He might just be sad because his team lost, and that’s it. Or he might be sad for another reason, but he knows the solution, so he’s not looking for me to give him advice. He just wants to have somebody listen to him.
Andrew Grill: 07:41 Yesterday I was doing a keynote and I hung around for lunch, and before lunch there were these stations where people were talking about things. I was almost opposite the wellness station, so I heard the same pitch three or four times. And what was fascinating was the ladies were saying: someone comes into work and you ask them how their weekend was, or how they are, and you’re conditioned to say, “Oh, fine.” They asked: are you ready for when someone says, “No, I actually had a really bad weekend”? What do you do? I don’t think we’re trained in a work environment to do that. Oh, that’s a bit uncomfortable; Minter’s had a bad day, or Andrew’s had a bad day, what do we do? And then we just say, oh, it’ll be okay. And sometimes, especially in a work environment, you just want someone to listen.
Minter Dial: 08:16 That’s right. I think actually in all environments we could do with a lot better listening skills. I mean, the reality is that we all have 24 hours, and there is this perception that time has accelerated, and yet it hasn’t. There are different ways to be more efficient; we can do so many more things, we have digital tools, but in the advancement of our technologies we’ve kind of lost our ability to sit and listen: to ourselves, our heartbeat, the breathing of our lungs, and to other people. So the first part is actually listening to yourself, self-empathy, a self-awareness, and the other one is around the important people around you, whether it’s at work or at home: the ability just to say, hey, you want to talk? Let’s just go and be quiet. When you know that every minute is a dollar, we tend to equate that with productivity, and that just flushes out and pushes out any desire to listen.
Andrew Grill: 09:18 I want to talk now about ethics, because when we talk about AI I get asked all the time about the ethics, and I have a set of standard responses that I give, but I’m keen to learn more. What should companies do when they’re thinking about the ethics of AI?
Minter Dial: 09:31 Well, so empathy and ethics, it turns out, are extremely linked. If you want to encode AI with empathy, for example, then if you want to look at your ethics, how about taking a check on how empathic you are as an individual, as a C-suite and as an organization? And if that empathy is there, then let’s say that you’re in a better state to create an ethical framework before you go forward. After that, the issue is understanding the pressure you have to perform, and whether you’re able to defray that for the sake of a stronger ethical line. The issue with ethics is that it’s a very personal story, the difference between what is good and what is bad, and when you have a large team, or even a small team for that matter, your ability to coalesce and to agree together about an ethical line can be very deeply personal.
Minter Dial: 10:28 And so when you have empathy, it’s going to be easier for you to understand each person’s zones, and to think, more importantly, about other people’s zones. Because when you’re a bunch of white men sitting around the table, chances are you’re going to think white men’s thoughts, with whatever privilege I’ve had. And yet we might have a customer base that is deeply different in terms of background, or sex, or gender, and so the notion of empathy is a key consideration in terms of ethics. The reality is that for a lot of the ethical conundrums we’re going to face, there are no laws to go by, and so we’re going to have to be in a constant mode of adapting and rethinking our ethical frameworks, which is probably why I mentioned the notion of privacy before. I think today we can do so many things, but it’s not because we can that we should do them.
Andrew Grill: 11:26 When we talk about ethics and AI, often the notion of conscious bias comes up: if you have to program and train a machine, you’re going to train it in a certain way. So where does conscious bias fit with empathy and ethics? There are no laws at the moment, but all the people developing AI platforms, and consumers too, are going to start asking: who programmed my machine?
Minter Dial: 11:47 Well, let me also just add that AI is going to be used by criminal organizations, by states in different ways, as well as by companies and cities for that matter. So there are many different organizations that are going to be using it. Putting empathy into the way you approach your AI, or the bias you have, is going to help you to look at it from other perspectives and put yourselves in the shoes of the others. I’m hard pressed to say that there’s one route to do this. The challenge is that getting the data sets is proprietary; it can take a long time, and there are no shortcuts. You’re going to screw up along the way, so just keep your eye out for what you think is doing good for society, and be vigilant about that, ongoing.
Minter Dial: 12:45 That’ll be important. An example would be, as you look at programming your AI, who’s on the team doing it? Because you can have coders, and coders have many talents and one particular skill, but usually a strong emotional quotient is not among them. So make sure you try to compensate, or complement anyway, with people who maybe have a stronger humanitarian approach, more sociological understanding, maybe a stronger emotional quotient, and that might be a good way to make sure that, alongside the lawyers, you have good ethics, a diversity of ethics.
Andrew Grill: 13:25 In the book you make the case for why empathy is not only teachable but a requirement for success in business and in life. So how can we teach empathy?
Minter Dial: 13:32 Well, so I don’t actually believe that empathy is teachable per se; it must be learned. So the key is to create an environment where people want to become empathic. The first part of that is making sure that empathy is modeled as the behavior at the top, because it’s no good telling the rest of your team to be empathic when you’re acting like a dick.
Minter Dial: 13:59 And that means having self-awareness and evaluating things from the top. Secondly, empathy is a great way to be with your customers. However, if as an organization you’re unempathic internally, it’s quite unlikely that empathy will manifest itself towards the customers. So creating the environment means modeling it from the top, and then there are different ways, according to the amount of empathy you think you have, to foster more empathy. One of the ways, and obviously as an author I strongly encourage this, is to read great novels. By reading great novels, it’s been shown that you step into the shoes of other people. The character is going to be this crazy man, or this person, or this woman, and you’re going to learn, through great writing, the psychology of that person, and that gets you into their shoes. So I personally now alternate, making every other book I read a novel, which gets me into another space.
Minter Dial: 14:56 It’s giving me quiet time as well, but it’s also making my, my brain expand in other people’s worlds.
Andrew Grill: 15:02 I hadn’t thought about that, and it’s so true: I read so many business books and I need some escapism, and you’re right, you need to get into characters. That’s probably the best bit of advice I’ve heard all year. Why will empathy be a key competitive advantage? I think you’ve almost answered that, but for the people out there who are not convinced, who think you can’t put a dollar figure on having better empathy, why should they bother?
Minter Dial: 15:13 Right, so at the very least, customers are going to want it. You hear the number of customers complaining about the automated this and the automated that, and the inability to create the right user experience. Empathy is knowing how to design; any great designer has strong empathy. But if that designer is surrounded by rational, hard-nosed, unempathic individuals, it’s not going to be good.
Minter Dial: 15:46 So for customer-facing components, whether it’s customer service, the design of a product, or managing the sales experience, being empathic and understanding the situation is going to be a material benefit to your bottom line. After that, though probably actually before it in chronological order: if having good talent, and keeping that talent with you, is important to you, I think empathy is the superglue. It’s the thing that’s going to help you identify when your employees are unhappy. It’s going to help you to understand what their motivations are, and then play to those emotions and motivations, and ultimately make a better environment where people want to continue to work for you.
Andrew Grill: 16:33 I’m just thinking about an organization I’ve worked at before, knowing that they would probably send people to “empathy school”, and they’d tick a box to say they’ve done their empathy training, and come out and be the same dicks as ever.
Minter Dial: 16:43 Well, at the very least they’re trying, you know; maybe they’re also recognizing that they’re not empathic, so let’s say that’s the good news. But the bottom line is that empathy is something that has to happen in the small details, every day. And it’s actually quite tiring, because it means sometimes taking the time to listen: hey, how are you doing, Andrew? I’m not doing well.
Andrew Grill: 17:04 Oh wow, let’s, yeah, let’s stop what we’re doing. Let’s go and have a coffee, or take the day off.
Minter Dial: 17:09 Then maybe my boss yells, “What are you doing?” And I’m like, well, you know, I’m trying to do this. Oh, okay. And so in the book I try to tease out a few of these typical situations that happen in business, how we typically do things and how we could do them differently.
Andrew Grill: 17:26 I remember saying to one of the team I was managing one day, “Let’s go to the Science Museum.” We spent three hours there, and over that time, because we weren’t focused on what our work package was that day and we were looking at interesting exhibits, we had a chance to talk, and I think both of us really found that rewarding. Now, if my one-up manager had known that I’d taken one of my team to the Science Museum, they may have thought, why? But an empathetic manager would go, “What a great idea, I’m going to take my team to the Science Museum too.” Do you think we’ll start to see the rise of the empathy index, so consumers can see if one company is more empathetic than another?
Minter Dial: 18:01 Well, there is an organization in the south of England which created an empathy index a few years ago. The challenge with that is really measuring empathy. That was another question: can you measure it? It’s very difficult.
Minter Dial: 18:13 Um, cvs in the United States, Norman de Greve who just was identified as one of the top hundred courageous leaders in the United States. They really tried to put empathy into their customer experience and what they’ve done is they’ve measured it, but the only way the measured is they ask individuals that come into their stores, the CVS drug store. Did you feel that the salesperson demonstrated empathy towards you? And each person has a different interpretation of what is empathy and so on. So it’s very hard. That’s, that was sort of the the first level then, you try to surround sound it with different characteristics that would show that the person’s being empathic. It’s hard to do. So a) I would say, why not have empathy index? Because at least it puts it on the table and challenges, what are you trying to do behind it and how scientific is it?
Minter Dial: 19:02 But I will say this: that index, created in 2015, identified 170 companies on the empathy index that were publicly traded, and they qualified them on some large number, maybe 50 different types of criteria, in order to try to establish the empathy level of each organization. The top 10 outperformed the bottom 10 on the stock market by two times. So that would be some kind of indication. If you’re still curious or dubious about whether empathy can be useful for you, there seems to be material proof that it will help you on the bottom line.
Andrew Grill: 19:41 So great book. What are the top three things that you want people to take away from the book?
Minter Dial: 20:04 Well, so the top three things. The first is: think about your own level of empathy and start with that self-awareness. The second is: think about how empathy can be a useful thing in our divisive society. Today we have so many issues out there, in every country different political and societal problems, immigration, employment and so on, and I think that empathy is something that could be very useful for us, not just in business but in society. And the third is: as you look towards the idea of encoding artificial intelligence, where you might consider emotions, and more specifically empathy, it’s a great opportunity to reflect on what empathy is, on what your definition of empathy is, because at the end of the day you need to know that before you start coding it. So it ends up being a mirror for who you are and what you’re trying to achieve. Look at that as a good reflective moment, because ultimately, while you might try to delegate empathy, the reality is it has to start with you.
Andrew Grill: 20:55 So, as this is the Actionable Futurist podcast, what can listeners do next week? What are three things they can do next week to be more empathetic, or to start on that journey?
Minter Dial: 21:05 All right, so the first is: find a stranger and ask them a few questions about who they are and what their life is like; it can be a bus driver or someone manning the till. The second thing is: break out a novel. Read a good classic novel, something you haven’t read; you’ll hopefully find it rather entertaining. And the third thing is: look inside your business and at what you’re doing in your business practice, and see where you can strategically try to be more understanding of people who are different from you, specifically your customers. Try to be in the shoes of your customer. For example, call your customer service line, not from your own telephone number but from someone else’s, so that they can’t recognize that you’re an employee of the company, and say, “Hey, listen, I’d like to have a customer service problem solved,” and see how that experience is.
Minter Dial: 21:56 Put yourself in the shoes of the customer: legitimately walk into a store if you’re in retail, or order from your own e-commerce site. Do something that puts you in the shoes of the customer, and feel their pain.
Andrew Grill: 22:12 But also do it for your competitor, to see if they’re any better than you are, and learn. Amazing discussion. I’d also love to have you back to talk about podcasting. You’ve done 328 podcasts since November 2010. Can we have you back to talk about podcasting?
Minter Dial: 22:26 Sure.
Andrew Grill: 22:27 Look, thank you so much for your wisdom on all of these topics today. Where can people find out more about you and your work?
Minter Dial: 22:33 So my general playground is my own site, minterdial.com. I enjoy trying to do things on Twitter as well, @mdial. My books are at heartificialempathy.com and futureproof.ly, and then I’ve also done this other book about the Second World War; it’s a personal family story, at thelastringhome.com.
Andrew Grill: 22:55 Fantastic. This has been the Actionable Futurist podcast. You can find all of our previous shows at futurist.london and if you like what you’ve heard on the show, please consider subscribing via your favorite podcast platform. You can also hire me to speak at your next management offsite or customer event. More details on what I speak about with video replays can also be found at futurist.london.
Until next time, I’m The Actionable Futurist®, Andrew Grill.