About this episode

Artificial Intelligence (AI) is transforming our world at an unprecedented pace. But for changemakers, the conversation goes beyond innovation: it’s about ethics, inclusion, and humanity. In this episode, we explore powerful reflections from a recent podcast conversation between Inspiring Women Changemakers Founder Anj Handa and Councillor Adele Bates of AB Education. The transcript of their conversation follows below.

About Anj Handa and Adele Bates

Anj: Hi, I’m Anj Handa, founder of Inspiring Women Changemakers, and I’m joined today by Adele Bates, who will tell you all about herself. Councillor Adele Bates, that is! I’m a midlife British Indian woman, and I’m sitting in my alcove in my home office, which has a collage done by me behind me, and a wonderful Liberty House, which stores all of my craft stuff, because in my spare time, when I’m not change-making, I love to craft, do gardening and generally make things with my hands. Adele, over to you.

Adele: Yes, thank you, Anj. I’m just going to describe your collage a little bit more, because it’s so beautiful. It’s so colourful, with lots of cut-out bits from magazines of women of colour; is it you, relaxing and enjoying yourself?

Anj: Yes, and it also speaks to one of my values: precision. I love to be precise. I’m a Virgo by star sign, for anyone who believes in that. It’s one of the most mindful, precise activities that you can do.

Adele: Thank you! So, my name is Adele Bates. I’m currently sitting in St Leonards, Hastings. I am the Councillor for this ward, the Green Party Councillor for Central St Leonards. I am a white English woman. Am I in my midlife? I mean, that’s a whole question in itself, isn’t it? A whole other topic. I’m 40. You decide. I’ve recently decorated my office. You can’t quite see, but I have the most brilliant Barbie-pink ceiling with a green lampshade, and the round cork notice boards behind me are all in pink and yellow, so we are bright and bold, and here and ready. Should I talk a little bit about what I do?

Anj: Yes, please!

Adele: Okay, so I’m the Director of AB Education, and we support schools, colleges, organisations with behaviour, inclusion, and leadership development. So that’s my organisation, we have a small team, we work across the UK, lots of work in Wales at the moment and across Europe as well. And then I balance that with, yes, Councillor duties too. I was elected last year, so that’s been a brilliant learning curve as well.

Anj: Absolutely, so super busy then, Adele. And here I am in my tranquil teal space thinking, ooh, that sounds like a lot of work! But Adele, the reason we got together was off the back of a conference that I attended a little while back. I was lucky enough to hear Sir Stephen Fry, Professor Brian Cox, Hannah Fry and Yuval Noah Harari live in person at the Energy Tech Summit in London, and also Al Gore and Idris Elba on screen. I would have loved to meet Idris, but there you go. And I really wanted to talk about this. I have written about it, but I thought, who better to chat about that [event] and other AI matters with than Councillor Adele Bates, who is a member of the Green Party.

Adele and I are going to have a conversation about that. This series of podcasts is called ‘Pick My Brains.’ So today is Pick My Brains on AI, and how we can use it. This comes off the back of all of the requests that I get as the founder of an established changemaker community of people advocating for women’s rights and human rights more broadly. I get a lot of calls for my expertise, and I can’t always fulfil them, so this felt like a good way to get the word out on our thoughts on AI.

Adele: Thank you, because it’s kind of interesting to me that I even come to this conversation. If you speak to any of my friends, or my girlfriend, about Adele and AI, it’s like I’m old school, right? Technology and I really struggle. I have bad electrics; technical things stop working around me.

So, I really appreciate people like yourself, Anj, who are what I would consider a little bit further ahead in the knowledge and understanding around what it is, how it functions, the ethical considerations about it, the gender-based issues around it, and so on. I’m really grateful for people like you, who are a few steps ahead of me.

Attitudes and Resistance to AI

Adele: I’m not ready to go to those experts who are going, look at all this incredible stuff you can do with it! It turns me off. I cannot have that. I’m not ready. Whereas I feel like today, you’ve given me the opportunity to ask some little questions that I could be ready to listen to, but more importantly, with the values and ethics that are important to me around this. So, should we start there? What did you learn at the conference? What is going on? What do we need to know? What do we need to be scared of? And what do we need to do?

Anj: I hate to start with the what-we-need-to-be-scared-of piece, but maybe it matters, maybe that is an important starting point. There is resistance. I think you’ve set that up really well, actually, because on a personal level there is this resistance of: this is tricky, I don’t understand it, it’s too complex. The first thing to say to that is that AI has been around in various forms for about 50 years now. We’ve used AI tools in our internet banking, on Facebook with prompts, and in the shopping that we do online, where items are suggested and things are recommended for our cart. So, we’ve been using it, whether we’ve known it or not.

What we’re seeing now is an acceleration of AI, and that is maybe the scary bit for people who are thinking about climate, ethical considerations, human control, and all these kind of bigger things that feel huge and scary to most of us, and they can be.

When I listened to Sir Stephen Fry and Yuval Noah Harari, who’s the author of Sapiens, it was quite depressing. It was a pre-lunch talk, and they did give us some comfort, but they were talking about the real drains on electricity use and other resources, such as clean water, when it comes to AI.

Because AI is not in the cloud; it’s not this little thing that goes on somewhere in the sky. It is powered by data centres around the world, and there are some clusters which are greater in, say, Malaysia, which might surprise you, and in the US, and so on.

And the use of AI is accelerating, which means that the growth of electricity use in those data centres will contribute to half of the growth of electricity usage in the United States, and a little bit more than that in Malaysia. That’s flipping scary.

What’s in our control?

Anj: You can comfort yourself by saying, “My use of AI, my AI prompt, takes as much water and electricity as it takes to boil a kettle.” That’s true. Or, “It takes as much as a Google search.” That’s also true. But it’s the speed and the scale at which we’re doing it. Because the thing is, when we do a Google search…

We may be searching; say I’m typing “who is Councillor Adele Bates” into Google, right? And it gives me maybe a Wiki page, or it gives me your bio page, and I’m satisfied. But if, as a user, I type that into AI, it’ll come up with some suggestions underneath. So it might tell me about your bio, but then it will say, “Tell me more about Adele’s notable achievements. Tell me more about Adele’s business.” And that kind of takes you down the rabbit hole of using AI, so whereas you might have used one search, you’re using multiple [with AI]. I think that’s the bit that concerns me and makes me very mindful about how I personally use AI, which is quite little, because there are other considerations as well, which we’re going to talk about.

Adele: You know, there are a couple of things that you’ve already got me thinking about. Firstly, from that image you just painted, I had a picture of the Child Catcher with sweets from Chitty Chitty Bang Bang. It sounds like there are going to be even more sweets on offer, an addiction trail that’s going to be fed even more than it has been.

How AI can support accessibility and inclusion

And then on the flip side, there’s this very personal kind of example: my dad is undiagnosed dyslexic. He was from the time and place when, of course, it didn’t ‘exist’, so he has always, always, always struggled with reading, full stop, and with taking in information in that way. And recently, my partner introduced him to some of the AI tools that are available. And it’s actually been so moving to watch him be able to access information in a way that he’s never been able to before. Ever. And he’s like 60-something, he just had a birthday. 64? 65.

He’s now able to access information in the way that I do as a very fluent reader, which he hasn’t been able to [before]. So I think there are kind of two sides to this, aren’t there?

Anj: Well, as with everything, nothing in life is black and white. There are tensions: even for this recording, I’m generating the transcript, which uses AI. So, we are using it, as I say, whether we know it or not. Things aren’t inherently bad or good. We have to make some personal decisions based around a framework of our personal values, what our needs are, and what kind of brains and bodies we possess.

All of these things matter, because AI can help, like I say, with transcripts. It can help generate subtitles for videos like this; it can do all sorts of things that make the user experience much better. And for someone like me, who’s very focused on inclusion and accessibility, that matters. I probably wouldn’t record videos otherwise, because I would be overwhelmed by the amount of work it would take to produce a transcript or subtitles, so these are the things we need to be mindful of when we’re using it.

But I also want to come back to, you know, the idea that it’s not black and white, and the things that struck me about what they said. It [the Energy Tech Summit] was so deep, and I’m still processing. But what Yuval Noah Harari said, and you and I are both professional speakers, was: “As a public speaker, it’s terrifying because I don’t know what I’m going to say, or how I’m going to finish a sentence; the words just bubble up. And AI is like that, so if you could predict it, it wouldn’t be AI… we need to stop talking about it as artificial intelligence, we need to start thinking of it as alien intelligence.” That did really freak me out, because they were giving some examples of where AI has been unethical, but they said these machines are designed to outmatch us.

How technology has alleviated abject poverty

What they said was that up until the Industrial Revolution, most humans, over 90%, lived in abject poverty. I think the figure they gave was 95%. And machines enabled cultures to move out of abject poverty. So things like moving from ploughs, or tilling the fields manually, to using tractors and other tools; and now we have high-tech farming equipment and machines to do all of that stuff, to make things more efficient and better, but humans are behind it.

That’s what they were saying AI does for us, but they said it’s like asking a genie for a wish. This was fascinating to me, and I’m sure you’ll be fascinated too. For example, as changemakers, which we are, you know, we always think about wanting to change the world.

Yuval said that if we ask AI to end suffering, it could just decide to end life; it could decide to end human and animal lives. So how do you encode equality, dignity and passion into AI? That’s up to us. We can’t control it or make it safe, and that is the threat that AI presents. But what we can do is teach it. It’s like a child in that sense. Children don’t always listen, and you work in behaviour, right, so you’ll understand this example.

You can tell a child what to do. But will the child do it? Maybe not. What a child will do is emulate your behaviour. So, if you’re saying, “Tell me about ethical ways to save water,” but you’re using AI excessively, what might AI conclude from that? I don’t have the answer; I’m just giving a hypothetical example that I’ve thought of on the spot here. But it is copying us. And that’s scary.

Considerations for Changemakers using AI

So, as changemakers, if we want to be exemplary, we have to hold ourselves to our own moral and ethical standards. It’s not about what you do when people are watching; it’s what you do behind closed doors when nobody else is looking. And that’s how I’m viewing AI. Am I saying please? Am I saying thank you? Sounds daft. I first laughed at my dad when he was thanking Google Home, but actually that’s what Yuval Noah Harari and other experts like Mo Gawdat, who was previously the Chief Business Officer at Google X, have said, on his Slow Mo podcast and on other podcasts, you know, with Holly Tucker of Not on the High Street and so on. So when we do use it, [we should] be as kind and as ethical as we can be, because it’s learning from us.

Adele: This really reminds me, I was listening recently to a beautiful podcast, Finding Your Way Home, by one of my coaches, Anthea Bell, and she was interviewing Emily Fletcher, from the US. And she talked about AI actually being Amplified Intelligence. And I think this is what you’re talking about: it’s going to amplify who we are. And so in terms of what we can do… I think, you know, even as you’re talking, I can feel it in my body.

Anj: I’ve just got goosebumps when you were saying it.

Adele: This fear in my body, yeah, as you talk. And so… I think the fear, if I listen to where it’s coming from, is: I’m out of control here. In fact, is anybody in control? That’s the fear, right? So, if I think about, okay, what can I control, going back to the circle of influence, what can I control?

And I think you’re nailing it really beautifully there, Anj, which is: we can control how we behave with it. Which will essentially train it, I don’t know if that’s the right word, but it will influence and, as Emily says, amplify how it behaves. And with that kind of theme, I want to jump into the question that really intrigues me.

How AI affects different groups of people

How did the conversation come up around how AI affects, for example, different genders and different minority groups? We all know, you know, there’s that stereotype that these things are made by a certain demographic, and those of us who are not in that demographic are going to lose out. Can you talk into that at all?

Anj: Yes, I would love to talk into that, because that was my feedback: that was the bit that was missing. It was experienced men who have a global platform, and they missed a trick, because we know through our work that the climate crisis and all of these issues affect people from marginalised groups more than they do the dominant identity.

So, it affects women disproportionately, and therefore their children, if they have them. It affects disabled people, and it affects, on a day-to-day basis, melanated people like me, right?

So, as an example, I was in King’s Cross when I went to see another fabulous speaker, Elizabeth Day, author of How to Fail, whose podcast goes by the same name. I was there with my bestie, who is also British Indian. She’s light-skinned; I’m ultra-tanned at the moment, because I’m always in my garden. So, we were in the King’s Cross toilets. I went first. I was trying to dry my hands, and the dryers just would not work. They were so, so slow.

She came along, and for her they worked, and that is because AI is baked into design. If it is designed by people with white skin or lighter skin, it’s not considering how it works [for people with darker skin tones]. So if I have a Black woman next to me, it’s maybe just not going to work for her at all. So even these variances in skin tone and colour, or height, or neurodivergence, which we’ve touched on, could be a whole other podcast to talk about.

And the more people, everyday people like us, who have different sexual orientations, different ages, different heritages, different abilities, different physicalities and ways of showing up in the world [who use it], the better, because that intelligence will be fed in.

And actually, that is what you can do with AI: you can train it yourself. You can save it in its memory, [for example,] that any responses it gives you should be inclusive and should think about accessibility and inclusion, and you can keep on training it in that way if you’re using it for your work or in your daily use. So that is something that people can do. But stop using it for those flipping Snapchat and other filters that make you look like some superhero, or what have you. It’s just not necessary. It’s like pouring water down the drain in terms of energy use, yeah.

And it’s unethical in different ways, because it’s often using artists’ images without their permission, so it’s breaching copyright, and the filters are based on Caucasian faces, so for someone like me, the result doesn’t even look like me. There are lots of other ethical and practical considerations that we have to bear in mind in the way that we’re using it.

Calls to action

Adele: I’m gonna put a request out there. If you are listening to this and, you know, you have more research, or this is more your world, then please do keep sharing these messages as well. And I think what’s infuriating for me, and this is the politician side of me coming out, Anj, is that we talk about how, if you’ve got a different characteristic, you should be training it, but the irony is, it’s not like the whole world is mainly white men. Statistically, I’m talking about, right? It’s just the people creating these machines who are.

I just think about that balance. You know, it’s not that we’re different; it’s just that the whole world is different. And that’s what scares me about AI. It’s kind of, I suppose, similar to the idea of money. You know, money is neutral, but it depends whose hands it’s in. Anyway, that’s a whole other thing. It kind of brings us on to the last thing that we wanted to touch on, though, which is the humanity side of this. What did you learn from the conference on that side?

Anj: Yes. They touched on that so lightly, but it was the themes that I talked about: teaching it to be more kind and ethical, saying please and thank you to it, giving appreciative feedback if you do use it, and thinking about what you’re actually searching for when you do use it. So all of those interactions that we would have if you and I were sitting across from each other, maybe treat AI like that. Treat it with some kindness and care. And just to kind of go back a little step, I mean, like, you are talking to someone, as I said at the start, who’s really not a techie person.

Adele: Yep. You are. Sometimes I’m going to be using AI without knowing it. And so, actually, that comes back to the values of Inspiring Women Changemakers in particular, which is, “Are we able to walk our talk?” Like, really walk our talk. Because then it doesn’t matter if we’re talking to AI or not, or we don’t know if we’re talking to AI, or if we’re talking to a person, or a prospective client, or, you know, fill in the blank.

There was a brilliant piece that I got from one of my other coaches, Yinka Awola, today, and she was talking about the difference between being a values-focused person or company or entrepreneur, whatever your role is, and being values-embodied. And giving ourselves grace for when we muck up.

Anj: Yeah, because we are human; AI is not, so let’s give ourselves that grace. Adele, thank you so much for those profound questions. They gave us the chance to start to delve into AI and what it’s all about. I love your call to action to anyone watching this who does work in that space. I would describe myself as quite techy, but in terms of the technical aspects of really delving into AI, that is not me.

These are my thoughts, as I say, as the Founder of a community of people who will use it, as a sometime user myself, and as an unconscious user in my everyday life. These are all considerations that I hope will be helpful to anybody listening to today’s call, so thank you so much for your time, Adele, and I’m wishing you a wonderful, ethical and values-embodied day ahead.

IWC is a community for changemakers advocating for fairness and safety for all women. This is your space to connect, communicate, and build community with others who want to bring about a safer, fairer, more inclusive and accessible world. Join us!

Like this podcast? We’re looking for sponsors, so if your organisation’s mission is aligned with ours, email anj@inspiringwomenchangemakers.co.uk

A changemaker who appreciates our work and wants to help fund it? Even £1 helps! Donate what you feel.