In this week’s email
What’s on our mind: How AI perpetuates harmful stereotypes
Join our next free webinar on 16 April: How to be more inclusive at work
Share our inclusion insights with your team
The latest episode of Techish
Free inclusion resources
How AI perpetuates harmful stereotypes
“Before we work on artificial intelligence, why don’t we do something about natural stupidity?”
Steve Polyak
Spreading false information, leading with ego, a lack of social responsibility, and preaching what we don’t practice. These are just a few examples of ‘stupid’ behaviour and, no, we aren’t just listing traits of problematic politicians. To echo Polyak, there are an awful lot of ‘stupid’ things that we, as a society, should work on before AI. However, the truth is that we place a lot of importance on AI and its relationship with our future society.
We’ve all seen the movies about the dangers of technology, right? Impressive as AI may be, there’s something about technology having a mind of its own that sparks concern. As humans, we often struggle to accept that machines have a limited view of the world. In reality, there are only so many problems that AI can solve.
Photo by Choong Deng Xiang on Unsplash
Take the ongoing battle against entrenched stereotypes and systemic racism, for instance. AI hasn’t managed to unravel those old chestnuts – it’s encouraged them.
For example, Rest of World reported that ‘search results [and] facial recognition systems’ perform poorly on ‘Black faces in comparison to white faces’. Yikes. The Association for Progressive Communications wrote that:
“In our haste to embrace these innovative technologies, we risk perpetuating the very prejudices we seek to eliminate. Whether it is encouraging systemic racism, gender inequality or ableism, AI systems have the potential to cause immeasurable harm if left unchecked, the consequences of which could be disastrous.”
Recent research into generative AI image tools has shown that they actively ‘repeat’ racial stereotypes. When researchers asked one such tool to produce images of Black doctors, all it produced were images of white men in African outfits.
Furthermore, when asked to depict doctors assisting children in Africa, the tool adorned the patients with ‘exaggerated and culturally offensive African elements, such as wildlife’. All African patients go to the GP with a giraffe in tow, naturally…
The question is, how do we actively avoid these biases when they’re subconsciously fed to us on the daily?
Photo by Kasia Derenda on Unsplash
Stay in the loop: Adopting human-in-the-loop systems for your business is a simple way to check whether your AI systems are repeating stereotypes. They give companies the gift of choice, which can help eradicate biases. Has the tool you're using spat out a stereotype? Human-in-the-loop systems give you the chance to redirect the outcome.
Train your pen: If you use AI writing tools, you need to be aware of the historical data those tools may be drawing from. Like any other AI platform, AI writing apps often replicate the biases baked into their training data. Making sure that you draw on multiple sources, varied voices and differing perspectives when you write is incredibly important.
Consider when AI is really needed: As the research shows, there's still a long way to go until AI can accurately replicate real-world representations. So, until then, using a human eye in some workplace scenarios may be the necessary route to avoid unwanted biases.
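For readers with a technical team, the first tip can be sketched in a few lines of code. This is a minimal illustration of the human-in-the-loop pattern, not a real product: `generate_output` is a stand-in for whatever AI call you use, and the `reviewer` callable represents the human (or a human-staffed queue) who approves or redirects each result.

```python
# Minimal human-in-the-loop sketch: every AI output must pass a review
# step before it is used, so a biased result can be redirected rather
# than published.

def generate_output(prompt):
    # Placeholder for a real AI call (a text or image model).
    return f"AI output for: {prompt}"

def publish_with_oversight(prompt, reviewer, max_attempts=3):
    """Regenerate until the reviewer approves, up to max_attempts.

    `reviewer` is any callable that inspects the output and returns
    True (approve) or False (redirect).
    """
    for _ in range(max_attempts):
        output = generate_output(prompt)
        if reviewer(output):
            return output
        # The reviewer rejected it: steer the next attempt.
        prompt += " (avoid stereotyped imagery)"
    return None  # no approved output; escalate to a manual process

# Example: a reviewer that rejects outputs containing a flagged term.
flagged = lambda text: "stereotype" not in text.lower()
result = publish_with_oversight("a Black doctor treating patients", flagged)
```

The point of the pattern is the gate itself: the AI never publishes directly, and a human decision sits between generation and use.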
For more information on challenging stereotypes perpetuated by AI, visit our Insights page or listen to Techish.
Sign up for our next webinar: How to be more inclusive at work, Tuesday 16 April at 1pm BST
If you and your team are looking for ways to be more inclusive in your daily interactions with other colleagues, clients, and job candidates, this training is for you! Grab a spot by registering here.
Share our inclusion insights with your team
We’ve been in the business of inclusion education for over seven years! Whatever cultural challenge you’re facing in your team, we have free advice to help. Check out our insights page or book a free consultation call with Abadesi.
Brand new #Techish, listen to the latest episode
Listen to one of the UK’s top podcasts — co-hosts Michael and Abadesi dive into the essential stories across tech and pop culture🍿💰🎧.
Are you an iPhone user? Leave us a review on Apple Podcasts for a chance to get a shoutout on the show!