The unstable foundations of exclusive AI

A lot of Great Women work at Great State. Each month we get together to talk about what it’s like to be a woman in the workplace, at Great State, and in tech.

We create a safe place to share our experiences and learn from each other. Sometimes we invite an external speaker or discuss a relevant book, but earlier this year we attended the We Are Tech Women ‘One Tech World’ conference: a global, online event aimed at supporting female technologists through a series of talks, panel discussions and networking opportunities, covering a wide range of topics.

Sure, it was held in late spring, but one of the talks is still playing on my mind. I tuned in to hear Dr Ian Levy, Technical Director at the National Cyber Security Centre, give his talk ‘Cyber 2022: What's coming down the track?’. During Levy’s overview of the trends emerging in cyber security, he highlighted that increasing technological development and reliance bring with them an increasing risk of technology ‘going wrong’. The more dependent we are on technology, the more we need to be able to trust it, not only to serve its function but also to support our values.

What happens if bias is built in?

AI in particular is rapidly bringing technology to the centre of organisations and lives all over the world. Dr Levy raised this point: “AI is only as good as its training data, and the training data we have for most AI use cases is massively biased”. If his observation is correct, we are witnessing the unconscious biases of human developers actively being embedded into new technologies. How do we ensure the tech will uphold the right values? AI will change the world for the better, and in some spaces already is, but it still needs to be approached with caution. If the data from which AI is learning is biased, then we’ll be using tech that works against users who are not typically white or typically male, owing to the over-representation of those demographics within the data.

The under-representation of women or people of colour in data can lead to dangerous biases within AI and machine-learnt decision making. Research into the pedestrian-identification systems used by autonomous cars highlights the struggle these ‘machine vision’ systems have with recognising pedestrians with different skin tones: pedestrians with lighter skin tones were consistently better identified than those with darker skin tones. Findings like these highlight the importance of getting our tech right because, as Levy stated in his talk, “if we don't get this right, you're going to have autonomous vehicles which are more likely to run down a Black pedestrian than a white pedestrian”. In another example, tech giant Amazon fell foul of gender bias with an AI-led recruitment tool trained on CVs submitted by applicants over a 10-year period. Because that training data came mostly from men, the tool ‘learnt’ that male applicants were more desirable than female ones, penalising CVs that included the word ‘women’ and actively reducing the number of women advancing through the recruitment process.
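To make the Amazon example concrete, here is a deliberately tiny, hypothetical sketch of how this kind of bias arises. The miniature ‘training set’ below is invented for illustration (it is not Amazon’s data), and the scoring function is a crude word-association count rather than a real model, but it shows the mechanism: when one group dominates the historical outcomes, words associated with the other group end up correlated with rejection.

```python
from collections import Counter

# Hypothetical, invented training examples: CV snippets labelled with
# historical outcomes. Because most historically successful applicants
# were men, words linked to women's CVs cluster in the 'rejected' class.
training = [
    ("captain of chess club", "hired"),
    ("led software team", "hired"),
    ("built trading system", "hired"),
    ("captain of women's chess club", "rejected"),
    ("women's coding society president", "rejected"),
]

def word_score(word: str) -> int:
    """Crude association score: occurrences of `word` in 'hired' CVs
    minus occurrences in 'rejected' CVs. Negative means the word is
    associated with rejection in the historical data."""
    hired = sum(text.split().count(word) for text, label in training if label == "hired")
    rejected = sum(text.split().count(word) for text, label in training if label == "rejected")
    return hired - rejected

print(word_score("chess"))    # appears in both classes: score 0
print(word_score("women's"))  # appears only in rejected CVs: score -2
```

Nothing in this toy scorer ‘knows’ about gender; the penalty against “women's” emerges entirely from the skew in the historical labels, which is exactly why balanced training data matters.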

Building foundations on inclusivity

AI is undoubtedly exciting, and when created by balanced teams using balanced sources, grounded in real societal factors rather than just the maths, it can do amazing things. But for this to happen our data must reflect the truly diverse nature of our world. Ensuring tech companies and agencies develop diverse and inclusive teams is one positive step with positive impact. As Dr Ian Levy said: “if everyone looks and thinks like me, we are going to fail.”

Whilst we’re talking about diversity within tech, I should let you know that we’re hiring. We design digital products and services for a diverse world, so we want a diverse team that enables us to produce the best work for our clients. We don’t just celebrate differences; we want a team that is built upon them. A team that represents all sexual orientations, ethnicities, faiths, and gender expressions. Every person is welcome at Great State, and we offer the appropriate support throughout anyone’s journey with us, so come on in.
