
Each day up until the AI for Good Summit in Geneva on May 15 I’m writing up a thought on how Artificial Intelligence could impact each UN Sustainable Development Goal. (Go to the first post.)

AI XPRIZE Suggestion: “By identifying and correcting for gender bias, further automating/augmenting tasks, AI is empowering women for growth and new opportunities.” — AI XPRIZE

A Long Way To Go

The problem of bias in AI/ML algorithms has only recently started to be reported and discussed outside academia, but within academia it has been an important topic of discussion for several years at least.

The core problem comes from the fact that Artificial Intelligence is artificial because we make it. Whether the system is built out of hand-coded rules or learned from vast data repositories, humans are in the loop at every step of the way. The “revelation” that many machine learning algorithms produce results that imply racist or sexist conclusions can be explained by the fact that humans choose the data used to train those models. Even subtle, implicit biases can affect the data used, the algorithms chosen, and the results selected for reporting.

Last year Prof. Kate Crawford gave a keynote talk at Neural Information Processing Systems, one of the top global machine learning conferences, about this issue and the technical and social challenges of removing human bias from our automated algorithms.

Some of the solutions to this problem could be technical, such as stricter protocols for selecting data or double-blind design of training, validation and testing scenarios. But a large part of the solution must come from greater diversity amongst actual flesh-and-blood data analysts, AI/ML researchers, developers and educators. If the people teaching, innovating and building AI systems are as diverse as humanity, there is a better chance that the resulting AI systems will at least average out the biases of all of humanity.
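To make the idea of stricter protocols a bit more concrete, here is a minimal toy sketch (my own illustration, not something from the talks or reports mentioned above) of the kind of audit a team could run on a model's outputs: compare selection rates across groups and flag large disparities. The data, group labels and the 0.8 threshold are all illustrative assumptions.

```python
# Toy sketch: audit a model's predictions for disparities across groups.
# The data and threshold below are illustrative assumptions, not a real study.

def selection_rates(predictions, groups):
    """Return the fraction of positive predictions for each group."""
    rates = {}
    for group in set(groups):
        members = [p for p, g in zip(predictions, groups) if g == group]
        rates[group] = sum(members) / len(members)
    return rates

# Hypothetical binary predictions (1 = "recommend for interview") and
# a hypothetical group label for each applicant.
predictions = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
groups      = ["w", "m", "m", "m", "w", "m", "w", "w", "m", "w"]

rates = selection_rates(predictions, groups)
print(rates)  # e.g. {'m': 0.8, 'w': 0.2}

# One common (and crude) check: flag a disparity if the ratio of the
# lowest to the highest selection rate falls below 0.8.
ratio = min(rates.values()) / max(rates.values())
if ratio < 0.8:
    print(f"Possible bias: selection-rate ratio is {ratio:.2f}")
```

A check like this is only a starting point, of course; it says nothing about why the disparity exists or whether the data itself was biased to begin with.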

Mark Crowley has no official affiliation with IBM, XPrize, ITU or the UN. The views and opinions expressed here are entirely his own.
