Racism in AI and Tech: Self-driving cars are blind to dark-skinned people, may cause accidents

Aug 21, 2023 - 19:30

Driverless cars have been making quite a splash in the news, for all sorts of bizarre reasons. However, a recent study suggests that these simple tech hiccups are just the tip of the iceberg: there may be troubling issues lurking within the technology that powers these autonomous vehicles.

The study was carried out by researchers from King's College London, who took a closer look at eight different AI-powered systems meant to spot pedestrians for driverless cars. These systems were trained on real-world data.

AI is blind to dark-skinned people
Shockingly, the study discovered that these AI programs had a much harder time identifying pedestrians with darker skin compared to those with lighter skin. In fact, the systems failed to recognize individuals with darker skin tones almost eight per cent more often than their lighter-skinned counterparts.

It’s truly a shocking statistic, and it highlights the very real and potentially life-threatening risks associated with biased AI systems.

How the study was carried out
The researchers started by meticulously annotating a total of 8,111 images with labels for gender, age, and skin tone, marking 16,070 gender labels, 20,115 age labels, and 3,513 skin tone labels to create a comprehensive dataset.

From there, it was all about crunching the numbers. Comparing detection performance across those groups, the researchers found a striking 7.52 per cent gap in detection accuracy between individuals with lighter and darker skin tones.

The research pointed out that the risk for people with darker skin tones increased notably in "low-contrast" and "low-brightness" conditions, such as driving at night.
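To make the arithmetic behind that kind of gap concrete, here is a minimal sketch of how a per-group detection rate and the gap between groups could be computed from annotated results. The record format, group labels, and numbers below are illustrative assumptions, not the study's actual data or code.

```python
# Hypothetical sketch: compute detection rates per skin-tone group and the gap.
# The labels "LS" (lighter skin) and "DS" (darker skin) and the sample records
# are assumptions for illustration only.

from collections import defaultdict

# Each record: (skin_tone_group, was_detected) for one annotated pedestrian.
annotations = [
    ("LS", True), ("LS", True), ("LS", False),
    ("DS", True), ("DS", False), ("DS", False),
]

detected = defaultdict(int)
total = defaultdict(int)

for group, was_detected in annotations:
    total[group] += 1
    if was_detected:
        detected[group] += 1

# Detection rate per group: detected pedestrians / all annotated pedestrians.
rates = {group: detected[group] / total[group] for group in total}
gap = rates["LS"] - rates["DS"]

print(f"Detection rates: {rates}")
print(f"Gap (lighter minus darker): {gap:.2%}")
```

The same kind of comparison can be repeated for any annotated attribute, such as age group, which is how a child-versus-adult gap would be measured.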

AI can’t see children as well
But the surprises didn’t stop there. In addition to the racial bias issue, the detectors had another concerning blind spot: children. Astonishingly, the results showed that children were a full twenty per cent less likely to be recognized by these detectors than adults.

It’s worth highlighting that the systems analyzed in the study weren’t directly from driverless car companies, as those details typically fall into the category of “proprietary information.” However, according to Jie Zhang, a co-author of the study and a computer science lecturer at King’s College London, those companies’ models probably aren’t far off from what was studied.

This is particularly concerning considering that driverless vehicles are achieving significant regulatory milestones.

Zhang explained to New Scientist, “They won’t share their confidential information, so we don’t have insights into their specific models. However, we know that these models are usually based on existing open-source models. It’s quite certain that similar issues must be present in their models as well.”

The issue of machine bias is no secret, and as advanced AI technology becomes more deeply woven into our daily lives, the consequences of these biases are becoming more apparent. With actual lives at stake, waiting for regulations to catch up after preventable tragedies is not a path we should be comfortable with.
