
Is Google racist? Fresh fire over search results

If you searched for "three white teenagers" on Google Images earlier this month, the results spat back shiny, happy people in droves – an R.E.M. song in JPG format. The images, mostly stock photos, showed young Caucasian men and women laughing, holding sports equipment or caught whimsically mid-selfie.

If you searched for "three black teenagers", the algorithm offered an array of mugshots.

The disparity when searching 'three black teenagers' and 'three white teenagers' in Google Images. Photo: Google Images

Kabir Alli, a graduating senior at Clover Hill High School in Midlothian, Virginia, in the US, recorded the disparity – and, as any enterprising 18-year-old would, posted the video to Twitter. The result was swift and massive virality: the video was shared more than 65,000 times. (Similar observations had been made before, by YouTube videographers and others, but had not lodged quite so deeply in the internet's consciousness.)

Before he made the video, friends had told Alli what the Google search would pull up. But the teenager said watching it happen in person was still a surprise. "I didn't think it would actually be true," Alli told USA Today. "When I saw the results I was nothing short of shocked."

Google responded that its search algorithm mirrors the availability and frequency of online content. "This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what image search results appear for a given query," the company said in a statement to the Huffington Post UK. "These results don't reflect Google's own opinions or beliefs – as a company, we strongly value a diversity of perspectives, ideas and cultures."

Algorithms like the ones that power Google's search engine tend to be esoteric, trade secrets, or both, which gives the software an air of mystery. And because algorithms are, after all, lines of code, it is understandable that we might perceive them as unerring, impartial decision-makers. That is a tempting view – but it is also, experts say, incorrect.


As David Oppenheimer, a law professor at the University of California, Berkeley, told the New York Times about algorithms in 2015: "Even if they are not designed with the intent of discriminating against those groups, if they reproduce social preferences even in a completely rational way, they also reproduce those forms of discrimination."

Google Images searches for "beautiful dreadlocks" yield mostly dreadlocked white people, as Buzzfeed UK pointed out in April, with some critics citing this Eurocentric bent as an example of racism. Alli, for his part, told USA Today that he does not believe Google is racist, saying that "black males making poor choices also plays a major role". But others disagree that Google should be so readily absolved. Safiya Umoja Noble, an African American studies professor at the University of California, Los Angeles, argued to USA Today that Google has a responsibility to eliminate racial bias from its algorithm.

"It consistently issues a statement that it's not responsible for the output of its algorithm," Noble said. "And yet we have to ask ourselves: If Google is not responsible for its algorithm, who is?"

Even if human programmers do not build widespread discrimination into their software, intentionally or otherwise, they can introduce bias through omission. Google also came under fire in July 2015 when its photo app autonomously labelled a pair of black friends as gorillas. As The Washington Post noted, the engineer in charge believed the underlying program was fine but that the data used to train the algorithm was "faulty" – suggesting Google, perhaps, neglected to run a sufficient number of minority portraits through the AI.

Apparent algorithmic bias can be based on political views or gender as well. Facebook was accused of burying conservative news in the trending topics section on users' homepages, though many of the allegations focused on human curators; Facebook later met with conservative leaders and ended up relying on an algorithm that monitors media websites. And a blurb for a job with a "$US200K+" salary, which appeared in Google's ad program, was almost six times more likely to be shown to a man than to a woman, a Carnegie Mellon University study found; it was unclear, the researchers concluded, whether advertisers had wanted to target men or whether Google's software had determined that men were more likely to click on the ad.

That is not to say that all algorithmic bias must be bad. Roboticists are creating artificially intelligent robots that use stereotypes to make quick, crucial decisions. Last year, Georgia Tech researcher Alan Wagner built an experimental machine to observe how a robot might distinguish between civilians and emergency personnel in a disaster.

Based on features like uniforms, for instance, Wagner's program was able to determine whether someone was a police officer. But it also concluded that all firefighters must have beards – simply because the simulated firefighters all happened to have facial hair during the experiment. Wagner now argues that young artificial intelligences need "perceptual diversity", as he told the website Inverse – for instance, showing a broad swath of humanity to a program being trained to recognise faces.

The Google Images results for three white or black teenagers now reflect that the algorithm has learned the debate over bias is itself popular: the searches surface photo montages linked to news stories like the one you are currently reading.

But racial disparities in Google's treatment of teens are still easy to find: a Google video search for "three white teenagers" brings up YouTube news clips about the image-result snafu. The same search, with "Asian" substituted for "white", yields dozens of links to pornography.

Washington Post

