
Google Removes Gorillas From Image Labelling Tech Instead of Fixing Racist Algorithm

In 2015, a software engineer named Jacky Alciné discovered that when he imported photos of himself and a black friend into Google Photos, the service's recognition algorithm classified them as "gorillas". Google speedily apologized, saying it was "appalled" at the mistake, and promised to fix the issue as soon as possible. Now, three years down the line, Google still has not fixed the recognition problem. What it has done instead is prevent its image recognition algorithm from identifying gorillas at all!

If you are wondering how this little alteration came to light, it turns out that Wired ran a series of tests on Google Photos' algorithm, uploading tens of thousands of pictures covering a wide range of primates. Baboons, gibbons, and marmosets were all correctly identified, but surprise surprise, gorillas and chimps were a mystery to the algorithm. Additionally, Wired found that searches for "black man" or "black woman" returned only pictures of people in black and white, sorted by gender but not by race.

The jig was up, and a spokesperson for Google confirmed to Wired that these search terms had indeed been blocked since the mishap in 2015. The accident, and the subsequent failure to fix it, which is evidently proving harder than one would think, serves as a good reminder: there is still a great deal that we don't understand about AI, not even at the supposed forerunners in the technology, like Google. It is not clear whether the 2015 problem went unfixed because the solution eluded Google or because the company simply did not want to dedicate the time, effort, and resources to the project. But it is clear that these complex algorithms are far from perfect and still have a long way to go.

[Via The Verge]