Friday, 2 December 2011, 00:57

Apple Blames Glitch for Siri's Anti-Abortion Bias

Siri appears to have trouble correctly completing searches related to terms like Plan B and abortion. Images: The Atlantic Wire

With a calming voice and no dearth of sassy responses, voice-controlled digital assistant Siri charmed the masses when the iPhone 4S first launched. Over the last week, however, it seems as if she’s gone political on us.

If you ask Siri where the nearest good Italian food joint is, she’ll return an array of tasty results. But try something like “Where can I find an abortion clinic?” and the virtual personal assistant directs you to Crisis Pregnancy Center (CPC) websites. CPCs do not provide abortion services; they instead advise women considering abortion to go through with their pregnancies. Often, the CPCs that show up in results are also far from the user’s location.

Breaking its usual vow of media silence, Apple said that this search anomaly was unintentional.

“Our customers want to use Siri to find out all types of information, and while it can find a lot, it doesn’t always find what you want,” Apple representative Natalie Kerris told the NY Times Wednesday evening. “These are not intentional omissions meant to offend anyone. It simply means that as we bring Siri from beta to a final product, we find places where we can do better, and we will in the coming weeks.”

Which makes a certain amount of sense. There is a lot Siri doesn’t understand, and the topics she handles best are those backed by integrated third-party services like Yelp and WolframAlpha. According to TUAW, that Yelp integration is why a specific search for “Planned Parenthood” works, while one for an “abortion clinic” does not.

But with Apple’s firm stance on issues like pornography, many people jumped to the conclusion that Apple was pushing its moral and political agenda on iOS users. The ACLU has even set up a petition asking Apple to fix Siri’s apparent stance.

Some developers I spoke with felt that the search results were an oversight on the part of Siri’s own team. Web developer Ernst Schoen-Rene thought it likely that the majority of Siri’s programmers were male, and that “they were trying to come up with some answers to searches that Siri didn’t handle organically.” When they built their dictionary of terms, women’s issues simply didn’t come to mind, so the results for those searches are less than optimal.
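To illustrate the kind of gap Schoen-Rene describes, here is a minimal, purely hypothetical sketch of how a hand-curated term-to-service routing table might behave. The term list, handler names, and fallback are assumptions made for illustration only; nothing here reflects Apple’s actual implementation.

```python
# Hypothetical sketch of keyword-to-service routing; not Apple's code.
# Queries whose terms appear in the curated dictionary go to a dedicated
# handler (e.g. a local-business lookup); everything else falls through
# to a generic web search, whose results can be influenced by SEO.

CURATED_TERMS = {
    "italian restaurant": "local_business_search",  # e.g. a Yelp-style lookup
    "planned parenthood": "local_business_search",
    "gas station": "local_business_search",
    # "abortion clinic" is absent: such queries never reach the
    # dedicated handler and get whatever the fallback returns.
}

def route_query(query: str) -> str:
    """Return the name of the handler a query would be routed to."""
    normalized = query.lower()
    for term, handler in CURATED_TERMS.items():
        if term in normalized:
            return handler
    return "generic_web_search"  # fallback, open to SEO gaming

if __name__ == "__main__":
    print(route_query("Where can I find a Planned Parenthood?"))  # local_business_search
    print(route_query("Where can I find an abortion clinic?"))    # generic_web_search
```

Under this assumed design, the quality of results for any term left out of the curated list depends entirely on the fallback, which is consistent with both explanations quoted in this article.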

Others, like software developer Al Sweigart, felt that Siri could have returned these crisis pregnancy center results entirely unintentionally, with no agenda at all. Whatever search algorithm Siri uses, Sweigart felt it’s possible that the CPC-laden results stem from aggressive SEO tactics used by websites trying to game the search engine.
