So does Siri have a moral agenda?

Interesting blog post by the American Civil Liberties Union.

Siri can help you secure movie tickets, plan your schedule, and order Chinese food, but when it comes to reproductive health care and services, Siri is clueless.

According to numerous news sources, when asked to find an abortion clinic Siri either draws a blank or, worse, refers women to pregnancy crisis centers. As we’ve blogged about in the past, pregnancy crisis centers, which often bill themselves as resources for abortion care, do not provide or refer for abortion and are notorious for providing false and misleading information about it. And if you’d like to avoid getting pregnant, Siri isn’t much use either: asked where one can find birth control, it apparently comes up blank.

The ACLU put Siri to the test in our Washington D.C. office. When a staffer told Siri she needed an abortion, the iPhone assistant referred her to First Choice Women’s Abortion Info and Pregnancy Center and Human Life Pregnancy-Abortion Information Center. Both are pregnancy crisis centers that do not provide abortion services, and the second center is located miles and miles away in Pennsylvania.

It’s not just that Siri is squeamish about sex. The National Post reports that if you ask Siri where you can have sex, or where to get a blow job, “she” can refer you to a local escort service.

Although it isn’t clear that Apple is intentionally trying to promote an anti-choice agenda, it is distressing that Siri can point you to Viagra, but not the Pill, or help you find an escort, but not an abortion clinic.

Apple’s response, according to CNET:

Apple … is still working out the kinks in the beta service and the problem should be fixed soon.

“Our customers want to use Siri to find out all types of information and while it can find a lot, it doesn’t always find what you want,” Apple spokesman Tom Neumayr said. “These are not intentional omissions meant to offend anyone, it simply means that as we bring Siri from beta to a final product, we find places where we can do better and we will in the coming weeks.”

Although I’m as partial to conspiracy theories as the next mug, somehow I don’t think Siri’s apparent moral censoriousness is a feature rather than a bug. But it does remind one of the dangers of subcontracting one’s moral judgements to software — as parents, schools and libraries do when they use filtering systems created by software companies whose ideological or moral stances are obscure, to say the least.