So, as more than a few people handily explained (and more than a few douchebags mansplained nastily, but we can ignore them, right?), Siri’s database for local business searches is Yelp. It uses the Yelp API, which is sort of clear when you search for things like donuts:
It says “reviews from Yelp.” It doesn’t actually say the search results come from Yelp, but that’s perhaps a nitpick. What’s far less clear is that Siri would also use Yelp when you’re not searching for restaurants or recreation, the matters of fun and food that are most in its wheelhouse.
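(For the technically curious: here’s roughly what a lookup against Yelp’s public Search API looks like. This is a minimal sketch of my own, not Apple’s actual Siri integration; the credentials are placeholders, and I’m assuming Yelp’s documented v2 search endpoint, which is OAuth-signed and takes a search term and a location.)

```python
# Minimal sketch of a Yelp v2 Search API call -- not Apple's actual
# integration, just the kind of query any registered developer can make.
import requests
from requests_oauthlib import OAuth1  # Yelp v2 signs requests with OAuth 1.0a

# Placeholder credentials; real keys come from Yelp's developer program.
auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET", "TOKEN", "TOKEN_SECRET")

resp = requests.get(
    "http://api.yelp.com/v2/search",
    params={"term": "donuts", "location": "Pittsburgh, PA", "limit": 5},
    auth=auth,
)
for biz in resp.json().get("businesses", []):
    # Each result carries a name, a rating, and self-reported categories.
    print(biz["name"], biz.get("rating"), biz.get("categories"))
```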
And it’s clear that Yelp is definitely not a leading contender as an information source for other issues, especially health matters, even though some people are starting to leave reviews for providers there. It’s also a largely user-generated information source, so if people aren’t listing their reproductive health clinic there, it won’t be there. (And most clinics have more than enough to do without making sure they’re listed on every listing website that pops up, even the ones that “catch on” like Yelp.) And I don’t think most people would be inclined to go write a review after their abortion, even if they’re a pro-choice activist. (And especially not on a website that enables other users to comment or vote on reviews. Imagine the abuse.)
What is also clear is that Yelp, accepting user-generated content as it does, can be gamed. Yes, they curate user submissions, but clearly not carefully. Anti-choice “pregnancy centers” have never had a problem masquerading as health providers or slotting themselves into abortion-related categories in listings (a practice that prompted controversy as long ago as the 80s, with listings in printed Yellow Pages directories), and they have been found listed on Yelp in a way that makes them appear to be, in fact, abortion providers. And that information is being pushed to Siri, as we’ve seen in screenshots of searches, especially those made by users in the Washington DC area.
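To see how easy that gaming is, consider a toy model of a user-curated listings database. (This is my own invented illustration, not Yelp’s or Apple’s actual code.) Nothing in such a pipeline verifies a self-reported category, so a category-filtered search serves up whatever put itself there:

```python
# Toy model of a user-curated listings database. The data is invented;
# the point is only that self-reported categories go unverified.
listings = [
    # A "pregnancy center" that has filed itself under an
    # abortion-related category -- nothing checks the claim.
    {"name": "Hope Pregnancy Center",
     "categories": ["Abortion Services"],
     "provides_abortions": False},
]

def category_search(category):
    # Returns whatever listed itself under `category`, accurate or not.
    return [b for b in listings if category in b["categories"]]

# A non-provider comes back as an "abortion services" result.
print(category_search("Abortion Services"))
```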
And here’s why this issue blew up as it did, and why it’s confusing to so many people, most of whom don’t have an iPhone 4S and have perhaps not seen how Apple is promoting Siri. While Apple isn’t really hiding that Yelp is a source for Siri’s local business searches, they’re not exactly promoting it, either. And they’re certainly not telling us that it’s the only source of local business information.
And I think that’s quite intentional, as Siri is being marketed as an innovation, and Yelp isn’t new or innovative.
Yelp’s iPhone app has been around for three years. Anyone who wants to search Yelp already knows how to do that, and doesn’t need a brand new iPhone 4S to do it. In the television ads that promote Siri as a major selling point for the new phone, Yelp isn’t mentioned as the source of any of the information that we’re shown Siri providing. And when we see a (simulated) screenshot of search results, I must note, that little “reviews from Yelp” logo is nowhere to be seen.
And people know that, as a smartphone, an iPhone is a web-connected device. It’s reasonable to believe (a belief Apple’s marketing of Siri does nothing to dispel) that a web-connected device promised to deliver this broad range of information is sourcing it from the web, not from a single, limited source. When Siri offers, when it actually does offer, to search the web for users because it can’t provide an answer itself, that implication only deepens. It’s reasonable to assume that, unless otherwise stated, a device with access to the whole web will make use of it, not of a single, tiny corner.
And remember this?
This result was explained as “well, it’s not listed on Yelp, so Siri can’t find it.” And that’s true if you search for the basic category of “abortion clinics” in Pittsburgh on Yelp; the abortion providers in this area are not properly categorized on the site. And yet Allegheny Reproductive Health Center is listed on Yelp. And when I search for it on Yelp’s iPhone app:
Not only is Yelp able to autocomplete the name, it has two listings, because one includes the suite number and the other doesn’t:
So either Siri failed during my first test (someone suggested that the “Sorry I couldn’t do it” response happens when the database is unavailable or network traffic is too high), or there’s some reason why Siri can’t find, or won’t disclose, information that very clearly exists in the database it queries. Just in case Siri was having database access problems in my first test, I tried again as I’m typing this, and this was the result:
It still couldn’t find ARHC, which we know is definitely on Yelp, even when once again given the full, proper name and the street it’s on. And the results it offered? Neither location is anywhere “near” Highland Avenue, and they’re a gym and an optometrist: health-related, sure, but not clinics, and certainly not reproductive health care centers.
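Extending the toy model from above makes the failure mode concrete. (Again, invented data and my own sketch, not anyone’s actual code.) A search that filters on self-reported categories misses a miscategorized listing entirely, while a plain name match finds it, which is exactly the pattern my tests showed:

```python
# Toy illustration: category-filtered search misses a listing that a
# plain name search finds. Invented data, for illustration only.
listings = [
    # A real provider, but filed only under a generic category.
    {"name": "Allegheny Reproductive Health Center",
     "categories": ["Health & Medical"]},
    {"name": "Shadyside Gym", "categories": ["Gyms"]},
]

def category_search(category):
    # Trusts self-reported categories -- roughly the query behind a
    # request like "abortion clinics near me".
    return [b for b in listings if category in b["categories"]]

def name_search(term):
    # Plain substring match against the business name.
    return [b for b in listings if term.lower() in b["name"].lower()]

print(category_search("Abortion Services"))   # -> []  ("nothing found")
print(name_search("Allegheny Reproductive"))  # -> finds ARHC
```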
Yesterday, after a Change.org petition was launched and bad press was mounting from NARAL, Ms. Magazine, and Colorlines and spreading into mainstream media, including the New York Times, Apple released a statement calling what users across the country have experienced the result of a “glitch” and reminding the world that Siri is a “beta” product.
If “glitch” is corporate speak for “we picked a limited, user-curated, user-generated database because it had a workable API, and that was a mistake, and we’re now aware that we need to augment or replace it because it’s not providing accurate, complete information to our customers even when that information is right there in the database,” then this is, yes, a “glitch.” But it remains to be seen whether that’s what Apple means at all.
And it’s a rather facile explanation, frankly, for failures that reflect not just a poorly populated database but several other troubling aspects of how Siri has engaged with sensitive and important issues.
Until the “glitch” is addressed, customers are apparently just supposed to live with an assistant who we’re told “understands what you say, knows what you mean, and even talks back” yet mocks rape, stonewalls us or worse, sends us into the clutches of anti-choicers when we need to terminate a pregnancy, and confuses abortion providers with domestic violence shelters, which it claims don’t exist.
It seems that Apple had better get their collective glitch-solving noses to the grindstone PDQ.