In the old days there was a piece of advice people followed when looking for a place to eat in an unfamiliar area: go to restaurants with crowded parking lots.
The idea was that crowded places must be popular and restaurants don’t become popular accidentally. Loosely, this is an example of “the wisdom of the crowd”, or “the process of taking into account the collective opinion of a group of individuals rather than a single expert to answer a question.”
It was a nice trick with a pretty decent success rate. But things are different these days.
Consider how our opinion of that restaurant might have changed if we had found out that some of the cars in the parking lot were placed there by the owner in an attempt to falsely inflate the establishment’s perceived popularity. That seems ridiculous and impractical, and it was … pre-Yelp. But in the Yelp era we now live in, it’s much easier to manipulate consumer opinions via online reviews.
Case in point: MarketWatch recently reported that 20–25% of Yelp reviews were likely written by paid shills, seeking to dupe people into believing a restaurant had a higher rating than it had organically earned.
So what? A little harmless gamesmanship on behalf of the restaurateur, right? If you’re not cheating, you’re not trying, as the saying goes. Besides, most people will gloss right over most reviews and dig deeper before forming an opinion, right?
Not according to recent research. In a recent study published in Science magazine, the effect rears its manipulative head:
Our society is increasingly relying on the digitized, aggregated opinions of others to make decisions. We therefore designed and analyzed a large-scale randomized experiment on a social news aggregation Web site to investigate whether knowledge of such aggregates distorts decision-making. Prior ratings created significant bias in individual rating behavior …
Positive social influence increased the likelihood of positive ratings by 32% and created accumulating positive herding that increased final ratings by 25% on average.
The takeaway is simple: we’re more likely to like things that people already like, even when we have no idea who they are. Even when there is no objective reason why we should believe them.
Or as Sinan Aral, one of the researchers who conducted the study, put it in an interview with IEEE Spectrum: “If something is voted up, we say, ‘Oh, it must be better than I thought it was,’ I’ll go ahead and up-vote that as well.”
Aggregated consumer opinion data – the modernized incarnation of the wisdom of the crowd – will increasingly shape the context of our online experience and will subsequently affect the real-world decisions we make.
What’s not to like? It depends on who you ask.