Anchoring Bias (AKA Follow the Leader) in Community Tasting Notes

As more wineries focus their marketing attention on new media such as blogs, social networking sites, and online bulletin boards, I've begun to see suggestions that the age of the all-powerful wine critic is ending.  I have my doubts about that (of course, I also have doubts that the wine critic was ever all-powerful in the first place), but if the critic is going to be made obsolete, it won't be by bloggers, bulletin boards, or people posting tasting notes on Twitter.  It will be by community tasting notes sites like CellarTracker.  With nearly 1,000,000 free wine reviews written by its users, CellarTracker is already a must-read for wine lovers and for wineries.  I check the listing of Tablas Creek reviews daily to see what people are saying and to try to catch any trends that emerge.  It's also a great tool for investigating what people are drinking on particular days (as I wrote about last Thanksgiving).

I've also been impressed with the degree to which CellarTracker has started to move beyond self-proclaimed wine geeks and into the mainstream.  It boasts over 80,000 active users, who have used its cellar management software to log over 13,000,000 bottles into the site.  I recently spent two days in Orange County, during which I hosted a tasting at the Wine Lab and a wine dinner at Sage on the Coast (read a nice writeup on the blog Mad Mary's Musings on Wine & Food).  At both events, attendees talked to me enthusiastically about their experiences on CellarTracker.  If you want independent confirmation, CellarTracker was recently judged the most valuable wine-specific social network in a comprehensive report published by the wine marketing company Vintank.

One of the selling points of community tasting notes sites is their integrity.  With so many users, the chance to deliberately influence scores is nearly nonexistent, and by summarizing reviews across many different palates, any individual taster's quirks are averaged out.  At least, that's the theory.  Until recently, there hasn't really been any community tasting notes site comparable to CellarTracker except, well, CellarTracker, so there was no independent benchmark against which to test the idea that structural biases might creep in.

Recently, I've had growing suspicions that a different sort of bias creeps into community tasting notes sites.  I've termed this "follow the leader" bias, although my wife (who is a social psychologist) tells me that the correct term in psychology is "anchoring".  Anchoring describes the tendency to base subsequent judgments, often without realizing it, on an initially suggested reference point.

With community tasting notes sites, the strength of this anchoring depends on how likely users are to check the existing notes on a wine they're about to open.  Once they do, particularly if the reviews note some flaw, the natural tendency is to expect to see (or at least look for the possibility of) the flaw that is mentioned.  I've noticed in the past that we have one reviewer on CellarTracker who regularly complains about the perception of alcohol on our Roussanne-based white wines.  It doesn't seem to matter that the wines are often quite low in alcohol by California standards (between 13.0% and 14.5%); the taster appears to think that Roussanne, which does have a petrol character, just tastes high in alcohol.  And once such a note is posted, the next several reviews often mention an alcoholic character that was absent from the previous reviews.  Usually, someone dissents after a time, and the reviews gradually stop mentioning this character.
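To make the dynamic concrete, here's a rough back-of-the-envelope sketch of how a single early negative note could drag down a wine's average if later reviewers are even partly anchored on what they've already read.  The numbers in it (a "true" quality of 90 points, a modest anchoring weight, and so on) are assumptions I picked for illustration, not anything measured from CellarTracker or VinCellar:

    # A toy model, not real data: each reviewer forms an independent impression
    # of the wine, then blends it with the average of the notes already posted.
    # All parameters (true quality, noise, anchoring weight) are made up.
    import random

    def average_score(first_note, n_later=12, true_quality=90.0,
                      noise=2.0, anchor_weight=0.5, seed=42):
        """Average score of a thread that opens with `first_note`."""
        random.seed(seed)
        scores = [first_note]
        for _ in range(n_later):
            own_impression = random.gauss(true_quality, noise)
            running_average = sum(scores) / len(scores)
            scores.append(anchor_weight * running_average
                          + (1 - anchor_weight) * own_impression)
        return sum(scores) / len(scores)

    # Same wine, same later tasters; only the first posted note differs.
    print("first note of 90: %.1f" % average_score(90))
    print("first note of 80: %.1f" % average_score(80))

In a toy model like this, the thread that opens with the lower note ends up with a noticeably lower average even though every later taster was sampling the same underlying wine, which is roughly the pattern I think I'm seeing.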

Still, when there is only one major community tasting notes site, it's always possible that trends in reviews can be explained by the wine going through a particular stage; in fact, that's one of the reasons I check CellarTracker so regularly: to see whether any of our wines appear to be entering a closed stage, or coming out of one.  But with the recent advent of a second viable community tasting notes site, it's possible to cross-reference the reviews of a given wine.  That second site is VinCellar, a new free cellar management system created by the wine retailer and auction house Vinfolio.  With about 23,000 tasting notes entered, its database is a small fraction of the size of CellarTracker's, but it's still large enough to make some interesting comparisons.

The wine that highlighted, for me, the possibility of an anchoring bias in community tasting notes sites was our 2007 Vermentino.  Vermentino typically makes crisp, citrusy wines, relatively light in body, with pronounced mineral signatures and low alcohol.  The 2007 vintage, though, was incredibly lush, and produced wines (both reds and whites) with very rich mouthfeel and unusually intense flavors.  For most grape varieties this was a good thing, and I am convinced that our Esprit de Beaucastels from 2007 are the best red and white wines we've yet made.  For Vermentino, however, it produced an unusual wine to which people tended to have strong reactions.  Some loved it for its spice and saffron aromas and its rich mouthfeel; others found the aromas (which were spiced somewhat differently from most Vermentinos) off-putting.  We got a handful of complaints from club members (and replaced the wine for anyone who wasn't happy with it), but also some kudos from members who thought it was the best Vermentino we'd ever done.  I just opened a bottle as I was finishing this post and thought it was delicious.  But look at the reviews of the wine on CellarTracker and VinCellar, and it appears we made two totally different wines.

CellarTracker reviews of 2007 Vermentino (average score: 83 points in 13 notes)
VinCellar reviews of 2007 Vermentino (average score: 90 points in 7 notes)

This is why I cringe when I see a negative review posted on CellarTracker.  It takes a reviewer with strong convictions to post a note that knowingly contradicts a string of very different reviews, and the suggestion of a flaw perforce encourages other tasters to taste the wine looking for that flaw.  If I am right, it should be possible to short-circuit this cycle by posting a review that contradicts a negative (or positive) one, which would free future reviewers to use their own judgment.  Of course, testing my hypothesis on the sites themselves would mean subverting their integrity, which I am not suggesting.  But it's worth worrying, as the sites begin to have an impact on the mainstream wine market, that wineries, importers, or other interested parties might try to "seed" the reviews with positive ones to encourage future reviewers to follow their lead.  It's probably also worth considering how someone might rebut or correct a negative review.  Other community review sites allow this; yelp.com, for example, lets a business's owner comment on a review.
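One low-stakes way to explore the idea without touching the sites at all is another toy simulation.  The sketch below assumes, purely for illustration, that each reviewer is much more likely to mention a flaw if the previous note did; the probabilities are invented, but it shows how a single confident dissenting note could cut a chain of flaw-mentions short:

    # Toy model with invented probabilities: a reviewer mentions a "flaw"
    # with high probability if the previous note did, and rarely otherwise.
    import random

    def flaw_mentions(first_mentions_flaw, dissent_at=None, n_reviews=12,
                      p_follow=0.7, p_spontaneous=0.05, seed=3):
        """Return True/False for each note: does it mention the flaw?"""
        random.seed(seed)
        mentions = [first_mentions_flaw]
        for i in range(1, n_reviews):
            if i == dissent_at:
                mentions.append(False)  # a confident reviewer posts a dissenting note
                continue
            p = p_follow if mentions[-1] else p_spontaneous
            mentions.append(random.random() < p)
        return mentions

    print("no dissent:   ", flaw_mentions(True))
    print("early dissent:", flaw_mentions(True, dissent_at=1))

Again, this proves nothing about the real sites; in a model like this, though, the dissenting note tends to end the string of flaw-mentions early, which is exactly the shortcut I'm describing.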

Of course, there have been similar biases with the major reviewers for decades.  A review from Robert Parker or the Wine Spectator (the two reviewers with the broadest reach in the market) can often set a baseline for a wine.  But I think it's in many ways easier to remember that the review of a professional, however eminent, is just one person's take on the wine, and that you, as a consumer, are welcome to disagree, than it is to distance yourself from the cumulative reviews of other consumers.

In any case, it seems clear to me that it's worth looking in a bit more detail at how sites like CellarTracker can start to influence the perceptions of their users, and worth starting to reflect on what biases might be present in this new -- and increasingly powerful -- forum.
