by Tim Atkin

In search of wine expertise

Last week I was an invigilator at the Master of Wine exam in London, supervising a tasting paper that I’d helped to set. Before the roomful of candidates sweated, slurped and spat their way through twelve blind samples, the three examiners assessed the wines, too. Were they at the right temperature? Had we eliminated any corked or oxidised bottles? Were the wines showing the typicity we’d chosen them for?

The Master of Wine exam is famously difficult – partly because of the stress of sitting seven papers in four days, but also because of the level of expertise required to pass – yet it's a fair test. This is my first year as an examiner and I was impressed by the rigorous approach taken to the setting, sitting and marking of the paper. As ever, the best candidates will pass. I’ve had a look at a couple of answers already and it’s remarkable how good some people are at identifying the different origins, styles and quality levels of a set of wines.

Masters of Wine aren’t the only good tasters in the world – and even within the Institute there are varying levels of expertise – but everyone who can use the magic letters after their name has sat where those candidates sat in London, Sydney and the Napa Valley last week. It’s a professional tasting qualification and it guarantees a certain level of expertise.

I thought about those MW candidates when I opened my copy of The Observer at the weekend, complete with a tendentious piece about wine experts prompted by the findings of an American producer whose wines I’ve never heard of. The article triggered a flurry of online responses, most of them agreeing with the premise that tasting expertise is questionable at best, or as two of the 346 comments put it, “a crock of shit” peddled by “pretentious nobheads”.

Features bashing wine experts and their “flowery language” seem to appeal for two main reasons: first, they enable some members of the British wine-consuming public to indulge the muddle-headed notion that cheap plonk is “often superior” to more expensive stuff (the deal-driven, lowest-common-denominator mentality that has done so much damage to average wine quality in the UK); and, second, they give vent to a deep-seated insecurity about those drinkers’ own senses, invariably expressed as reverse snobbery. “I know what I like,” etc.

Do the findings themselves have any validity? Robert Hodgson of Fieldbrook Winery in Humboldt County in northern California has a background in statistics, apparently, and first undertook his experiment at the California State Fair wine competition in 2005, giving judges the same wine on several different occasions in a blind flight and assessing their consistency as tasters. He has repeated the experiment regularly since then, most recently earlier this month in Sacramento.

His conclusions, which have “stunned the wine industry”, according to The Observer’s breathless reporting, are that some wine judges are more consistent and better at tasting than others. Big deal. There’s a huge difference between this pretty unremarkable piece of “news” and the paper’s claim that “over the years, Hodgson has shown again and again that even trained, professional palates are terrible at judging wine”. But the claim makes for a provocative headline.

As far as I know, Hodgson’s analysis has so far been restricted to California. We are not told how “expert” the experts who performed badly were. Are we talking a sommelier with a couple of weeks’ work experience in a steakhouse? Or an experienced show judge who assesses wines professionally for a living? The claim that they read like a “who’s who of the American wine industry” sounds questionable. In Sacramento?

Hodgson’s conclusion is that medals from wine competitions across California are effectively distributed “at random”. This raises a few questions. Did all the shows perform equally badly? Did the methodology differ between shows, and did that promote greater consistency? And, last, why enter wines from your own winery if the results are no better than a crap-shoot?

Good wine competitions are much more consistent than bad ones. Just look at the wines that perform well at the International Wine Challenge (an event of which I am a co-chairman, to declare an interest) from year to year. It’s noteworthy that the best producers regularly come out on top.

The best tasters are perfectly capable of judging 100 or more wines in a day and of tasting them with insight and knowledge. Wine tasting is not a science (or “junk science”, as The Observer would have it) because it is a subjective exercise. That’s why the best competitions use teams of tasters to iron out inconsistencies and personal preferences and, in the case of the IWC, taste the same wine as many as seven times at different stages of the competition.

Wine tasting is a skill, however. Some people have greater or lesser natural abilities, but anyone can learn how to do it to a professional level of competence. The more you practise, the better you get. To say that such a skill doesn’t exist is nonsense. Just ask those MW candidates.

Originally published in Off Licence News

