Every once in a while the debate about the 100-point scale swirls around the wine blogosphere (and now the wine Twitterati). The latest flash-point is wine critic and recently-turned blogger James Suckling, who posted a video on his site yesterday detailing how he evaluates wine. Unlike a lot of wine bloggers, but consistent with the norms of wine critics like Robert Parker, Suckling tastes blind in large batches. This approach is certainly an efficient way to taste through dozens or even hundreds of wines in a morning, but we don’t get any sense of how the wine will hold up over a few hours or after decanting. This is why I literally live with the wines I’m reviewing over two and sometimes three days. But that system would slow a writer down significantly if you are tasting 50 wines at a go, so we accept the “moment in time” approach used by most wine critics.
And that brings me to the dirty little secret of the 100-point scale: the myth of precision. Mr. Suckling walks through his approach to applying scores in his video, and it is very similar to my process. But each area we break down still has far too much wiggle room and can easily make an 89 wine a 91, and vice versa. The same thing happens when I grade papers teaching marketing. This subjectivity is at the heart of why I proposed a new rating scale for wine bloggers a while back.
The only way to smooth out the ratings variability somewhat is to taste a wine over the course of time under the same tasting conditions and normalize the scores, or to taste a boatload of wine so the reader can get a sense of the reviewer’s palate. The reason I use the 100-point scale is that readers ask for it and, like it or not, it will be with us for a long time to come. But I don’t think it’s a precise instrument.
What do you think?
via James Suckling
I think some kind of scale is useful, and although there are strong advocates of the 20-point scale, such as Jancis Robinson, whether it’s 20 or 100 points doesn’t really bother me. Consumers seem to like numbers, and I find them useful when comparing wines.
As mentioned already, a 100-point scale does suggest a level of accuracy that is plainly just not possible. When I score wines at 87 or 88, the difference is probably down to my mood and what I have tasted previously – and even the weather and time of year, I suppose.
However, whatever the scale is, it needs a benchmark.
I judge for two international wine competitions. In one, a wine judged a fine example of excellent quality can score anywhere between 80 and 89.9 to be awarded a silver medal. In the other competition, the wine has to score between 90 and 94.9 to get a silver medal. It’s the silver medals that are equal in consumers’ minds, not the scores.
I’m all for some kind of scale. All who use it need to ensure some kind of consistency across it, something I feel may be impossible to achieve.
Colin: I like your idea of a benchmark but how do you calibrate your palate before a large tasting and remain as consistent as possible with your scores?
Ugh… I watched this thinking it would be some interesting insight into Suckling’s process. Immediately I was confused by a 100-point scale with components he grades in half-point increments. That’s a 200-point scale. And 13.5 for color, what? I was with you way back when for the standardized system, but I’m getting more skeptical as time goes on. How do I accurately compare my experience with a 15-year-old Spanish Albariño to a new-release Bordeaux? My judgment criteria aren’t even the same. I can’t distill that into a 100-point scale or a 5-star system any more accurately. Did I judge the wine for its character and secondary notes, its potential for greatness 10 years from now, its capacity to cause me to ponder my existence?
What Suckling and others do has a place, specifically weeding poorly made wines out of the market, and maybe holding up examples of the best wines of their generation, the 99-100-point wines. But as for the vast sea of 89-92-point wines out there, it feels pretty meaningless. I had to laugh the other day when Vaynerchuk reviewed a Beaujolais and said, “this is the best Beaujolais I have ever had, 92 points.” That’s a lot of headroom.
One of the other issues is that all wines are not created equal, so you really can’t compare Gary’s 92-point Beaujolais with my 92-point Pinot posted the other day. I bet in a blind shootout that Pinot would kick the Beaujolais’ butt. Have you ever had a 100-point rosé? I don’t think so, but that doesn’t mean the wine is not the best example of its type.
I run across the same thing when I grade papers in my teaching. If a student ticks all the boxes on the paper, they get a B+. The rest of the way is style points, for the most part completely subjective. The same thing applies to wine reviews, unfortunately, and I’m not sure there is a solution.
Thanks for your comment.