Hudson Valley Wine & Grape Association's 5th Annual Hudson Valley Wine Competition

April 28, 2009

Comments


"And boy can judges differ in how they score wines. Amazing just how much."

You bet, Lenn.

With such wide swings in scores, averaging to award medals is problematic, which is why consensus judging seems to work best. Still, the widely diverging scores often make me wonder about the value of wine competitions to consumers.

I just finished a wine competition where a few wines had noticeable flaws, like serious mercaptan, oxidation, and V.A., yet some judges gave them gold medal scores.

Thomas,

Agreed, and this ended up being consensus judging -- at least when it came to selecting the best white and red wines.

I noticed a couple of flaws in a couple of wines on Saturday, but we weren't supposed to discuss the wines with our fellow judges, so I don't know if anyone else noted them.

What Thomas said is distressingly true, and it ultimately comes down to the fact that quality (defined, in the case of wine, as pleasure-giving) is not only undefinable but non-transferable. Everyone should read Zen and the Art of Motorcycle Maintenance on this subject! But as Thomas comments, what do you do when scores on a single wine from a panel of experts are all over the place? Who is right? I have certainly seen mercaptan-laden wines get gold medal scores from 'experts'. Is it our job to tell them the 'truth'?

Peter: This was my first judging gig, but I tried to approach it that way. My tasting sheets, which are supposedly going to be shared with the winemakers, had not only AWS scores across the board but also notes at times, including a little "VA?" and "poor balance."

Is that what I was supposed to do? I'm not sure, honestly. I just did what I thought made the most sense.

It was interesting. One of the judges near me had scores very similar to my own. Another, though, was very different from mine.

It's definitely easy to see the inherent flaws in these sorts of competitions.

Lenn,

My notes always contain what I perceive as technical flaws, if I perceive them. I could be wrong about some of them, and one way to prevent faulty analysis would be for competitions to have all wines submitted for lab evaluation first.

That would mean establishing parameters of technical acceptability. Such a move would address a level of subjective analysis that is itself flawed, but I don't envision it happening any time soon!

Lenn,

It was a pleasure having you judge the competition. I will bring the sheets with me Friday. The wineries will see the scores but not the names of the judges who gave them.

Curious to hear more about the wineries that attended. I had no idea this was even going on (despite the fact that I'm on Terrapin's mailing list), which just goes to show how completely out of the loop I am!

