What to look for: Meaning, Consistency and Value
POINTS SYSTEMS
The reviewers we respect the most down under are #RealReviews (Huon Hooke / Bob Campbell) and Messrs Mattinson, Walsh and Bennie at the Winefront. The diagram below is Huon Hooke’s, and it’s pretty much the same system we use at Best Wines Under $20.
The Winefront uses a similar system, but the stars are not aligned in the same way.
| POINTS | STARS | MEDAL | RATING |
|---|---|---|---|
| 97 – 100 | ****** | | Exceptional |
| 94 – 96 | ***** | Gold | Outstanding |
| 91 – 93 | **** | Silver | Excellent |
| 88 – 90 | *** | Bronze | Good |
| 85 – 87 | ** | | Average |
The American, English and European systems tend to go down to 80 points, as the Wine Spectator's 100-point scale shows:
- 95-100 Classic: a great wine
- 90-94 Outstanding: a wine of superior character and style
- 85-89 Very good: a wine with special qualities
- 80-84 Good: a solid, well-made wine
MEANING
That means you have to recalibrate point scores from overseas reviewers, since they're generally lower than ours. At the top end, however, you'll rarely see a 100-point score from a reviewer down under, while Robert Parker has scored some 500 wines at 100 points in recent years. Parker even claims that withholding 100 points from great wines is irresponsible.
Knowing a reviewer's scoring system is one thing; knowing how it's applied is another. Here it helps to know the reviewers' preferences, their likes and dislikes. Parker is a case in point: he loves rich, ripe, alcoholic reds, regardless of whether they come from McLaren Vale or the Médoc. Huon Hooke doesn't mind some grapefruit in his Chardonnays; I do. I don't like huge reds, and I'm not that fond of Shiraz. I try to compensate for this bias in my reviews, or remind readers of it.
CONSISTENCY
I find the RealReview and the Winefront are the most consistent. By that I mean that the reviewers know what they’re doing, that their assessments and point scores are accurate, and that they focus on the wine in front of them. Let me explain the last comment: big name reviewers like Parker and Halliday front large commercial operations that make a lot of money. The money comes from wine companies and retailers paying for the right to republish their reviews and their point scores.
That explains the inflated scores from those reviewers, whose real constituents are no longer wine drinkers but wine companies, which like high scores because they help shift their wines. I'm not sure about Parker, but Halliday's scores are generally 3–5 points higher than the RealReview's and the Winefront's.
We have a similar phenomenon at the other end of the scale: The Key Report. The scores Tony gives to most of the ALDI wines are incomprehensible. I get the same samples from ALDI, and I scored several of these in the low nineties, but the vast majority are simply good value quaffers in the 85 to 89 point range. I have no idea how or why Tony comes up with his inflated scores.
VALUE
We're the only review site with a sharp focus on the quality/price ratio of wines. So here's our value scale at BWU$20:
| STREET PRICE | BWU$20 SCORE |
|---|---|
| $6 – $10 | 87 – 90 |
| $11 – $15 | 90 – 93 |
| $16 – $20 | 94 – 96 |
| $21 – $25 | 96 – 98 |
| $26 – $30 | 98 – 100 |
The obvious bargains are: $10 wines that score 90 points, $15 wines that score 93, $20 wines that score 96 and so on. We could add $7 wines that score 88 points, $12 wines that score 91, $17 wines that score 94, and $22 wines that score 96.
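For readers who like to tinker, the bargain rule above can be sketched as a small script. This is just an illustration of the table, not anything BWU$20 actually runs; the function name and data layout are my own invention, and the bands are copied straight from the value scale.

```python
# Value bands from the BWU$20 scale above:
# (max street price, bottom of score band, top of score band).
# Layout and names are illustrative, not from the site.
VALUE_BANDS = [
    (10, 87, 90),   # $6 – $10  -> 87 – 90 points
    (15, 90, 93),   # $11 – $15 -> 90 – 93 points
    (20, 94, 96),   # $16 – $20 -> 94 – 96 points
    (25, 96, 98),   # $21 – $25 -> 96 – 98 points
    (30, 98, 100),  # $26 – $30 -> 98 – 100 points
]

def is_bargain(price, score):
    """True if a wine reaches the top of its price band's score range."""
    for max_price, low, high in VALUE_BANDS:
        if price <= max_price:
            return score >= high
    return False  # above $30: outside the BWU$20 scale

# The obvious bargains from the text: a $10 wine scoring 90,
# a $20 wine scoring 96.
print(is_bargain(10, 90))  # True
print(is_bargain(20, 96))  # True
print(is_bargain(12, 89))  # False: a $12 wine needs 93 to stand out
```

The same lookup flags the second tier of bargains mentioned below ($7 at 88, $12 at 91, and so on) if you compare against the bottom of each band instead of the top.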
Hope that helps
Kim