Sake doesn't need numeric scores!
Back in 2013, I saw the writing on the wall and ranted that Sake Don't Need No Stinkin' Scores! At that time, I'd seen a few magazines and online Sake reviews that provided numeric scores for Sake, usually on a 100 point scale. However, it was relatively uncommon and seemed to have little to no impact. Then, this past summer, Robert Parker's Wine Advocate published 78 Sake reviews, each with a score, leading to my new Rant: Sake Still Don't Need No Stinkin' Scores!
I wanted to update my Rant to see what had occurred since the release of the Wine Advocate's scores. Had it led to increased interest in Sake, or had my own negative scenarios come to pass?
Initially, the Sake article was largely ignored on the Wine Advocate forum. No one mentioned the article, and it certainly didn't seem to increase any interest in Sake. Over at the Wine Berserkers forum, there was some discussion of the Sake article and reviews, but a fair amount of it centered on the qualifications of Liwen Hao. At least initially, the Sake ratings didn't seem to be raising anyone's interest in Sake.
A couple of weeks after the release of the Sake article and reviews, a potential scandal took center stage when a Japanese exporter, The Taste Of Sake, offered all 78 Sakes that had been reviewed, and only those Sakes. The timing of the website, launched essentially on the day the reviews were released, seemed more than coincidental, and questions were raised about the integrity of the Sake review process. This scandal was then discussed on the Wine Advocate and Wine Berserkers forums, which certainly didn't help the reputation of Sake. The Wine Advocate started an investigation of the matter, finally posting the results on October 16. The Taste of Sake shut down its website and operations during the investigation.
Lisa Perrotti-Brown of The Wine Advocate stated: "It is believed that when Liwen Hao asked for extra technical information, the information was misused. Upon our legal counsel’s investigation, Millesimes acknowledged that the failure to maintain the confidentiality of this information was a breakdown in the process, and may have led to the dissemination of the list of sakes prior to posting the tasting notes and scores." She also outlined a couple of actions that would be taken in future tastings to prevent a repeat of this type of incident. Though a few questions remain unanswered, the discussion of this matter will likely die down, though when the next Sake reviews are released, there will be added scrutiny of anyone selling those specific Sakes.
It is also important to note that the exporter, The Taste of Sake, quickly raised the prices of the Sakes it was selling. This is exactly the type of price gouging I was concerned about in my last Rant, pricing Sake out of reach of the average consumer. There is little reason to think that other importers, distributors, and retailers won't also raise the prices on highly rated Sakes, keeping them out of the reach of the average consumer. And that won't help raise the popularity of Sake or increase consumption.
Since the release of the Wine Advocate Sake reviews, several articles in newspapers and online publications have made brief reference to them. One Sake brewery, Urakasumi Sake Brewery, also highlighted its score on its website, noting "Yamadanishiki Junmai Daiginjo Urakasumi Koshu was selected as one of the great sake and rated 91 points."
The Daily Mail, in the UK, published an article, "Japan sake pours overseas as local market dries up," with only a quick mention of the Wine Advocate Sake ratings. The Telegraph, also in the UK, published the article After wine, is sake the new drinkable investment?, noting the Sake ratings and discussing their investment potential. Such Sakes aren't for the average consumer, but more for the monied investor who sees something that might be sold for a profit in the future. Jancis Robinson also penned a recent Sake article, mentioning the Sake ratings and the alleged scandal.
Recently in Decanter, Anthony Rose penned an article, Sake: A Beginner's Guide & Top Recommendations. It offers a brief overview of some Sake basics and then 9 reviews, each Sake scored from 89 to 94 on a 100 point system. Thus, Decanter is using a rating system similar to the Wine Advocate's. Anthony last wrote a Sake article in Decanter back in November 2008, and there were no Sake ratings then. Could this also be the future of Decanter's Sake coverage? I should note that the article has not received any online comments, potentially indicative of a lack of interest.
The Wine Advocate won't be reviewing any more Sakes this year, though it plans to do so in 2017, allegedly covering a "broader range of styles and quality levels." It seems clear that their prior article and reviews didn't significantly raise the profile of Sake, especially with average consumers. Even the Wine Advocate forum posters didn't seem concerned about Sake until a scandal was alleged. The ratings did lead to higher prices for the rated Sakes, which isn't good news for the average consumer. I have seen nothing over the last couple of months which would change my opinions from my prior Rant.
Let me repeat: Sake doesn't need numeric scores!
For Over 18 Years, and over 5500 articles, I've Been Sharing My Passion for Food, Wine, Saké & Spirits. Come Join Me & Satisfy Your Hunger & Thirst.
Wednesday, November 2, 2016
Monday, September 5, 2016
Rant: Sake Still Don't Need No Stinkin' Scores!
"Great sake is like a poem. When tasting beautiful sake, you might sing... Mist in the valley... Spring in the mountains... or Breeze in the forest..."
--Liwen Hao in Wine Advocate, #226
It was inevitable. Back in 2013, I saw the writing on the wall and ranted that Sake Don't Need No Stinkin' Scores! At that time, I'd seen a few magazines and online Sake reviews that provided numeric scores for Sake, usually on a 100 point scale. However, it was relatively uncommon and seemed to have little to no impact. You didn't see those scores at local wine shops, and I wasn't hearing anyone talk about them.
The May 2013 issue of Wine Spectator contained three articles on Sake, as well as tasting notes for over 50 Sakes. Rather than evaluate the Sakes by their usual 100 point system, Wine Spectator listed the Sakes as Good, Very Good or Outstanding. In comments on a post on the Colorado Wine Press, Thomas Matthews, the Executive Editor of Wine Spectator, mentioned, "We have much less experience with sake, and felt that broader categories would be more appropriate to express our opinions on their quality. However, I could easily see a critic with deeper experience in sake using the 100-point scale, and perhaps if we taste extensively enough, one day we will too."
It was easy to predict that the day would come when the Wine Spectator, or another major wine publication, would use the 100 point system to rate Sake. That day arrived last week when Robert Parker's Wine Advocate published reviews, with numeric scores, of 78 Sakes.
Technically, this isn't the first time that the Wine Advocate has provided scored reviews for Sake. Back in October 1998 (issue #199), Robert Parker wrote an article, The Sumo Taste (A Beginner's Guide to Understanding Sake), and reviewed 48 Sakes, the majority being Daiginjo, scoring them from 86 to 91. It has taken the Wine Advocate 18 years to start scoring Sake again.
For the new Sake reviews, famed Sake critic Haruo Matsuzaki, whom John Gauntner referred to as a "sake critic extraordinaire" and "the most respected critic in the industry, especially among the brewers themselves," first selected a group of top Junmai Sakes from an initial pool of 800. Then, Liwen Hao, the Asian Wine Reviewer for the Wine Advocate, selected 78 Sakes, all Junmai Ginjo and Junmai Daiginjo, from the group chosen by Matsuzaki, providing a descriptive review and a numeric score for each.
The Wine Advocate announced the hiring of Liwen Hao back in December 2015, noting he would review Asian wine and other alcoholic beverages, as well as support the new Chinese version of RobertParker.com. Hao was born in Xi’An, raised in Shanghai, and first began working in the wine industry in 2004, taking a job with ASC, the biggest wine importer in China. He also became a wine writer, eventually penning two wine books, and produced a series of well-received wine education videos. In 2014, he founded a wine education school, and Liwen notes that he has been learning about Sake for many years.
Liwen wrote an introductory article, Sake-The Drop of Poetry, for the Wine Advocate, presenting some accurate, basic information about Sake, its ingredients, storage advice, serving suggestions, and more. In addition, Liwen presents Sake in an unpretentious manner, providing advice to make consumers feel better about knowing little about Sake. He states: "Some basic knowledge is needed if you want to look professional in front of others, but the best way is to find your own preference and use your own words to describe it." And as Liwen notes, this advice would apply to wine as well.
I liked this article and believe it could help interest more consumers in Sake. First, it presents basic Sake information in a brief and easily understood manner. Second, it helps to reassure consumers that anyone can enjoy Sake and that they should use their own words to describe the aroma and flavors of Sake. Third, when you consider the quote at the top of this post, it seems that Liwen understands the soul and aesthetics of Sake. However, I was less enamored with the scores accompanying the Sake reviews that came after this article.
A list of 78 Sake reviews was presented, including 66 Junmai Daiginjo and 12 Junmai Ginjo, with links to the descriptive reviews. Though the Sakes were technically evaluated on a scale that ranges from 50 to 100 points, not a single Sake scored less than 90 points, with the highest score being a 98. The scores can be broken down as such: 1 at 98 points, 2 at 95 points, 1 at 94 points, 5 at 93 points, 17 at 92 points, 23 at 91 points, and 29 at 90 points.
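For the number-minded, that distribution is easy to double-check with a quick tally (a minimal Python sketch, using only the score counts listed above):

```python
# Tally of the 78 Wine Advocate Sake scores, as broken down above
# (score -> number of Sakes receiving that score).
score_counts = {98: 1, 95: 2, 94: 1, 93: 5, 92: 17, 91: 23, 90: 29}

total = sum(score_counts.values())
average = sum(score * n for score, n in score_counts.items()) / total

print(total)              # 78 Sakes in all
print(min(score_counts))  # 90 -- not a single Sake scored lower
print(round(average, 2))  # 91.21 -- the scores cluster at the bottom of the 90s
```

In other words, two-thirds of the Sakes landed at just 90 or 91 points, despite the nominal 50-100 scale.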
According to the Wine Advocate rating system, a score of 90-95 indicates "An outstanding wine of exceptional complexity and character," and a score of 96-100 indicates "An extraordinary wine of profound and complex character displaying all the attributes expected of a classic wine of its variety." By this rating system, all of the reviewed Sakes were outstanding, with a single extraordinary one.
The 98 point Sake was the Kusumi Shuzo Kame-No-O Sannen Jukusei Junmai Daiginjo, priced at 10,000 yen (about $97). The two 95 point Sakes included the Iwase Shuzo Iwanoi Yamahai Junmai Daiginjo and Katsuyama Shuzo Katsuyama Akatsuki Junmai Daiginjo ($190). According to the Financial Times, the cheapest Sake on the list allegedly costs only 1500 yen ($14.50) though it was not identified. When perusing the descriptive reviews, you'll find that a number of the Sakes do not have listed prices.
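Those dollar figures imply an exchange rate of roughly 103 yen per US dollar, a 2016-era rate; the rate itself is my assumption, not something stated in the reviews. A quick conversion sketch:

```python
# Hypothetical conversion using an assumed rate of ~103 yen per US dollar,
# which is consistent with the figures above ($97 for 10,000 yen).
YEN_PER_USD = 103.0

def yen_to_usd(yen):
    """Convert a yen price to US dollars at the assumed rate."""
    return yen / YEN_PER_USD

print(round(yen_to_usd(10_000)))    # 97 -- the 98-point Kusumi Shuzo Sake
print(round(yen_to_usd(1_500), 1))  # 14.6 -- close to the FT's $14.50 for the cheapest Sake
```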
Obviously, only top-notch Sakes were selected for review, which helps explain the high scores they all received. The Sake reviews were not intended to be a general overview of the range of available Sakes, by either type or quality, but rather a showcase of some of the best currently available. No Honjozo were included, and only a handful of different styles, such as Genshu and Namazake, were represented.
A few people have commented that these Sake scores will be a good thing, giving more visibility and promotion to Sake. However, my thoughts are different, and I believe they may potentially cause more harm than good. My current thoughts are consistent with my earlier Rant, though I see the need to expand upon those prior comments, especially as the last few years have seen a greater opposition against the 100 point wine system. I don't see a sufficient potential advantage to numeric scoring to outweigh the potential negatives.
Describe it, evaluate it, but don't score it.
First, the mere existence of numeric scores for Sake reviews from a major wine publication is certainly not a guarantee of increased consumption or sales, especially with the general public. I will note that there hasn't been any discussion yet of the Sake reviews on the Wine Advocate Forums. This could be indicative of a lack of interest in Sake among many of the Wine Advocate subscribers. It is still early, and Sake discussions could take place in the near future, but it is telling that despite a thread on the new issue of the Wine Advocate, there hasn't been any mention of Sake yet. Whatever the reasons, it isn't a positive sign.
We can also examine the status of other niche beverages, which have received wine scores for many years but which haven't caught on with the general public. For example, Spanish Sherry still remains a tiny niche, especially the dry versions, currently selling even less in the U.S. than Sake. Scores didn't boost the general popularity of Sherry, so why would they do so with Sake? A few high scoring Sherries might be cherished by wealthy collectors, but the average consumer couldn't care less about Sherry. There are a number of other examples of niche wines, from Greek wine to Crémant d'Alsace, which certainly don't seem to have been helped significantly by the existence of scores.
Since the release of the Wine Advocate reviews last week, the Financial Times has already noted that initially, there has been a boost in sales of the reviewed Sakes, but it seems mainly from wealthy collectors and high-end restaurants & bars, including some seeking to buy large amounts of specific Sakes. This is only a tiny part of the market and doesn't include the average consumer. Sake scores might spur on wealthy collectors, but there isn't any evidence yet that there will be increased sake purchases by average consumers. And if wealthy collectors start buying all the highly scored Sakes, that will lead to my second point.
Second, one of the compelling aspects of Sake is the relatively low ceiling on its highest prices. Usually, you won't find a Sake for more than $150 a bottle, and prices are often closely aligned with the costs of producing Sake. There are exceptions, but they are rare. Compare that to the wine world, where there are plenty of wines costing more than $150. One of the significant factors that has led to those high wine prices is numeric scores. Wine stores may raise the prices of high scoring wines, pricing them out of the range of the average consumer. Do you want to see Sake prices rise merely because a Sake garnered a high score? Higher prices would likely decrease general consumption and drive more consumers away.
The Financial Times is in agreement, stating: "But the days of reasonably priced sake may be numbered: one of the drivers behind the list, said Ernest Singer, Robert Parker’s representative in Japan and a veteran wine importer, was to enable the best sake producers to raise prices." Thus, it seems that the scores were specifically intended to raise prices, making my previous worry a reality. The alleged rationale: "That in turn increases the odds of survival in a market where only 1,300-odd breweries are active and even the best are walking a financial tightrope."
But can the producers increase their production to meet demand? That is a real concern. In addition, should Sake breweries place their future merely in the hands of wealthy collectors, who might turn out to be fickle? As Sake generally should be consumed within a year of its release, it isn't the type of beverage that collectors can store away for years, which makes it less of an investment vehicle. It isn't like Bordeaux or California Cabernet. Its shorter life span might eventually turn off wealthy collectors once they realize that fact. And then, they could move on to a different niche beverage, one that they can safely age in their cellars for many years.
Third, what Sake taste profile should critics base their numeric scores upon? In general, American palates prefer aromatic, big, bold and rich flavored Sakes. Is this a side effect of consumers following the perceived wine preferences of Robert Parker? Possibly. On the other hand, the Japanese generally prefer more subtle Sakes, which might have muted aromas, and which may be “as easy to drink as water.” Which style would or should garner high scores? If a prominent critic's numeric scores reward big, bold Sakes, then there could eventually be a backlash against such Sakes as there has been a backlash against so-called Parker style wines. Both styles need to be embraced, and neither style should be promoted over another.
Fourth, numeric scores could promote lazy and ignorant distributors, store owners, restaurants and other purveyors of Sake. As it stands, many of those people and establishments already need more basic education about Sake. If they learned more about Sake, they would be capable of selling more Sake, just as increased wine knowledge helps them sell more wine. They need to invest the time and effort into Sake education, just as they do wine. If these people can just point a customer to a high scoring Sake, making a recommendation merely based on a number, there is less incentive for them to learn about Sake. Scores give them an easy out.
There is no guarantee that a consumer is going to enjoy a Sake just because it receives a high score. Despite its high score, it might not be a style of Sake that the specific consumer would enjoy. And if a consumer tastes a high scoring Sake and doesn't like it, they might decide they don't like Sake at all. A mere numeric score also won't tell a consumer anything about which foods would best pair with a specific Sake. Though consumers are advised not to rely on just a score, but to also read the review, that is not what always happens in reality. A significant number of consumers find it much easier to rely on a numeric score alone and not read the reviews.
Consumers are best served by educated wine store employees who can help them select the best Sakes for their preferences, as well as indicate the best food pairings for those Sakes. Wine store employees take the time to learn about wine, and they should also take time to learn about Sake. It isn't that difficult a subject, and learning it will help them sell more Sake. Don't take the easy way out by just promoting scores rather than providing more constructive suggestions.
Fifth, there are some unanswered questions about the future of Sake reviews at the Wine Advocate. Most importantly, how often will they review Sake? Will it be a once a year event? If so, how would a single annual review affect general Sake consumption? It would seem it wouldn't help much, catering more to wealthy collectors who stock up once a year on highly rated Sakes, and it wouldn't do much to raise consumer awareness throughout the rest of the year. Even if they review Sake quarterly, that still might not be sufficient to raise awareness with the general public.
There are other questions to consider as well. Will they only review Junmai Ginjo and Junmai Daiginjo? Or will they review the entire range of Sake types and styles? It would be better if they reviewed the entire range of premium Sakes, and expanded beyond the limited parameters of this initial review. Will future Sake reviews also be initially filtered through Haruo Matsuzaki before the final group is chosen by Liwen Hao? Or will Liwen make all of the selections on his own?
Finally, the 100 point system, as it has been used for wine, has received much criticism in recent years and those criticisms would generally apply to scoring Sake as well. You can find those criticisms listed in numerous online and published articles. There is little need to repeat all those items here.
To get more consumers to drink Sake, the first and most important thing to do is to get them to taste premium Sake. Too many consumers have had a bad experience with hot Sake. However, once they taste a good, chilled Sake, their opinion can change. The taste of chilled premium Sake is drastically different from the taste of a cheap, hot Sake. It can be an eye opening experience, and is more persuasive than any numeric score or tasting note. Wine stores need more Sake tastings. Restaurants need to offer inexpensive tasting flights of Sake, or hold Sake-paired dinners. The best education is tasting.
Sake doesn't need numeric scores!
--Liwen Hao in Wine Advocate, #226
It was inevitable. Back in 2013, I saw the writing on the wall, that it was coming, and I ranted that Sake Don't Need No Stinkin' Scores! At that time, I'd seen a few magazines and online Sake reviews that provided numeric scores for Sake, usually on a 100 point scale. However, it was relatively uncommon and seemed to have little to no impact. You didn't see those scores at local wine shops and I wasn't hearing anyone talking about them.
The May 2013 issue of Wine Spectator contained three articles on Sake, as well as tasting notes for over 50 Sakes. Rather than evaluate the Sakes by their usual 100 point system, Wine Spectator listed the Sakes as Good, Very Good or Outstanding. In comments on a post on the Colorado Wine Press, Thomas Matthews, the Executive Editor of Wine Spectator, mentioned, "We have much less experience with sake, and felt that broader categories would be more appropriate to express our opinions on their quality. However, I could easily see a critic with deeper experience in sake using the 100-point scale, and perhaps if we taste extensively enough, one day we will too."
It was easy to predict that the day would come when the Wine Spectator, or another major wine publication, would use the 100 point system to rate Sake. That day arrived last week when Robert Parker's Wine Advocate published reviews, with numeric scores, of 78 Sakes.
Technically, this isn't the first time that the Wine Advocate has provided scored reviews for Sake. Back in October 1998 (issue #199), Robert Parker wrote an article, The Sumo Taste (A Beginner's Guide to Understanding Sake), and reviewed 48 Sakes, the majority being Daiginjo, scoring them from 86 to 91. It has taken the Wine Advocate 18 years to start scoring Sake again.
For the new Sake reviews, famed Sake critic Haruo Matsuzaki, who John Gautner referred to as a "sake critic extraordinaire" and "the most respected critic in the industry, especially among the brewers themselves," first selected a group of top Junmai Sakes from an initial pool of 800. Then, Liwen Hao, the Asian Wine Reviewer for Wine Advocate, selected 78 Sakes, all Junmai Ginjo and Junmai Daiginjo, from the group chosen by Matuszaki, providing descriptive reviews and a numeric score.
The Wine Advocate announced the hiring of Liwen Hao back in December 2015, noting he would review Asian wine and other alcoholic beverages, as well as support the new Chinese version of RobertParker.com. Hao was born in Xi’An, raised in Shanghai, and first began working in the wine industry in 2004, taking a job with the ASC, the biggest wine importer in China. He also became a wine writer, eventually penning two wine books, and did a series of well-received wine education videos. In 2014, he founded a wine education school, and Liwen notes that he has been learning about Sake for many years.
Liwen wrote an introductory article, Sake-The Drop of Poetry, for the Wine Advocate, presenting some accurate, basic information about Sake, its ingredients, storage advice, serving suggestions, and more. In addition, Liwen presents Sake in an unpretentious manner, providing advice to make consumers feel better about knowing little about Sake. He states: "Some basic knowledge is needed if you want to look professional in front of others, but the best way is to find your own preference and use your own words to describe it." And as Liwen notes, this advice would apply to wine as well.
I liked this article and believe it could help interest more consumers in Sake. First, it presents basic Sake information in a brief and easily understood manner. Second, it helps to reassure consumers that anyone can enjoy Sake and that they should use their own words to describe the aroma and flavors of Sake. Third, when you consider the quote at the top of this post, it seems that Liwen understands the soul and aesthetics of Sake. However, I was less enamored with the scores accompanying the Sake reviews that came after this article.
A list of 78 Sake reviews was presented, including 66 Junmai Daiginjo and 12 Junmai Ginjo, with links to the descriptive reviews. Though the Sakes were technically evaluated on a scale that ranges from 50 to 100 points, not a single Sake scored less than 90 points, with the highest score being a 98. The scores can be broken down as such: 1 at 98 points, 2 at 95 points, 1 at 94 points, 5 at 93 points, 17 at 92 points, 23 at 91 points, and 29 at 90 points.
According to the Wine Advocate rating system, a score of 90-55 indicates "An outstanding wine of exceptional complexity and character." And a score of 96-100 indicates "An extraordinary wine of profound and complex character displaying all the attributes expected of a classic wine of its variety." By this rating system, all of the reviewed Sakes were outstanding with a single extraordinary one.
The 98 point Sake was the Kusumi Shuzo Kame-No-O Sannen Jukusei Junmai Daiginjo, priced at 10,000 yen (about $97). The two 95 point Sakes included the Iwase Shuzo Iwanoi Yamahai Junmai Daiginjo and Katsuyama Shuzo Katsuyama Akatsuki Junmai Daiginjo ($190). According to the Financial Times, the cheapest Sake on the list allegedly costs only 1500 yen ($14.50) though it was not identified. When perusing the descriptive reviews, you'll find that a number of the Sakes do not have listed prices.
Obviously, only top notch Sakes were selected to be reviewed, indicative of the high scores they all received. The Sake reviews were not intended to be a general overview of the range of available Sakes, by either type or quality, but rather a showcase of some of the best that is currently available. No Honjozo were included, and only a handful of different styles were included, such as Genshu and Namazake.
A few people have commented that these Sake scores will be a good thing, giving more visibility and promotion to Sake. However, my thoughts are different, and I believe they may potentially cause more harm than good. My current thoughts are consistent with my earlier Rant, though I see the need to expand upon those prior comments, especially as the last few years have seen a greater opposition against the 100 point wine system. I don't see a sufficient potential advantage to numeric scoring to outweigh the potential negatives.
Describe it, evaluate it, but don't score it.
First, the mere existence of numeric scores for Sake reviews from a major wine publication is certainly not a guarantee of increased consumption or sales, especially with the general public. I will note that there hasn't been any discussion yet of the Sake reviews on the Wine Advocate Forums. This could be indicative of a lack of interest in Sake to many of the Wine Advocate subscribers. It is still early, and Sake discussions could take place in the near future, but it is telling that despite a thread on the new issue of the Wine Advocate, there hasn't been mention of Sake yet. Whatever the reasons, it isn't a positive sign for Sake that the recent reviews aren't being mentioned.
We can also examine the status of other niche beverages, which have received wine scores for many years, but which haven't caught on with the general public. For example, Spanish Sherry still remains a tiny niche, especially the dry versions, currently selling even less in the U.S. than Sake. Scores didn't boost the general popularity of Sherry so why would it do so with Sake? A few high scoring Sherries might be cherished by wealthy collectors, but the average consumer could care less about Sherry. There are a number of other examples of niche wines, from Greek wine to Cremant d'Alsace, which certainly don't seem to have been helped significantly by the existence of scores.
Since the release of the Wine Advocate reviews last week, the Financial Times has already noted that initially, there has been a boost in sales of the reviewed Sakes, but it seems mainly from wealthy collectors and high-end restaurants & bars, including some seeking to buy large amounts of specific Sakes. This is only a tiny part of the market and doesn't include the average consumer. Sake scores might spur on wealthy collectors, but there isn't any evidence yet that there will be increased sake purchases by average consumers. And if wealthy collectors start buying all the highly scored Sakes, that will lead to my second point.
Second, one of the compelling aspects of Sake is its relative low ceiling on its highest prices. Usually, you won't find a Sake for more than $150 a bottle and prices are often closely aligned with the costs of producing Sake. There are exceptions but they are rare. Compare that to the wine world where there are plenty of wines costing more than $150. One of the significant factors that has led to those high wine prices are numeric scores. Wine stores may raise the prices of high scoring wines, pricing them out of the range of the average consumer. Do you want to see Sake prices rise merely because they garnered a high score? The effect of higher prices would likely decrease general consumption and drive more consumers away.
The Financial Times is in agreement, stating: "But the days of reasonably priced sake may be numbered: one of the drivers behind the list, said Ernest Singer, Robert Parker’s representative in Japan and a veteran wine importer, was to enable the best sake producers to raise prices." Thus, it seems that scores were specifically intended to raise prices, making my previous worry a reality. The alleged rationale for this matter was: "That in turn increases the odds of survival in a market where only 1,300-odd breweries are active and even the best are walking a financial tightrope."
But can the producers increase their production to meet demand? That is a real concern. In addition, should Sake breweries place their future merely in the hands of wealthy collectors, who might turn out to be fickle? As Sake generally should be consumed within a year of their release, it isn't the type of beverage that collectors can store away for years. It thus becomes less of an investment vehicle. It isn't like Bordeaux or California Cabernet. Its shorter life span might be an eventual turn off to wealthy collectors once they realize that fact. And then, they could move onto a different niche beverage, one that they can safely age in their cellars for many years.
Third, what Sake taste profile should critics base their numeric scores upon? In general, American palates prefer aromatic, big, bold and rich flavored Sakes. Is this a side effect of consumers following the perceived wine preferences of Robert Parker? Possibly. On the other hand, the Japanese generally prefer more subtle Sakes, which might have muted aromas, and which may be “as easy to drink as water.” Which style would or should garner high scores? If a prominent critic's numeric scores reward big, bold Sakes, then there could eventually be a backlash against such Sakes as there has been a backlash against so-called Parker style wines. Both styles need to be embraced, and neither style should be promoted over another.
Fourth, numeric scores could promote lazy and ignorant distributors, store owners, restaurants and other purveyors of Sake. As it stands, many of those people and establishments already need more basic education about Sake. If they learned more about Sake, they would be capable of selling more Sake, just as increased wine knowledge helps them sell more wine. They need to invest the time and effort into Sake education, just as they do wine. If these people can just point a customer to a high scoring Sake, making a recommendation merely based on a number, there is less incentive for them to learn about Sake. Scores give them an easy out.
There is no guarantee that a consumer is going to enjoy a Sake just because it receives a high score. Despite its high score, it might not be the style of Sake that the specific consumer would enjoy. And if a consumer tastes a high scoring Sake and doesn't like it, they might decide they don't like Sake at all. A mere numeric score also won't tell a consumer anything about which foods would best pair with a specific Sake. Though consumers are advised to not rely on just a score, but to also read the review, that is not what always happens in reality. A significant number of consumers find it much easier just to rely on a numeric score and not read the reviews.
Consumers are best served by educated wine store employees who can help them select the best Sakes for their preferences, as well as indicate the best food pairings for those Sakes. Wine store employees will take the time to learn about wine, and they should also take time to learn about Sake. It isn't that difficult of a subject, and will help them sell more Sake. Don't take the easy way out and just promote scores, rather than provide more constructive suggestions.
Fifth, there are some unanswered questions about the future of Sake reviews at the Wine Advocate. Most importantly, how often will they review Sake? Will it be a once a year event? If so, how will a single annual review affect general Sake consumption? It would seem that wouldn't help much, catering more to wealthy collectors who once a year stock up on highly rated Sakes. A once a year review also wouldn't do much to help consumption and raise consumer awareness throughout the rest of the year. Even if they review Sake quarterly, that still might not be sufficient to raise awareness for the general public.
There are other questions to consider as well. Will they only review Junmai Ginjo and Junmai Daiginjo? Or will they review the entire range of Sake types and styles? It would be better if they reviewed the entire range of premium Sakes, and expanded beyond the limited parameters of this initial review. Will future Sake reviews also be initially filtered through Haruo Matsuzaki before the final group is chosen by Liwen Hao? Or will Liwen make all of the selections on his own?
Finally, the 100 point system, as it has been used for wine, has received much criticism in recent years and those criticisms would generally apply to scoring Sake as well. You can find those criticisms listed in numerous online and published articles. There is little need to repeat all those items here.
To get more consumers to drink Sake, the first and most important thing to do is to get them to taste premium Sake. Too many consumers have had a bad experience with hot Sake. However, once they taste a good, chilled Sake, their opinion can change. The taste of chilled premium Sake is drastically different from the taste of a cheap, hot Sake. It can be an eye opening experience and is more persuasive than any numeric score or tasting note. Wine stores need more Sake tastings. Restaurants need to offer inexpensive tasting flights of Sake, or hold Sake-paired dinners. The best education is tasting.
Sake doesn't need numeric scores!
Monday, August 31, 2015
Rant: Wines Over 100 Points?
Imagine this...
Famed wine reviewer, James Sackless, has declared that the 2016 Hi-alk Winery Napa Valley Cabernet Sauvignon has been awarded 103 points. How is that possible with a 100 point wine scoring system? There may now be a precedent which some wine reviewer could potentially use to start giving out 100+ points. That is a scary scenario.
Consumer Reports recently released their review of the all-wheel-drive Tesla Model S P85D sedan and stated that it broke their ratings system. "The Tesla initially scored 103 in the Consumer Reports‘ Ratings system, which by definition doesn’t go past 100. The car set a new benchmark, so we had to make changes to our scoring to account for it." Could this be a sign of the future, that 100 point ratings systems may be inadequate?
Some wine magazines and reviewers use a 100 point scoring system for wine and it seems that those wine scores have generally been getting higher with time, that more and more wines are scoring in the 90s and more wines are being awarded 100 points too. As these scores continue to creep up, it isn't that hard to believe that some reviewer would like to award a wine more than 100 points, to set a new benchmark for what is seen as a quality wine. And the Tesla case could be used as a precedent to do so. Though the Tesla is an auto and not a bottle of wine, they both still use 100 point rating systems so there is some similarity.
The battle over the use of wine scores continues to rage on in the wine world. The rating system isn't going away any time soon, though it seems the number of people relying on wine scores is decreasing. Wine lovers are relying on alternatives, on the recommendations of trusted peers, wine shop staff, wine bloggers, and others. I won't go into all the arguments for and against wine scores, but the key here is wine scores have been losing credibility in some circles and that will continue, especially if a few reviewers start pushing for 100+ scores.
Consumers are better off without wine scores. A high score is never a guarantee that you will enjoy a wine. Instead, rely on your own taste, and on those people you trust who have a similar palate. Experiment and broaden your tasting horizons, seeking out new wines, as you never know where you might find a new favorite. With so many thousands of wines out there, opportunities for tasting are everywhere and you should take advantage whenever you can.
And if in the near future a wine receives a score over 100 points, it could be a sign of the wine apocalypse.
Wednesday, March 30, 2011
Muratie Wine Estate: Love, Devotion & Vines
Who would have thought? While attending a seminar on South African wines, I learned of an inspiring love story, a romance to stir even the coldest of hearts. Plus I tasted some delicious wines. And though I will recommend the wines, I will recommend even more strongly that you learn the story behind the winery, Muratie Wine Estate.
The seminar, held at Les Zygomates, was led by Rijk Melck (pictured above), the owner of Muratie. The winery is one of the oldest in South Africa, and is located in Stellenbosch, in the Simonsberg region. They were also the first winery to plant Pinot Noir in South Africa. Rijk was a compelling speaker, and I was fascinated by his history lesson of the region and winery.
In 1652, the Dutch established the colony of the Cape of Good Hope in South Africa, which was during the height of the slave trade. At one point, the Dutch captured a Portuguese slave ship and some of the slaves ended up in the Cape's Castle. The slaves were often permitted to walk around the gardens and market area during the day, before being locked up at night. One of those slaves bore a girl named Ansela.
In 1658, Laurens Campher, a German soldier working for the Dutch East Indian Company, was granted a farm at the foot of the Simonsberg Mountains, about 40 kilometers from the Cape. Now Laurens had also fallen in love with Ansela, though they had to keep their forbidden love secret. With extreme devotion and passion, Laurens regularly visited Ansela, which entailed a three-day walk on foot. For fourteen long years, Laurens made this journey, and Ansela bore him three children, though she was still a slave. Laurens surely loved Ansela, proving it with such dedication.
Finally, in 1699, Ansela received a Christian baptism and was then freed, the culmination of her dreams. Ansela and her children moved in with Laurens at what would become the Muratie estate. Grape vines had been planted on the estate and Ansela helped to ensure the estate was successful. The passion and devotion of Laurens and Ansela has reverberated throughout the centuries, helping to transform Muratie into a very successful winery as well as providing great inspiration.
Now let's travel to the present, to understand a bit about the Muratie Winery. The winery is not organically certified, but does follow the Biodiversity in Wine Initiative. In addition, they do not use pesticides, choosing instead to use animals, such as geese and wasps, to assist in pest control. (They also have two German Shepherds, named Frank Zappa and Stella Artois.) All harvesting is done by hand. Their wines are imported by Worthwhile Wine Company and locally distributed by Masciarelli Wine Company.
We got to taste five Muratie wines, and all were very good, showing a clear minerality. Rijk stated that they are not trying to make either Old World or New World wines. Instead, they are making wine that reflects the soil, their terroir, and the wines actually seem to be somewhere between the Old and New World styles. In general, Rijk states their wines are elegant, with good acidity, minerality and a long finish. In addition, all of their wines are made to be accompanied by food.
The 2009 Muratie Isabella Chardonnay is produced from 100% Chardonnay, which has spent nine months in French oak, only 40% new. The grapes come from three different vineyards, each picked at different times to emphasize different elements. For example, one group is picked early for more acidity. The wine is named after Rijk's daughter, Isabella. The wine is full-bodied, with some creaminess, and flavors of smoke, citrus, lemon and minerality. It was an intriguing wine, a bit different from many other Chardonnays, and would be a great option this summer.
The 2010 Muratie Laurens Campher White Blend is a mix of 39% Chenin Blanc, 32% Sauvignon Blanc, 24% Verdelho and 5% Chardonnay. This had an exotic taste, with flavors ranging from grapefruit to pineapple, and some floral notes. It was crisp and refreshing, with a strong minerality backbone, and a moderately long finish. Another good choice for the summer and I would like to try this with some fresh seafood.
The 2007 Muratie Shiraz is made from 100% Shiraz, from 16-18 year old vines. It spent about 16 months in 90% French oak and 10% American oak. This was a strong but not overpowering wine, with delicious spice and black fruit flavors, and underlying herbal elements. The tannins were moderate and the finish was long and satisfying.
The 2007 Muratie Ansela van de Caab is a blend of 48% Cabernet Sauvignon, 37% Merlot, 12% Cabernet Franc and 3% Shiraz. This wine spends about 18 months in French oak. This wine is more tannic than the Shiraz, with strong flavors of black fruit, cigar box and cocoa. Plus, it had more minerality than the Shiraz. This is a wine that cried out for a thick steak.
The 2008 Muratie Cape Vintage is an intriguing blend of Tinta Barocca, Tinta Roriz, Tinta Francesca, and Souzao. Muratie is the only winery in South Africa to use most of these grapes, except for the Tinta Roriz. This is a Port-style wine, aged for two years in old French barrels. It has a mild sweetness, with some notes of black cherry candy, but combined with dark spices. It possesses a good acidity as well as a pleasant and long finish.
Let me finish with another inspiring tale of Muratie. Rijk is a medical doctor, who worked for a time in England before returning to South Africa in 1980. At that time, most doctor offices had two doors, one for whites and one for blacks. Rijk, though, refused to follow that practice, keeping only a single door for his office. It can't have been easy to oppose the norm, yet Rijk stood his ground, opting for what was morally right. That too is inspiring and commendable.
Wednesday, January 27, 2010
The Wine Trials 2010: A Lack of Transparency

Previously, Robin Goldstein scammed the Wine Spectator. Is he now trying to scam the general public with his new book, The Wine Trials 2010?
The other day, on Joe Robert's blog 1 Wine Dude, I read an interview with Robin Goldstein. Robin is the co-author of the newly released, The Wine Trials 2010, a book which lists 150 wines under $15 that allegedly beat $50-$150 bottles in "rigorous brown-bag blind tastings." I was initially skeptical of this book, having numerous questions about the methodology of the blind tastings and wine choices.
I first checked the Wine Trials website but it did not provide a description of their methodology. So I decided to buy a copy of the book to learn more. Unfortunately, my many questions have remained unanswered as the book did not contain what I sought.
The book is not fully transparent about its methodology, and there is no explanation for why it was omitted. Without such transparency, I have strong doubts about the validity of the results. It is even more curious when the book provides a list of the members of their "Scientific Advisory Board" who allegedly helped interpret the results and review their methods and conclusions. The authors provide "expert" credentials for the book, yet fail to allow the reader to review the methodology on their own. Readers are apparently expected to just trust the experts.
Only a few sparse comments are made in the book about their methodology. First, they used a list of 450 wines under $15, which eventually would be whittled down to 150 for inclusion in the book. Why did they start with a list of 450? Why not 500, or 1000? There is no explanation. Last year, the book only contained 100 wines and they were automatically included as part of this year's 450 wines. Why do that? It would seem to stack the deck a bit for those wines, giving them an added chance of being included in the new book.
To be nominated, and potentially placed on that list of 450 wines, a wine had to have a production minimum of 20,000 cases, reduced from last year's 50,000 case minimum. Such a minimum ignores plenty of excellent, artisan wines under $15 which are not produced at such a high quantity. That seems to give preference to more mass produced wines, not exposing consumers to other possibilities.
The book also instituted a new nomination process this year, permitting wine professionals such as producers, sommeliers, importers, and retailers to nominate wines. First, doesn't that create potential conflicts of interest? Could a producer nominate his own wines? Could importers nominate the wines they import? Or could they nominate the wines of friends? How many total nominations were received? How were the nominations whittled down to 450? We should have much more information on this nomination process.
There were then a series of blind taste tests, but there is no information provided on how they were conducted, the number and demographics of the tasters, the experience levels of the tasters and much more. Which expensive wines were used for all of the blind taste tests? Who chose those expensive wines? How can we trust the results when there is almost no information provided on how the tests were conducted? The lack of transparency in this area really bothers me a lot.
Though the book spends plenty of time presenting the case for blind taste testing, the authors would have been better served by supporting their own specific results and providing details on the methodology. They spent time discussing a prior study they conducted, but fail to address the particular taste tests that led to the wines included in their book.
Some blame is placed on the major wine magazines for promoting the idea that expensive wines are better than inexpensive wines. But wine bloggers also receive some blame for their "passionate enjoyment of expensive wine." The book basically calls for all wine reviewers to conduct blind taste tests of the wines they review.
There is also a chapter that addresses some prior criticisms of the Wine Trials, much of it dealing with articles written by Eric Asimov, a New York Times wine writer. One of Asimov's major points is that wine is meant to be drunk with food, so it should be reviewed in that manner. The book actually agrees that "Most wine is better and more complex with food;... " (p.34) But the book does not feel you can "seriously evaluate" wine with food, and that doing so causes the same problems as non-blind tasting. But is that really so?
I guess it depends on what you mean by a serious evaluation, and the goal of your wine reviews. Are your reviews directed to wines people will drink on their own, or wines they will pair with food? The book does agree that a wine you drink on its own may not provide the same experience as if it is paired with food. But it appears the book's position is to provide reviews of wine to enjoy on their own. So, the wine recommendations are not as useful if someone is seeking a wine to pair with dinner.
Though the recommendations in this book are generally supposed to be for wine novices, even the authors have some caveats. First, they state to "...take our blind tasting results with a grain of salt." (p.56) They then continue to state: "To some extent, these choices might reflect the preferences of wine experts more than those of wine novices,..." (p.56) Obviously these caveats increase my skepticism.
I remain very skeptical of the Wine Trials and would like to see much more transparency from the authors. There are too many questions that have been left unanswered. As such, I cannot recommend this book.
Thursday, December 10, 2009
Does Advertising Skew Wine Spectator Wine Reviews?
We live in a world with many urban myths, so-called "truths" that are often accepted with a lack of any real evidence. It is always curious why such urban myths spread and are accepted. It may partially be due to such myths appealing to our prejudices.
As an example, some people have accused magazines like Wine Spectator of being biased towards wineries that advertise in their magazine. They seem to think that the large, full page ads cause reviewers to favor those wines. Yet these allegations are never supported by actual evidence. So what is the truth? Are the accusers simply prejudiced against Wine Spectator?
An actual study has now been conducted on this issue and the report has just been issued. Check out Does Advertising Bias Product Reviews? An Analysis of Wine Ratings by Jonathan Reuter (Journal of Wine Economics, Volume 4, Issue 2, Winter 2009, Pages 125–151). Reuter is a local person, working at the Department of Finance of the Carroll School of Management at Boston College.
To test this issue, Reuter compared two wine publications, Wine Spectator and Wine Advocate. As the Wine Advocate does not accept advertising, he felt it would be appropriate to compare its wine scores with those of the Wine Spectator, which does accept advertising. I think it might have been a better, and more comprehensive, study if Reuter had also compared the Wine Enthusiast, or other major wine magazines, which accept advertising.
His basic conclusion shatters the urban myth: "Overall, the tests for biased ratings and biased awards produce little consistent evidence that Wine Spectator favors advertisers." So will this myth die out? Probably not, as some prefer to believe this rather than confront the facts.
Taken in the worst light against Wine Spectator, the study suggested that wines from advertisers may score almost one point higher than wines from nonadvertisers, as compared to the wine scores in the Wine Advocate. But Reuter also stated this "..is also consistent with the two publications evaluating wine using different standards, perhaps because they cater to different consumer tastes." There are also other reasons why that might be the case. In addition, less than one point certainly is not a very significant difference, and could well be in an acceptable margin of error.
Though some people may try to cling to this less than one point difference to further the urban myth, they are holding onto something very shaky, as well as ignoring numerous other factors. It is significant that Reuter found that "..,Wine Spectator is no more likely to bestow awards upon advertisers." This helps support the conclusion that advertising does not bias Wine Spectator reviews.
This might also be a wake-up call to advertisers who felt they were getting better reviews based on all of their advertising dollars. Their money was not buying them anything extra.
Let us continue to shatter wine's urban myths!
Saturday, November 21, 2009
90+ Cellars: Good & Inexpensive Wines
While perusing the shelves of your local wine store, you may see a label marked 90+ Cellars. This label is currently sold in about 90 wine/liquor stores in Massachusetts, as well as being available in about twenty other states. But what is the story behind this label, and should you take a chance on this wine?
90+ Cellars, a Boston-based company, is a virtual winery that purchases excess wines from established wineries and sells them under its own label, at a significant discount. The wineries sell this excess wine because either they produced too much or the demand for that specific wine has waned. These wines have already been produced and bottled, so all that needs to be done is for 90+ Cellars to place their own label onto the bottle.
They don't just buy any excess wine. There is criteria used to determine which wines they will sell. The criteria is that the wines "...must have a pedigree of 90 or higher ratings, best buy or gold medal accolades from major publications." They cannot though tell the consumer the name of the original winery. That fact must remain anonymous.
Each of their wines is assigned a “Lot” number upon release and Lot #15 will be their next wine available, in a couple months. All of the wines are available in limited quantity. Their Reserve selections may only be available in 100-200 cases, while the normal line will be available in the thousands. But once a Lot is gone, it will no longer ever be available.
Recently, I met a couple of the guys behind 90+ Cellars, including Kevin and Brett. They invited some local wine writers and wine store owners to taste some of their current wines, and potential future releases. Prior to the event, I had some questions and concerns about 90+ Cellars and intended to raise them there, unsure how they would handle issues that were not the easiest to answer.
I was extremely pleased that Kevin and Brett were honest and forthright in response to my questions. They did not try to evade or obfuscate the issues. Such integrity impressed me. They were also able to clarify several issues for me.
Why do they rely on 90+ ratings? First, they do not rely only on such ratings, and at least one of their wines has not received a 90+ rating. As mentioned above, a 90+ rating is only one possible criterion; they will also consider other significant accolades. Though the label may imply to some that they rely solely on 90+ ratings, their website does mention the other criteria.
Second, 90+ ratings are important to them because many consumers do consider such ratings when buying wines. They are a business, trying to make money, so it makes financial sense to use ratings if many consumers rely upon them. Especially if they are a new company trying to enter the market.
Personally, I would rather consumers relied less on such scores, and were more willing to try wines that maybe did not score as high, or which lack any score. There are plenty of excellent wines that fail to attain a 90+ rating. But I do understand why wineries and stores use scores to promote wines.
Another concern for me is that the 90+ Cellars wines generally lack a story, or at least one which can be disseminated to the public. Because of the anonymity of the wines they sell, the consumer cannot learn about the actual winery, cannot hear the stories behind the wine. Kevin admits that is an issue, and not something that really can be changed.
But what 90+ Cellars does offer are very good wines, at a significant discount from the original wine. I tasted through many of their wines, and they were generally very good. None of the wines were bad, though some were not my preferred style. My favorites included:
Lot #6 ($13.99), an Unoaked Chardonnay from Australia was excellent, a crisp wine with delicious fruit.
Lot #8 ($11.99), a Garnacha from Spain, a delightful melange of bright fruits, including blueberry, some spice notes and a touch of herbal. Very easy drinking and fun wine.
Lot #15 (soon to be released), a Pinot Noir from the Carneros region of California. This Pinot has not received a 90+ rating, but I know the source and it is a top producer, as well as one of my favorites. This wine was excellent, and a great value.
So will you like the wines of 90+ Cellars? If you want a delicious wine, at a good price, then definitely give them a try. You will miss out on the story of the wine, but that may not matter to you. I certainly would buy some of these wines, based on their taste and low price. So keep an eye out for these wines.
Friday, March 27, 2009
Robert Parker Interview at Bitter Lawyer
Robert Parker remains a controversial and powerful individual in the wine world. I always find it intriguing to learn more about him, especially through interviews. So when I learned of a new interview with him, I was pleased to check it out, and I would recommend it to others too.
The new interview with Robert Parker can be found over at Bitter Lawyer, a site that states: "Our singular goal is to create an engaging, insightful entertainment destination for lawyers." This is not the usual place you might visit seeking wine information but there is a clear connection as Parker began as an attorney. The interview delves into his life with wine, but also touches on his life as an attorney.
I think you might learn some new things about Parker from this Q&A as I certainly did. For example, Parker's 100 point wine rating system was used at the University of Maryland Law School. You'll learn what wine Parker drank when celebrating leaving his law job. Parker also presents some general recommendations for economical wines.
And one of Parker's last statements in the interview really resonates with me. "But if you pursue your greatest passion, chances are you will not only become very good at whatever that passion is, but by being good, you will also love what you are doing, and probably make a sufficient amount of money to live very comfortably."
It is all about passion.
Tuesday, December 2, 2008
Wine Advocate Still Has Power
Decanter (11/08) reported on an interesting tale of the continuing power of the Wine Advocate and their scores.
Farr Vintners, which specializes in high end wines, was having difficulty selling Taylors Vintage Port 2003, despite it being an excellent wine. From November 2005 to June 2008, they had only sold 15 cases. But in July 2008, over the course of only two days, they sold more than 150 cases of this wine. And they could have sold even more if they had more cases in stock.
What happened was that the Wine Advocate gave this wine 100 points, a perfect score. Jay Miller provides the scores for Ports for the Wine Advocate. So, with this score, the demand for the wine skyrocketed, indicating that the Wine Advocate still has great power, even if the review is not specifically from Robert Parker.
Yet if this wine was so good, then why didn't others see that before the Wine Advocate score was released? There is much I could say here yet it would be repetitious of much I and others have said before. Just know there are great wines out there, still undiscovered by the primary wine print media, and that you should seek such wines out.
Monday, April 28, 2008
Even Experts Disagree
Scores, Scores, Scores.
Wine points are all around us. Many retailers tout scores in efforts to get people to buy wines. And such marketing is effective, as there is a significant number of people who do buy their wines based on wine scores. But can such scores be trusted? Do the wine "experts" basically agree on scores for the same wines?
This topic is starting to get trite. It has been discussed time and time again here and on plenty of other wine blogs. Yet I had to raise it again because of a recent incident that brought it before my eyes once again. While researching the wines of Paolo De Marchi, I saw some conflicting scores for his Cepparello wine.
First, the April 2008 issue of Decanter had an article declaring the Cepparello to be one of Italy's 50 Greatest Ever Wines. It did not score any of the vintages but stated it was consistently excellent. The article was based on questions asked of 19 Italian wine experts from four different countries.
Second, the Wine Advocate gave 94 points to the 2003 Cepparello and 95 points to the 2004 Cepparello. This would seem to be in sync with the Decanter article.
Third, Stephen Tanzer gave 93 points to the 2003 Cepparello and 89+ points to the 2004 Cepparello. So he felt the 2003 vintage was better, and his score for the 2004 differed from the Wine Advocate's by 6 points.
Lastly, the Wine Spectator gave 88 points to the 2003 Cepparello and 86 points to the 2004 Cepparello. So their scores differed from the Wine Advocate's by 6 points for the 2003 AND 9 points for the 2004. 9 points??? That is a very significant difference. Why is it so different?
Let us look at the tasting notes for the 2004 to see if we can get a clue.
Wine Advocate: "The estate's 2004 Cepparello (100% Sangiovese aged in French oak, 1/3 new) was made from minuscule yields of just 600 grams per plant and is even better than the 2003. It exhibits a livelier color, fresher aromatics and a nuanced personality, all the products of a more balanced growing season. It boasts layers of vibrant fruit intermingled with subtle mineral and licorice notes, showing outstanding length on the palate and fine, noble tannins. A wine of extraordinary elegance, it has been stunning on the two occasions I have tasted it so far. That said, readers who want to experience this wine's full array of tertiary notes will have to give this wine time to mature in the bottle. It is highly recommended. Anticipated maturity 2009-2022."
Wine Spectator: "Aromas of black cherry and flowers follow through to a medium body, with fine tannins and a delicate finish. Sangiovese. Best after 2008."
I cannot see based on those tasting notes why the Wine Spectator did not give a higher score to this wine. They did not indicate any problems with the wine.
So which score would you follow? And why? What I think it indicates is that wine scores are so personal a matter, and that wine preferences vary so greatly from person to person, that scores don't have a lot of value. If even the wine experts can vary so significantly in their scores for a wine, then where is the value in those scores?
Let your own taste be the ultimate judge of whether a wine is good or not.
Friday, June 22, 2007
Wine Rating Systems
Seems there is a recent flood of articles and blog posts concerning Wine Rating systems. There is an interesting article in the San Francisco Chronicle titled "Are Ratings Pointless." One interesting point the article raised is that wines are generally only rated once, though wine does change over time. So, why isn't wine rated over time, to see how well it ages? No clear answer but it certainly raises a good point.
There is also a movement in the wine blog community to try to standardize a wine rating system just for bloggers. They want to differentiate themselves from the wine professionals, such as the Wine Advocate and the Wine Spectator. Two bloggers, WineCast and Catavino, have addressed this issue. They are pushing for a 5 point/star system. I don't agree that a 5 point system is really that much better.
What exactly do we want our Ratings to accomplish? And are those purposes better off served in Tasting Notes rather than a rating? I think the basic purpose is to guide people to wines they will like. It is not to create a set of trophy wines that people will seek out, to the detriment of good wines that just don't happen to be the very best. The 100 point system is often cited as flawed because there are people who only seek wines rated 90 and above. They ignore good wines just because they are not rated high enough. Yet, a 5 point system is subject to the same problem, that people might only seek 4 or 5 point wines, ignoring all others.
The Real World Winers devised a very simple rating system, which I still use, with only 3 categories.
1) Drink & Buy: A wine I recommend as worthy of buying.
2) Drink Not Buy: A wine that is drinkable but not something I would buy myself.
3) No Drink No Buy. A wine I would not recommend at all.
Everything else you need to know about the wine is in the Tasting Notes. My goal is to direct people toward certain wines, regardless of price. So, an excellent $10 wine or an excellent $200 wine could both be in the first category. The tasting notes will mention the price, and also whether I consider it a good value. Thus, people can enjoy good and superb wines, without just seeking the trophy wines.
I thus hope that more people will drink wines that would only receive good, but not excellent points, in other systems. Why miss out on so many good wines?
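For readers who like to tinker, the three-tier system above amounts to a tiny classification scheme, and it can be sketched in a few lines of code. This is only an illustrative sketch: the category names come from the post, the $25 value threshold comes from my own system, and every function and field name here is an assumption, not part of any real software.

```python
# Minimal sketch of the three-category rating system described above.
# Category names are from the post; the $25 "value" cutoff is from my
# own system; all identifiers are illustrative assumptions.
from dataclasses import dataclass

DRINK_AND_BUY = "Drink & Buy"
DRINK_NOT_BUY = "Drink Not Buy"
NO_DRINK_NO_BUY = "No Drink No Buy"

@dataclass
class WineReview:
    name: str
    price: float
    rating: str            # one of the three categories above
    is_value: bool = False  # Drink & Buy subcategory: worth more than its cost

def recommend(review: WineReview) -> str:
    """Turn a review into a one-line recommendation."""
    if review.rating == DRINK_AND_BUY:
        # Value wines are a subcategory of Drink & Buy, generally $25 or less.
        note = " (a value pick)" if review.is_value and review.price <= 25 else ""
        return f"{review.name}: recommended at ${review.price:.2f}{note}"
    if review.rating == DRINK_NOT_BUY:
        return f"{review.name}: drinkable, but not worth ${review.price:.2f}"
    return f"{review.name}: not recommended"

print(recommend(WineReview("Lot #8 Garnacha", 11.99, DRINK_AND_BUY, is_value=True)))
```

The point of the design is that price lives in the notes, not the rating: an excellent $10 wine and an excellent $200 wine both land in the first category, with value flagged separately.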
Monday, May 14, 2007
My Wine Rating System
In my numerous wine reviews, I have been using a rating system that the Real World Winers developed for our original blog. It is a rather simple system with three basic rating categories. These are: Drink & Buy, Drink Not Buy, and No Drink No Buy.
Drink & Buy: These are wines that I enjoy drinking and consider worth their cost. These are the wines I would recommend to others. Within this rating, there is also a special subcategory, the Value wines. These are wines that I consider to be worth more than their cost, and thus are a very good value. Such value wines generally cost $25 or less.
Drink Not Buy: These are ordinary wines that I could drink but generally would not buy myself. I don't consider them worth their price. I would drink them at a function, or if someone else was buying. But I would save my own money for wines in the first category.
No Drink No Buy: These are wines I would not even drink if they were free. I would not recommend these wines to anyone!
As you can see, it is very simple. And that probably won't change as I don't see that I need anything more complex.