In following up on our editorial, “When is a 90 not a 90,” I wanted to share my thoughts on the concept and the results. I would be remiss if I didn’t add the disclaimer that the opinions shared here are mine and not necessarily those of the site, the other members of Developing Palates, or our sponsors.
A question we received a few times in response to the article was why we didn’t include our own review results. There are a couple of reasons for that. One, we don’t use the 100 point system, so drawing a comparison between the two would be very tough. Two, given the time window we used for data collection and the fact that we are a new site, we only have 18 months’ worth of data, which is half of what we collected from the other sites. With that said, I will share the results below.
From a total of 367 review scores, our overall average across all reviewers is 6.59. Because of the wide range of scores we use, it’s not easy to show a histogram of our score distribution, but I can share that our most frequent scores fell in the range of 6.91 to 7.00, of which we had 29. Around that most frequent band, we have 138% more scores on the low side than the high side (238 compared to 100).
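To make the arithmetic behind those figures concrete, here is a minimal Python sketch of how an average, a most-frequent scoring band, and the low-side/high-side comparison can be computed. The score list is made-up placeholder data for illustration, not our actual review data:

```python
from collections import Counter

# Hypothetical scores on our 0-10 scale; placeholder data, not the real reviews
scores = [6.2, 6.95, 6.93, 5.8, 6.91, 7.4, 6.5, 7.0, 8.1, 6.0, 6.97, 7.2]

average = sum(scores) / len(scores)

def band(score):
    """Map a score to its 0.1-wide band, e.g. 6.91-7.00 -> band 70."""
    hundredths = round(score * 100)   # work in integers to avoid float drift
    return -(-hundredths // 10)       # ceiling division: upper edge in tenths

counts = Counter(band(s) for s in scores)
mode_band, mode_count = counts.most_common(1)[0]

# Count scores falling below and above the most frequent band
low = sum(1 for s in scores if band(s) < mode_band)
high = sum(1 for s in scores if band(s) > mode_band)

# "X% more on the low side" is (low / high - 1) * 100;
# with the article's real counts, (238 / 100 - 1) * 100 = 138
pct_more_low = (low / high - 1) * 100
```

With the actual counts from our data (238 below the 6.91–7.00 band, 100 above), that last expression yields the 138% figure quoted above.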
With that out of the way, let’s get onto the opinions.
The Concept
I was very interested in running the numbers because, based on my reading of all the sites referenced (as well as others whose scores we couldn’t get, and ones that use other scoring systems), I was fairly positive there was score inflation relative to the well-known scoring range and the targeted tipping-point number of 90.
This same issue has been a major point of contention in the wine world, where scoring has become inflated: very well-known reviewers have inflated their own scores over time, and newer reviewers have adopted the scoring system starting from an already inflated range.
The tie-in between wine and cigars runs deep: the 100 point wine scoring system was pioneered by Robert Parker, then adopted by Wine Spectator magazine, which is owned by Marvin Shanken, who went on to start Cigar Aficionado. Cigar Aficionado used a similar scoring system for cigars, which was then adopted by many other media outlets, including the ones in this article.
Now, the sites that adopted this system may vary in how they use it, but unless something specifically spells out those variations, the average reader is most likely going to tie any 100 point scoring system to “the” 100 point scoring system (Robert Parker/Wine Spectator/Cigar Aficionado): where the average scores tended to land, and what number signaled a cigar worth paying attention to. In my experience, that money number is the 90 rating. I feel far too many 90’s and above are being handed out, which is why I wanted to see if my feeling had any validity to it.
The Results
After all of the data was collected and analyzed, many of my assumptions were confirmed. One thing did surprise me a bit: the scoring of Cigar Aficionado and Halfwheel. I had thought there might have been some slight inflation at Cigar Aficionado, and there probably has been some, but with 89 as both their average and most frequent score, they still held the ground of slightly protecting the 90 range. It also showed clearly that their use of a 91 score as the qualifier for their cigar of the year tournament is an appropriate choice.
In regards to Halfwheel, I wasn’t quite sure what to expect from the results. One of the things I like quite a bit about their use of the scoring system is that they push outside of the typical 10 point window within the 100 point system that others seem trapped in. They use it much like Robert Parker used it years ago for wine: a wide range of scores that can show differences between cigars. In my opinion, Halfwheel uses the 100 point system the most appropriately in regards to the range of scores given. That doesn’t mean I necessarily agree with the scores given to particular cigars, just with their use of the system as a whole.
As for the results from the other sites, there weren’t really any surprises other than maybe how inflated their scores seemed. I’ll give some thoughts on the results from each.
Blind Man’s Puff does publish an explanation of their scoring range, which shows 90 to 91 as Very Good. This is where the bulk of their reviews reside. Below that, 89 is Good; above it, 92 is Great. Comparing their range explanation to Cigar Aficionado’s, they seem to have increased the number of descriptors and moved them further up the number scale. Whether this is to capitalize on people’s association with the numbers Cigar Aficionado uses, I cannot say.

I also think there is a flaw in their scoring system that can inflate some scores and hurt others: the weighting of the categories. They make their review formula and results public, so anyone can see them if they care to look. Appearance, pre-light aroma, draw, burn, and construction account for 45% of the score, while flavor accounts for only 30%. The remaining 25% comes from an overall rating that combines everything in the experience. When so little of the rating comes from flavor, it’s easy to see how well-constructed cigars (as most are nowadays) can push the bulk of reviewed cigars into Very Good. Another flaw is that they don’t audit their reviews. There are cases where the notes and the score for a particular category don’t match up; a reviewer might describe a burn as very bad while the score for it is high. These conflicts (not to say they happen regularly) don’t seem to be addressed and pass through the system, which affects the overall score.
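To illustrate the weighting concern, here is a hypothetical sketch of a weighted score. The 45/30/25 split follows the description above, but the category names, the 0–100 per-category scale, and the sample numbers are my assumptions for illustration, not Blind Man’s Puff’s actual formula:

```python
# Hypothetical weights following the split described above; the real
# formula's category breakdown and scales may differ.
WEIGHTS = {
    "construction": 0.45,  # appearance, pre-light aroma, draw, burn, construction
    "flavor": 0.30,
    "overall": 0.25,
}

def weighted_score(ratings):
    """Combine per-category ratings (assumed 0-100) into one 0-100 score."""
    return sum(WEIGHTS[cat] * ratings[cat] for cat in WEIGHTS)

# A flawlessly built cigar with only middling flavor still lands near 90:
cigar = {"construction": 95, "flavor": 84, "overall": 88}
score = weighted_score(cigar)  # 0.45*95 + 0.30*84 + 0.25*88 = 89.95
```

The point of the sketch: when construction-related categories carry 45% of the weight, modern manufacturing consistency alone can lift most cigars into the Very Good band regardless of how the flavor rated.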
Cigar Coop also publishes an explanation of his scoring range, which shows 91-93 as an Excellent Smoke; this is where the most frequent score of 91 falls. Another bulk of cigars sits in the 89-90 range, defined as Very Good. The scoring range is fairly narrow, as he mentions he typically won’t assess anything below an 85. The bulk of all scores fits into a five-point range from 89 to 93, so it’s tough to see much differentiation between a lot of cigars. The categories he uses in the rating process are listed, but the weighting and formula used to derive the scores are not published, so it’s hard to say whether anything in that process contributes to the inflation. The site is on the higher end of average scoring, and it would be good to see the average come down along with a wider distribution of scores. He has made an adjustment to his scoring for 2018 which I believe is designed to bring scores down a bit and address this inflation.
Cigar Dojo doesn’t post any explanation of their scoring system, or at least I wasn’t able to find one, so you’re left to decide whether it’s based on the Cigar Aficionado system or some customized version. Looking at the numbers, the scoring is a bit inflated over CA, and the score distribution is one of the most unique of the group: 90 is clearly the most frequent score given, but after that, there are five other scores very similar in frequency at second place. The range of scores is good, as they typically work within a nine-number range, which, small as it is, is wider than most of the others in the group. Without knowing what the scores are supposed to correspond to, it’s hard to say whether they are inflated; if we assume they should be compared to CA’s scoring, then there is definitely some inflation present.
For The Cigar Authority, I also wasn’t able to track down any rating guide, so we’re left to make our own assumptions about what the numbers mean. 92 is the most frequently handed out score, with 91 just a single review behind. The scoring range is, again, very narrow: a tight four-score range with a big drop-off before other scores appear. This is the lone site that, in the date range compared, has handed out more than a couple of scores of 97 or above. Left to our own assumptions, there’s definitely some inflation here. This is also the lone site attached to a retail operation, with a link right in the review to purchase from said retailer. Whether that plays into the inflation or not, I can’t say, but I’m sure the inflation doesn’t hurt sales. I’m actually a bit surprised that Barry hasn’t made the unprecedented move of scoring a cigar 100+. That might actually do a bit of the work of re-centering where the scoring window should be.
As with a couple of the others, Tiny Tim’s Cigar World does not have a published rating guide, but in a response to the original article, he outlined that his system uses 90 as the low end of the cigars he would buy again. Tim’s most frequent score is a 92, with the counts on each side of it lower but equal to each other. In the three-year window evaluated, only 26 of 230 cigars scored below 90, so it’s hard to tell whether he likes most cigars or only reviews cigars with a very high chance of being ones he’ll like. Either way, the scoring looks inflated compared to what we typically associate with the 100 point system norm. The scoring range is also fairly narrow, in line with most of the others.
Final Thoughts
I have some ideas on why score inflation may be happening. And just to be clear, I’m not saying any of these reasons apply to any of these media outlets specifically.
- Gaining Publicity
- By handing out scores regularly in the 90+ range, it’s very likely that people are going to take note, especially brand owners and manufacturers who may share out your review to their audience. This can turn into increased notoriety and increased website traffic which can then turn into increased revenue from advertising.
- Praise
- Similar to the publicity reason, but less so on the monetary motivation and more on the recognition motivation of getting good feedback from manufacturers and brand owners and those who are fans of the brands.
- Not wanting to give bad feedback
- Some people have a hard time sharing opinions that are not good or can be seen as negative. It can be that way when you’re doing it for strangers, but even more so for people you have met, have a relationship with, or may meet in the future.
- Not wanting to lose access
- Similar to not wanting to give poor feedback, this one increases the fear level a bit in that the reaction to a less than stellar review may cause you to lose access to face time with the brand, access to press releases, samples, etc.
So, what should happen with the 100 point system? Probably nothing. It’s pretty well accepted as it is, inflation included. If a cigar receives a 90 from an outlet with an inflated scoring range, it’s not likely that the brand owner/manufacturer is going to look into how the score is derived or what it means from that source; they are going to take it at face value and be excited about it. And from the outlet’s side, when the brand owner/manufacturer gets super excited about the rating and shares it out, the outlet isn’t likely to reach out and let them know the number doesn’t mean as much as they think it does. If both sides are happy with the results, then everyone wins, right? Well, if you’re the reader, and aren’t privy to any special rating system adjustments or the inflation, then maybe you don’t win as much.
What could happen with the 100 point system? In reading about this on the wine side, a topic that comes up from various people is everyone standardizing on a scoring system. We all know that ain’t going to happen in the wine world and it won’t be happening in the cigar world, so we should only look at realistic options.
- Re-calibrating the scale
- To get back to the spirit of how the 100 point system was derived, it would be nice for particular scores to mean something again, and to do that, scores would need to ease back down to lower numbers. I understand that we live in a great time for cigars and tobacco, and processes are better than ever, but it’s hard to believe that the majority of all cigars reviewed are “Very Good.” Plus, if the inflation continues, outlets are going to run out of range as they bump up against the 100 point ceiling. At that point they have two choices: raise the ceiling or re-calibrate the scale.
- Widening the range
- In the results from the graphs, it’s easy to see that most outlets are stuck in a window of only four to seven distinct scores for the bulk of their reviews. With so many cigars sharing the same scores, how do you differentiate between them? It would really be helpful to get a feel, even in small ways, for how the cigars a reviewer covers stack up against each other.
In the end, this is really just me sharing my thoughts or complaining (however you choose to see it). I really enjoy reviewing cigars and a lot of what you’ve read here is why I wanted to use a different scoring system, to try to get away from a system that I didn’t think was working well and see if we could put something together that could express our thoughts in a different way.
2 comments
Arless - November 30, 2017
Aaron, I appreciate what you are trying to do here and value what you and Jiunn have done with your site. I do read reviews from all of the sources you list. It does appear that there is a hesitancy to rate a cigar lower than 90 when it perhaps should be. Manufacturers can be sensitive (you can understand why; at IPCPR this year that was on display), but a reviewer ideally should be able to insulate themselves somewhat, if they are consistent across the brand reviewed. Not every cigar from a favored manufacturer is going to be a good or even great cigar. I suspect there are always going to be “good,” “better,” and “best” from any line of cigars. The reviews that are inflated/skewed upward don’t adequately reflect that and make it difficult to determine the difference, as you stated in the article. This proves to be less than helpful to the consumers reading the magazine/blogs. I can see the point in just not publishing a rating if it is substantially lower than what a company typically puts out. If you think it is really a stinker, just don’t publish it. In general, I like the 100 pt system because it potentially allows for a comparison of a specific cigar that multiple sites have reviewed. Unfortunately, that is often not possible when the value they give each characteristic is either not spelled out or is so different from each other. We get the BEST OF lists every year (which I really enjoy) but not a comparison of the rest.
Aaron Loomis - November 30, 2017
Thank you Arless, we really appreciate the feedback.