This is from www.TrueDelta.com. Sign up with them if you haven't already.
Seven Serious Problems with Consumer Reports
I admire Consumer Reports as much as anyone. For decades they were the only source of vehicle reliability information. And even today they are the best source.
But even the best is not nearly good enough. In at least seven ways Consumer Reports' data collection methods or modes of presentation mislead or underinform consumers.
1. "Serious problems"
Consumer Reports' ratings are based on the number of "serious problems" reported by its members. I have searched in vain through their annual auto issues for a definition of what counts as a serious problem.
In contrast, TrueDelta will report measures like "times in the shop" and "days in the shop." These mean what they seem to mean. If a vehicle is in the shop for something other than routine maintenance, that's serious enough.
2. Relative ratings
Consumer Reports rates each model relative to the average vehicle. As a result, the absolute number of problems a vehicle will experience remains unclear. Does an "above average" vehicle "never break"? Is a "below average" vehicle "always in the shop"?
In the absence of hard numbers, people tend to assume that the best vehicles are better than they are and that the worst vehicles are worse than they are. I recently had a vigorous discussion with the owner of a Japanese SUV. As proof of his vehicle's superior reliability, he noted that it had been the highest rated brand in Consumer Reports' 2005 auto issue. This rating was based on 2004 vehicles, which were less (usually much less) than a year old at the time. His brand's cars had had eight "serious problems" per hundred vehicles. While this was less than half the eighteen problems per hundred domestic brand vehicles, the absolute difference was just one-tenth of a serious problem per car. Another implication: few (if any) vehicles are likely to have even one serious problem this early in their lives.
This did not--and does not--strike me as anything to get wound up over. The real problem: very few people who glance through the magazine think about the absolute numbers behind the relative ratings.
In contrast, TrueDelta will report absolute ratings in a form least likely to lead to misinterpretation.
3. Coarse rating categories
Consumer Reports rates models on a five-point scale from "much worse than average" to "much better than average" using their well-known red and black dots. More than half of domestic brand vehicle models earn an "average" rating, while many Hondas and Toyotas earn an "above average" rating. (With the average getting ever better, "much better than average" ratings have become increasingly rare.)
"Average" means within twenty percent of the average, so 80 to 120 on an index with 100 being average. "Better than average" ranges from 121 to 140. So if one vehicle is "average" and another is "better than average," then the difference between them can range anywhere from a single point--totally insignificant--to 60 points--very significant. The red and black dots appear simple to understand, but they conceal far more than the convey. As a result, many readers of the magazine understand far less than they think they do.
In contrast, TrueDelta will clearly report the absolute differences between vehicles. For example, analysis of the data might find that one vehicle over the first five years of ownership will take 2.3 extra trips to the shop, for a total of 3.6 extra days.
4. Only averages
The reliability of all vehicles has been steadily improving. Currently, even the average eight-year-old domestic brand model is reported (on page 17 of the 2005 auto issue) to have fewer than one-and-a-half "serious problems" per year. Yet most people would not buy such a car because they fear it will have "lots of problems."
While perceptions are undoubtedly distorted by Consumer Reports' emphasis on relative ratings, another factor is likely involved: people are afraid of getting a lemon, an unusually troublesome car or truck. Even if the average is the same for two models, the chances of getting a lemon could be far higher for one than the other. People might fear that even as the average rate of problems for domestic vehicles comes down the odds of getting a lemon remain uncomfortably high.
Based on Consumer Reports' reported results there's no way to know one way or the other, as they only report averages. To my knowledge, they have never discussed the odds of getting an unusually good or bad example of a particular model.
In contrast, TrueDelta will report the odds of getting a lemon and the odds of getting a perfect car (in addition to reporting the average number of trips to the shop and days in the shop).
5. Survey (in)frequency
Consumer Reports sends out an annual survey asking people to report problems that occurred during the entire previous year. This is too long a period to expect people to accurately remember what happened.
In contrast, TrueDelta will send a monthly email asking people to report trips to the shop. In most cases participants will still only have to fill out one or two brief surveys a year. So the effort will be the same or less. But their recall will be much more accurate.
6. Stale information
Consumer Reports mails out surveys each spring, then reports the results in its annual auto issue the following spring. As a result, when a new vehicle is introduced in the fall its reliability isn't typically reported until a year-and-a-half later. This is a long time to wait for someone interested in a hot new design; by the time its reliability is known it will no longer be hot.
A related issue: the vehicles reported on aren't as old as Consumer Reports suggests. For example, while "three-year-old vehicles" are, on average, three years old at the time the auto issue appears, they were only about two years old when the problems were reported, and only about one year old at the beginning of the period being reported upon.
In contrast, TrueDelta plans to update its ratings quarterly, and (given a large enough sample) will first report reliability four months after a new vehicle reaches dealers.
7. Failure to innovate
The last serious problem at least partially explains the others: Consumer Reports, once an innovator, has ceased to innovate. They have been reporting results much the same way for decades. The year-long lag between the surveys and the auto issue is likely an artifact of the past, when computers and the Internet were not around to speed the process. The same goes for continuing to rely on an annual survey.
Soon after I first conceived of a better way to study vehicle reliability I phoned the offices of Consumer Reports to freely offer my suggestions (though credit for them would have been welcome). The person who answered the phone responded that those in charge of Consumer Reports' vehicle reliability research were too busy to communicate with anyone from the general public. If I really wanted to make some suggestions (and this was not encouraged) I should type them up and mail them in.
Clearly I was speaking to a bureaucracy as ossified as any other. The only way this research was going to happen was if I conducted it myself.
If, like me, you want better reliability information, there will be only one way to get it free of charge--join the panel (via the gray buttons on the information page) and help make it happen.
Thanks for reading.
Michael Karesh, TrueDelta
First posted: September 5, 2005
Last updated: September 8, 2005