
After reading Kevin Pace’s Hot or Not? Consumer Reports post — and the great comments it’s generated — our friend Bob Markovich over at Consumer Reports dropped us a line to respond. Read on past the jump for the note he sent us last week:

Bob’s note:

Let me begin by saying you’ve got a great site with some knowledgeable and passionate contributors. As Consumer Reports’ Home & Yard Editor, I spend much of my time with our testers and am typically the first to know what they’ve found out about the latest cordless drills, mowers, leaf blowers, and other homeowner tools. We’re also eager to find out what we’re doing right, how we can get better, and what we need to improve.

Several comments on our testing and reliability surveys went beyond simply tools. Because I’m also involved with planning our coverage for washers, grills, vacuums, deck stains, and a slew of other non-tool products, I thought I’d answer some broader questions, clear up a misconception or two, and tell you where we’re working to improve:

We don’t rate products based on price. Nor do we include our survey results about brand reliability in our Ratings performance scores, which are based purely on how models fared over weeks or months of testing (not just one week). But we DO consider price and (when available) brand reliability for our Recommended and CR Best Buy models to highlight the ones that combine a good brand-repair history with the most performance and value. Our cordless-drill coverage (November 2008) included top picks for both general and heavier-duty use, with CR Best Buys in each category.

Long-term durability tests? Not usually, but we give products a workout. We do durability testing for treadmills, and we test exterior paints for a full three years outdoors 24/7, since, for finishes, longer life is everything. We’ve also found ways to do life tests for car batteries and wear tests for flooring quickly and efficiently. But with today’s faster-paced model turnover, longer tests for some products would probably mean they’d be off the shelves by the time we were done. So we work to make our tests challenging yet real-world. For drills, that involves driving thousands of lag screws, among other tasks. Those tests have proven tough enough to kill “heavy-duty” lithium batteries, fry clutches, and melt the solder in contractor-grade models. Our continuous testing is also getting more models into our continuously updated Ratings far more quickly.

Our brand-repair surveys aren’t self-selecting or warranty-based. We invite millions of subscribers via regular mail and online to answer questions designed to prevent skewed data based on a product’s age and usage as well as a brand’s market share. Data are based on models that were repaired or had a serious problem—not on warranty claims. Our minimum sample size is 100 cases for each full year in the analysis (most cover at least 3 years). We also verify that no one is stuffing the ballot box with multiple questionnaires. In contrast, user reviews ARE self-selecting and tend to reflect extremes on the satisfaction scale—and you don’t always know whether the writer is an actual consumer or a company shill. But those reviews are an increasingly relevant leg on the stool, which is why we’re looking to grow them on our site, with the above caveats.

We do disclose how we test, but there’s room for more. We include a Guide To The Ratings right next to the chart in Consumer Reports magazine product reports and on Consumer Reports online. We’ve begun highlighting our tests and testers far more in our reports and are working on more online test videos aimed at the newest and/or most touted products. As for which features are most important, we include the ones readers tell us they want, among others. For drills, those include two batteries and quick charging, multiple speed ranges, an adjustable clutch, and a ½-inch chuck. We could do more on heavy-duty features and how they figure into pricing. But as we’ve seen, those don’t always guarantee a better drill just as “pro-style” doesn’t guarantee a better stove.

And of course, our buy-what-we-test, take-no-ads policy makes it easier to call things as we see them. But there’s always room for improvement. I hope this helps.

Whether Bob’s response changes your “hot,” “not,” or “warm” opinion, it does prove one fact: Consumer Reports reads Toolmonger and heard your cry.

 

9 Responses to Consumer Reports Responds

  1. tooldork says:

    “We don’t rate products based on price.”

    I respectfully disagree.

    I did an analysis of their ratings for composite decking products and translated the ratings to numeric values to try to dissect the overall rating value.

    10 = excellent
    9 = very good
    8 = good
    7 = fair
    6 = poor

    Based on those numeric values, not one product’s individual ratings added up to its published overall score. But when comparing two products whose individual ratings were equal, the one that was priced lower was bumped up to a higher overall score.

  2. Jim German says:

    “exterior paints for a full three years outdoors 24/7” Ohh well, that’s great to hear, since I know I don’t bring my siding in at night or on the weekends.
    *eyeroll*

    Until Consumer Reports gives a detailed breakdown of how they score products (something at least as good as Car and Driver does for their scores), their ratings will be pretty much useless.

  3. Barri says:

    I think the best way to find out the quality of a product is to GOOGLE it. You will find more honest reviews of products than you could ever find in a mag. If a high % of reviews are good, you know you’re buying a good product. Simple, and it has worked for me for over 10 years.

  4. Zathrus says:

    @tooldork:

    Which proves what? Nothing… it merely means that the full scoring strategy is not printed and that your attempt to “decode” it was a failure. They do not print the full scoring methodology or break it out on individual products. The bubbles are not meant to be a hard and fast number, but a general indicator of how a product performed in that area. One “excellent” is not equivalent to another “excellent”.

    @Jim German:

    Granted, it’s a “duh,” but what other publications or review sites do anything even close? The real issue is that by the time the 3 years are done and the report is published you have no way of getting the exact formulation that was tested… and from looking at year-to-year tests, it’s pretty common that a paint/stain that rated top one year will be average or even below average the next. Some are repeatedly in the top quarter though, so you can pick any of those and feel pretty good about it.

    And to the original article:
    “Our brand-repair surveys aren’t self-selecting or warranty-based. We invite millions of subscribers via regular mail and online to answer questions”

    Oops. You’re right — I blew the definition of a “self-selected survey” there. The real question, however, is what is the response rate on your solicitations to participate?

  5. Aaron J. says:

    Wanted to add to the above comment on using the ever popular Google.

    Try searching for the product with the word “sucks” in the form of a phrase, such as:

    the dremel tool sucks, Craftsman sucks, lowes sucks, toolmonger sucks 😉

    You can probably find someone using that phrase for even the best products out there, but the real score here is that it filters out the dozens of portal sites that come up when you try to find ratings and reviews.
    You know, the ones that just have reviews of the web store selling the product.
    And obviously if you find lots and lots of people complaining about your product that’s gotta tell you something.

  6. Dexm says:

    I like what the guy from CR wrote here. I think it’s a salient observation on reviews found on the net:

    >> We also verify that no one is stuffing the ballot box with multiple questionnaires.
    >> In contrast, user reviews ARE self-selecting and tend to reflect extremes on the
    >> satisfaction scale—and you don’t always know whether the writer is an actual
    >> consumer or a company shill. But those reviews are an increasingly relevant leg
    >> on the stool, which is why we’re looking to grow them on our site, with the above
    >> caveats.

  7. frankoamerican says:

    The take-away message about CR testing paints outdoors 24/7 is that it is real world testing–not simulated in a laboratory environmental chamber.

    For what it’s worth, and I swear on a stack of woodworking magazines that I’m not a shill from CR but a chemist, the testing they do is first rate. Their experimental procedures and attention to detail are impressive. Remember, though, that there’s always a critic, no matter what you do.

  8. Bob Markovich says:

    Appreciate all the comments on my comments. One other word about one comment that insists our overall scores include a price or value quotient: They don’t. Period. But, yes, our picks and certainly our CR Best Buys take price and reliability into account, as they should.

  9. markwlewis says:

    Look, Consumer Reports is not perfect. They know it. These days companies can make a perfectly fine product one year and a pizzacrap the next. Example: I have been working on a dryer for weeks and months.

    It is a Kenmore/Whirlpool. It has a solid state control panel that was (gently put) problematic. (Less gently, a poorly engineered pizzacrap.) I am not sure how CR testing could have seen this coming, as it has taken years for people to realize what a pizzacrap it really is and how large the pool of machines covered by that hard fact is! (You can do a search if you are curious.)

    Noteworthy NOW at this late date is that most every dryer (not in the pricing stratosphere) has MOSTLY MECHANICAL controls. I kid you not, I have seen a hundred or more dryers 800 dollars and below and NONE had a single integrated control panel (one piece) like the multitude of machines that were produced for 3-4 years. No shorter-term testing would point that out, and CR could not be blamed for that. SO the tyranny of the masses still has a VALUE, but not so much in immediate purchase feedback. I have had people giving top ratings to durable goods when they had them LESS THAN A WEEK!

    Ummm, thanks but no, you can’t tell me what a terrific vacuum a certain model is in one week without qualifying it by saying “Of course in a month I may think it is a pizzacrap when it stops working 3 times and breaks the belt twice.”

    But, the ratings are still compiled, say on Amazon, as if this review of a “real user” is an across-the-board endorsement: “Hey, these people think it deserves 5 stars!” This info is valid, but most so when combined with a more methodical approach like testing something to the point of failure, which, honestly, I ain’t doing, at least not on purpose (lol).

    That is what CR does, and I appreciate getting both opinions. Even the historical reliability ratings help, but cannot be counted on as the last word. The company that got low marks last year may have been absorbed by a BETTER QUALITY competitor and now be making products of increased quality! (Let me have my DREAM, you jackals!)
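For what it’s worth, tooldork’s decoding attempt (comment 1 above) can be sketched in a few lines of Python. Everything below is hypothetical: the product names, category ratings, prices, and overall scores are invented for illustration, since CR doesn’t publish its full scoring formula.

```python
# Sketch of tooldork's decoding attempt: map CR's rating labels to
# numbers, sum a product's category ratings, and compare against the
# published overall score. All data here is made up for illustration.

SCALE = {"excellent": 10, "very good": 9, "good": 8, "fair": 7, "poor": 6}

def decoded_score(ratings):
    """Sum the numeric equivalents of a product's category ratings."""
    return sum(SCALE[r] for r in ratings)

# Two hypothetical decking products with identical category ratings...
prod_a = {"ratings": ["good", "good", "fair"], "price": 3.10, "overall": 23}
prod_b = {"ratings": ["good", "good", "fair"], "price": 2.40, "overall": 25}

for p in (prod_a, prod_b):
    p["decoded"] = decoded_score(p["ratings"])  # both decode to 23

# ...yet the cheaper one carries the higher published overall score,
# which is the pattern that led tooldork to suspect price feeds into
# the overall number (Zathrus's counterpoint: the labels may simply
# not map to fixed values at all).
print(prod_a["decoded"], prod_b["decoded"])
print(prod_b["overall"] > prod_a["overall"] and prod_b["price"] < prod_a["price"])
```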
