
When product reviews fail: An inside look at how we test DIY tech

If you've ever wondered about the product review process, particularly how products are included and evaluated in DIY-IT projects, this should interest you.
Written by David Gewirtz, Senior Contributing Editor

Image: Pavel Šeplavý, Getty Images/iStockphoto

A lot of what you'll read here in ZDNet's DIY-IT column revolves around projects. We dig into a bunch of tech topics to build, explore, discover, and understand how stuff works, what it can do, and how we can mash up all these things into a working solution.

At the core of these projects are the products that drive them. As many of you loyal readers know, I often dive deep into products to help you understand what makes them tick. One of the most common questions I get -- from both readers and vendors -- is how I approach reviewing products.

When I look at a product, it's always with the goal of learning and discovering what it can do, not with rating it on a scale. This is a very different approach from most traditional product reviews, and it's why you'll rarely see a numerical score or a bunch of stars next to a product I evaluate.

It's not that my approach is better. I can't tell you how many Amazon products I've bought based on the number of stars. Instead, most products I look at are intended to fit into a project, or an area of exploration, rather than stand on their own.

For example, many of you have read our 3D Printing Discovery Series. When I started that series, I had never even seen a 3D printer, so rating one would have been a supreme act of hubris. Now, 30 articles or so later, I know a lot about what to look for in a 3D printer. I still won't give them star ratings, but I can tell you that if you want to build a certain kind of project, you should look for a certain kind of feature in a printer.

Another example is my exploration of storage. My daily work requires a relatively vast, and constantly expanding, array of storage. I need to store and evaluate a tremendous amount of test data and telemetry, and I manage a huge media asset library and a growing library of 4K video. All told, I have a couple hundred terabytes of storage here at Camp David.

Given my interest in storage solutions, I've started bringing in lots of different devices. When I started to dig into Synology's NAS, I didn't just write a quick review. Instead, I wrote three detailed articles about three different aspects that differentiated the company's offering.

There are a bunch of new NAS devices on the way. I'll be able to look at them, and relate them all together, so you can think about the criteria by which you choose a device.

Our story begins

To illustrate the review process, and what it means for you, we'll use a failed product review as an example. After all, one of the best ways to learn is to explore our failures.

Our story begins with an intriguing storage device, a very small SSD RAID box that you could carry around with your laptop and not notice the extra heft.

This versatile device was a direct-attached storage RAID box, with the ability to choose between mirroring, striping, and individual drive access. I was particularly curious to see if striping mattered with SSDs, especially over USB, so I accepted the vendor's offer of a review unit.

Striping was originally developed for spinning hard drives: writes alternate between the drives, so one drive can accept data while the other waits out a platter's rotation. Given that SSDs have no rotational latency, I really wanted to know if there was any benefit to striping them.
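
If striping and mirroring are new to you, here's a rough sketch of the difference, in Python purely for illustration. Real RAID lives in controller firmware or the operating system, not in application code, and the "drive" objects here are hypothetical stand-ins for two physical disks (anything with a write() method will do).

CHUNK = 64 * 1024  # a typical stripe size: 64KB

def chunks(data, size=CHUNK):
    for i in range(0, len(data), size):
        yield data[i:i + size]

def write_striped(data, drive_a, drive_b):
    # RAID 0 (striping): alternate chunks between the drives, so each
    # one handles half the work (and, on spinning disks, half the waiting).
    for n, chunk in enumerate(chunks(data)):
        (drive_a if n % 2 == 0 else drive_b).write(chunk)

def write_mirrored(data, drive_a, drive_b):
    # RAID 1 (mirroring): every chunk goes to both drives, so either
    # drive alone holds a complete copy.
    for chunk in chunks(data):
        drive_a.write(chunk)
        drive_b.write(chunk)

The thing to notice is that striping halves each drive's workload but offers no redundancy, while mirroring doubles the writes in exchange for a complete backup.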

The problem was, I couldn't get the drive to mount. I thought it might have been the RAID settings, but after quite a few back-and-forth emails with the vendor, we still didn't have an answer. Eventually, the company sent me a new drive, and I shipped the original unit back to them.

The replacement drive did mount. I was able to configure it in all three RAID modes without difficulty. But when I did some speed testing, I noticed that I was getting, barely, USB 2.0 speeds: USB 2.0's 480Mbps link works out to roughly 35 to 40MB/s in practice, while USB 3.0's 5Gbps should deliver several hundred. The drive was dog slow.
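
If you want to run the same kind of sanity check on your own hardware, here's a minimal sketch of a sequential-write test, again in Python for illustration; the mount point is hypothetical, so substitute wherever your drive shows up. Sustained results around 35MB/s on a USB 3.0 or USB-C port are a strong hint the link has negotiated down to USB 2.0.

import os
import time

TEST_FILE = "/Volumes/TestDrive/speedtest.bin"  # hypothetical mount point
BLOCK = 1024 * 1024  # write in 1MB blocks
COUNT = 1024         # 1GB total, enough to get past small caches

buf = os.urandom(BLOCK)  # incompressible data, so the controller can't cheat

start = time.time()
with open(TEST_FILE, "wb") as f:
    for _ in range(COUNT):
        f.write(buf)
    f.flush()
    os.fsync(f.fileno())  # force the data out of the OS write cache
elapsed = time.time() - start

os.remove(TEST_FILE)
print(f"Sequential write: {COUNT / elapsed:.1f} MB/s")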

Now here's where it gets interesting. The drive is a USB-C device. It was originally sent when I thought I'd have a new 2016 MacBook. When I cancelled that order, I discussed with the vendor whether I should continue the review. The vendor assured me that their USB-C to USB 3.0 cable would work, and performance should be roughly the same.

It wasn't. As I mentioned, I was getting really bad speed results. I tried the drive on a number of different Macs, just to be sure. The results were still bad.

So, once again, I contacted the vendor. They insisted they were getting much better results with the same configuration. We continued testing, but I kept getting the same terrible results.

It was at this time I decided to return the drive. Even though I'd put in a lot of time, and had already done a photo shoot with the product, I decided to skip doing a review.

At this point, many of you might be asking: why didn't I just do the review, write some scathing commentary about why the drive sucks, and save consumers from buying this device?

The answer to that is complex, but important.

First, I can't be absolutely sure that the product is bad. Sometimes, I can definitely ascertain that a product shouldn't be on the market, but that wasn't the case this time. As I recently wrote, I've been having a bunch of problems with macOS Sierra, so I did not have enough confidence in the failure point to blame the vendor.

However, I feel reasonably safe telling you that trying to run a USB-C device on a USB 3.0 interface (or vice versa) is probably not as turnkey reliable as vendors have implied. As you move to USB-C, keep in mind that there very well may be the odd adapter problem here and there.

Kindness to manufacturers

The second answer to why I didn't write a scathing review is my long-held policy of what I call "kindness to manufacturers." A lot of reviewers love going for the jugular. It generates traffic, and some writers believe that to be a reviewer you have to be a harsh critic. That's not my style.

Part of the reason is that I was a product developer and product marketing guy for many years. I know just how incredibly hard it is to bring a product to market. I also know how devastating it can be to have a brutal review, especially if there's a workable approach that wouldn't have resulted in that review. I'll talk more about this in a minute.

In the case of the SSD RAID drive, I was approached by the manufacturer. I've reviewed other products of theirs before; the company has been around for years. Their products have been well received by users and reviewers alike. The company is ethical and well-respected.

I'm telling you this because I always take the vendor's background into consideration when looking at a product. I'll think differently about a vendor I've known for years than I will someone who is new to the scene. Vendors who have sterling reputations are expected to meet a different bar than those who are just starting out.

Think about Photoshop, for example. A discussion of a new Photoshop release will have a very different tenor than a discussion of a new photo editing app from a new startup. Both are worthy of exploration, but Photoshop is an old friend. We know its ins and outs, so how we describe what it can do, or what a new release adds, will be different from some neat new tech from the new startup's app.

Trustworthiness comes into play as well. Reviewers know (or, at least they should know) that what they say can impact millions of dollars in sales. Reviews can increase the confidence of buyers and thereby drive sales, or they can have a negative impact, potentially hurting a company's long-term prospects.

When negative coverage is necessary

I had an experience last fall with a company, TrackR, that did not conduct itself ethically. I bought a product, it was not delivered, and the vendor misrepresented its delivery status over a period of months. When I could not get either the product or a refund, I wrote a detailed story about the situation.

I did this because I felt readers needed to be protected from the company. But here's the thing: the company also needed to learn to improve its systems. After the article came out, I had a very nice discussion with TrackR's founder; before that, my emails had gone unanswered.

From my discussion, I got the genuine impression that the company wasn't trying to rip customers off. Instead, it seemed much more like TrackR was simply a new company that hadn't properly built out its internal systems. Yes, that resulted in a blistering article, but it may also have been the wake-up call needed to drive improvement into the company's DNA. We'll see.

Fairness is important, so I updated the original article with a letter from TrackR's CEO, indicating that the situation had been addressed and measures had been taken to improve the company's systems. Negative coverage ended with a positive conclusion for everyone.

Products that fail

I experience a lot of product failures, both in products I buy and in products vendors provide to me. Sometimes it's because the product isn't ready for prime time. Sometimes it's because I'm trying to do something that pushes the limits of product performance.

Here's an example of the second case. When I got my iMac a few years ago, I knew I wanted to add a bunch of monitors to it. I knew I could run two or three, but I wanted to take it all the way to six. To do that, I used a variety of interface devices.

When I hit a limit at four displays, it wouldn't have been right to blame the interface products. I was just pushing my luck. That's what experimentation and exploration are about. A big part of the DIY-IT ethos is questioning just how much we can get out of our gear.

Other products fail because they're just not ready, at least for the performance or production use I want to put them through. In the fall, I brought in two video broadcasting products for use in my broadcast studio project.

Both had just had major version upgrades -- and both failed in one way or another. You know you're in for a bit of fun when a product manager tells you, "Uh, it's not supposed to do that."

Since I couldn't get them working for my project, I didn't write about them. It wasn't appropriate to write a negative review because I hadn't tested all the use cases. Instead, I tried to incorporate them in a project. They didn't work in that context, and I moved on.

Unfortunately, this means I spend a lot of time wrangling vendor relationships, testing products, and sometimes not having anything to show for it. But all that work, whether published or not, helps me understand more, and helps me provide you with guidance on what works and where you might be pushing your luck.

For example, it's entirely possible that one of those video products didn't work as well as it should have because the vendor recommended a Mac Pro, and I was trying to shoehorn it all into a Mac mini. It would have been cool if it had worked, but the project didn't justify a $7,000 computer, and it wasn't the vendor's fault that the product didn't meet my needs on lesser hardware. That also didn't justify a negative review.

Other questions I'm often asked

This article is getting long, so I need to wrap it up. Before I do, let's do a quick lightning round of questions I'm often asked.

Do you get paid to write positive reviews? I don't get paid by vendors at all. I work for ZDNet, and ZDNet pays me, without regard to the products I cover.

Do you get all these products for free? Depends. I buy a lot of what I use, but there's no way I could buy everything I write about, or justify a purchase just to explore it.

Do you contact the vendors or do they contact you? Both. For a big 3D printing project I'm doing, I asked a filament vendor if they'd donate some filament. For the RAID SSD (and a bunch of upcoming NAS articles), I was contacted by the manufacturer.

Do you always write long reviews of products? No. Some products get included in gift lists and galleries. Others, with a lot of exploration potential, get covered in depth.

Do you guarantee good reviews to get the products? No. I make it clear to vendors that there's no guarantee, and that unless I think a company is being predatory, I'll tell them why their product isn't ready for prime time rather than beat on them unnecessarily.

What do you do with the products once you review them? Some become part of an ongoing series, like the 3D printers. I'm doing lots of comparisons. Some are returned to their vendors. Some are donated.

Do vendors ever complain after a review? Not too often. But companies do reach out if they've fixed something, if they want clarification for their internal processes, or if they want me to correct a genuine error (which does happen, thankfully rarely). I've spoken with the leaders of some of the most amazing companies.

Why don't you give star ratings? One reason I don't do star or numerical reviews is that vendors get very fussy about four stars versus five, even though the "grade" itself doesn't really matter. I'd rather talk about how to think about the product than try to quantify its value.

How can I get free review products, too? Start by writing reviews of products you have, publish those on your blog, and work your way up to writing for a recognized outlet. Check into Amazon's review programs. Or take the YouTube route and build an audience.

If I want something reviewed, can you do it? If you're a vendor, and your product or service is interesting, worth exploring, and fits the DIY-IT theme, your best bet is to email me. If you're a reader and want to know more about a product, let me know.

What's the weirdest product you were ever sent? I've been sent a lot of unsolicited items that just landed at my office door, but the weirdest had to be an iPod-driven sex toy. That one got donated, unopened, to the local Goodwill. I'm sure hilarity ensued.

That's a wrap, folks. Keep reaching out through Twitter, Facebook, YouTube, the comments below and the email form provided by ZDNet. Let's keep exploring, learning, and discovering together.

You can follow my day-to-day project updates on social media. Be sure to follow me on Twitter at @DavidGewirtz, on Facebook at Facebook.com/DavidGewirtz, on Instagram at Instagram.com/DavidGewirtz, and on YouTube at YouTube.com/DavidGewirtzTV.
