Testing is hard.
For a while there I thought of making that the end of this blog post, but I guess I should elaborate a little. Testing is hard, whether you are a vendor looking to do QA, an independent test lab doing competitive analysis, or an end-user trying to decide which product to buy.
Good test plans are difficult to draw up, and solid methodologies are difficult to create. End-users often use independent reports to create short-lists before doing their own in-house testing or proof-of-concept projects, which is why vendors get so upset when they don't do well in such reports. That's understandable, but what a vendor does next is often a good indicator of character.
The first thing to do, of course, is to verify that the problems highlighted in the report are genuine. Vendors should work with the test lab wherever possible and be prepared to do so with an open mind, not get all defensive about the fact that their precious product has a flaw. If the test lab can show you time and time again (live or on video) that they owned a target host protected by your product, then you probably have an issue that needs fixing!
Secondly, dedicate some resources to fixing the problem rather than generating marketing FUD to disguise it or deflect attention away from it. Yes, this costs money, whether you do it all in house or engage the test lab to help. Don't expect someone else to fix your product for free!
Third, bask in the glory that comes with fixing a problem quickly and professionally, thus leaving your customers exposed for the minimum possible time.
What you SHOULDN'T do is shoot the messenger!
I have seen three examples recently of vendors going on the attack straight away when they don't like what is in an independent report - one in the IPS area, one in Web Application Scanning, and one in AV.
In each case the vendor in question launched public attacks on the various test labs, one of which led Mike Rothman of Securosis to predict the death of product reviews. I think Mike is wrong in this dire prediction, and end-users had better hope that I'm right, because such reviews - when done well - are all that stands between the purchaser and all that vendor hype. That and a Magic Quadrant!
Of course, the vendor is entitled to put forward its point of view. Weak methodologies are not difficult to spot, they can do more harm than good, and sometimes the only recourse a vendor has is to refute the results publicly.
But when you have been caught out, when your product has been shown to have a repeatable flaw, posting falsehoods and ad hominem attacks in an attempt to discredit the report, the methodology, and the engineers who carried out the tests is simply not professional.
The problem is, if the test lab in question DIDN'T foul up the test, you are going to look pretty stupid when they are forced to reveal more and more of the problem in order to dispel your FUD attack. And your customers are going to be upset too, as you dedicate marketing resources to hiding an issue better addressed by engineering resources.
If you are a customer of a vendor who engages in these tactics, I would encourage you to make every effort to talk to whoever produced the report that upset them. Try to understand the problem, and make sure that it doesn't affect you. If it DOES affect you, see if they can help you reproduce the tests in your own environment (if it is not too dangerous to do so). At that point you can go back to your vendor with some concrete data, and you will also be in a position to verify any fixes they release for the problem in the future.
I have a series of research notes in the pipeline right now on testing: what you should know, and how to do it properly. It strikes me they are sorely needed!
Wednesday, March 17, 2010