When I heard that Internet Security Systems was bought by IBM, I thought, "Finally." Ironically, I was talking to an analyst friend (yes, I do have a few of them) a few days before that about what I thought would happen to ISS. It was no secret that ISS had lost its way, for lack of a better term. It was a true pioneer in the industry and was responsible for making some of the staples of security programs available to the masses. But things had changed.
I was one of ISS's first customers, and I knew Tom Noonan and Chris Klaus soon after they began the company. I was happy for them when they became multimillionaires at their initial public offering (and mad that I had turned down a job with them years earlier). However, it's no secret that, at least as of a few years ago, something just wasn't right.
Some of their better long-term employees started to move on around then, always a warning sign for observers. But I saw more concrete evidence of trouble while I was reviewing potential business partners during my time at Hewlett-Packard, when an ISS sales rep with no clue as to who I was set up my visit as an introduction to security technologies and monitoring. While I am not so delusional as to believe I am a household name, I have keynoted most major industry conferences and write extensively in the field. I've even been told that a well-known industry analyst, who is not a fan, sarcastically referred to me as one of the "one-named people in the industry." I'll take that as a compliment. If nothing else, my title at the time was chief security strategist for HP Consulting -- a clue that I didn't need to waste my time or theirs on why security monitoring was so useful. Do research? Check Google? Ask his boss? No, this guy was lazy, and he apparently felt that was acceptable at ISS.
The news also started me thinking about the reports and predictions for which analysts haven't been held accountable. For example, one analyst predicted a cyber Pearl Harbor by the end of 2003 or so. Hasn't happened. Gartner made lots of news by predicting that there would be some sort of public billion-dollar loss as a result of Y2K. Didn't happen (though as I mentioned in a previous column, that doesn't mean Y2K preparations were in vain). Likewise, if you review Gartner's so-called Magic Quadrants, you will see that many of the companies placed in the Magic Quadrant over the years have gone out of business or have been absorbed into other companies. Even Gartner backed off its own report of the demise of intrusion-detection systems.
On some level, it's impressive that people care about what analysts think. Analysts have visibility into many companies, largely because those companies pay to be visible to them. This is important for buyers to understand. Companies want to get on analysts' radar screens because a lot of IT executives pay close attention to what the analysts have to say. Analysts are supposed to do the research for potential purchasers of technology; buyers pay analyst firms for their industry reports in order to cut down on the research they would otherwise have to do themselves.
The problem is that the vendors also pay the analyst firms for their advice. They want to know how they should, in theory, approach the market, and what their competitors are up to. A lot of vendors believe that they have to "buy the research" so that they are covered in the analyst firm's reports and potential buyers will read about them there. The term conflict of interest comes to mind.
That conflict of interest might not be deliberate, but the reality is that analysts are extremely busy, and they only have time to look into the vendors already in front of them -- the ones they are advising. I actually appreciate analysts going out on a limb to make predictions that can be proven wrong; it shows they're exhibiting some independence.
But companies -- the buyers of technologies -- need to truly understand the business of the analyst firms. Not all businesses work the same way, and some are more independent than others. It should be a regular business practice to look at the track records of analyst firms. If your company is going to change its purchasing decisions based on an analyst firm's reports or its "newsworthy" observations, doesn't it make sense to look at its track record?
For example, I know many CIOs who held up multimillion-dollar purchases of intrusion-detection systems after Gartner proclaimed the death of that technology. The demonstrated power of one report is impressive, but the result was that organizations unnecessarily delayed protecting themselves. Likewise, many vendors were sent scrambling, all because an analyst said something and the market reacted.
You didn't need analysts to predict that ISS would be acquired. You just had to talk to its customers and past employees -- a much better indication of a company's stability than analyst reports. I realize that this information may be harder to come by, but the takeaway is that you have to consider analyst recommendations for what they are.
The security industry does owe thanks to Noonan and Klaus for accelerating the growth of the sector. Likewise, maybe someone will write a case study on the demise of ISS, and we can learn from their mistakes as well. Either way, the end of ISS is in many ways the end of an era.