Our intrusion-detection test published in June seems to have touched a raw nerve. Few responses were neutral, with most end users loving it and many vendors expressing . . . well, something other than love.
In the test, we found that seven commercial intrusion-detection systems (IDSs) and the open source Snort package generated way too many false alarms and made it far too difficult to understand the security landscape.
At last count before press time, we had an amazing 56,722 page views on the story. That averages out to 873 page views per day, with the banner day being July 11, when the good folks over at Slashdot.org posted a link to the story, resulting in more than 23,000 visits during that 24-hour period.
Most of the fan mail was a variation on "I had the same problems. Thanks for showing I'm not crazy." Happy to provide the sanity check.
The other mail generally fell into one of the following categories:
You should have turned off false positives as soon as you identified them as such.
We tried. Sometimes even a severe reduction in the number of monitored events didn't help. In other cases, no instructions were available from the user interface, the documentation or even the vendor's support personnel.
However, I'm far from sure it was our job to disable the false positives. My coauthor Joel Snyder put it best: The state of the IDS art requires that "for every single attack signature, I should understand whether it is relevant to my network, which ports it should be looking on, and where I care that it look. . . . An IDS that makes me know all that is simply dumb; it's not doing its job."
You idiots don't understand IDSs.
Call me all the names you like, but leave my coauthors out of it. Rodney Thayer has done cryptography for 25 years, and he's written Requests for Comments (RFCs) on IP Security and OpenPGP. Snyder has trained legions of NetWorld+Interop attendees on firewalls and VPNs, he's an organizer of the VPNcon trade show, and he runs a sizable ISP to boot.
I was privileged to work with these two experts, but even they were surprised when we were criticized for evaluating IDSs from the perspective of enterprise network managers rather than security specialists. As Thayer put it, "There's no big 'for security experts only' label on any of these products."
And speaking of smarts, all three of us found the products lacking in that regard. The Code Red vulnerability of Microsoft Corp.'s Internet Information Server (IIS) offers a good case in point. None of the IDSs we tested could distinguish between a Code Red attack on a system not running IIS, a Code Red attack on a patched IIS system and a successful penetration. That's a basic design flaw in the products, not a reflection on users' security expertise.
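The missing smarts amount to correlating a signature with what's known about the target before deciding how loudly to alarm. A minimal sketch of that idea, in Python, is below; the function and the asset-inventory record are hypothetical illustrations, not taken from any product we tested.

```python
# Toy sketch (hypothetical, not from any tested product): grade a
# Code Red-style alert by what is known about the target host,
# instead of firing the same critical alarm in every case.

def classify_alert(signature: str, target: dict) -> str:
    """Return an alert severity based on target context.

    `target` is a hypothetical asset-inventory record, e.g.
    {"services": {"iis"}, "patched": {"iis"}}.
    """
    if signature != "code-red":
        return "unknown-signature"
    if "iis" not in target.get("services", set()):
        return "info: attack against host not running IIS"
    if "iis" in target.get("patched", set()):
        return "low: attack against patched IIS"
    return "critical: possible successful penetration"

# The three cases from the test, which should be reported differently:
print(classify_alert("code-red", {"services": set()}))
print(classify_alert("code-red", {"services": {"iis"}, "patched": {"iis"}}))
print(classify_alert("code-red", {"services": {"iis"}, "patched": set()}))
```

None of the tested IDSs made even this coarse distinction, which is why every probe looked like an emergency.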
Snort does have a graphical user interface (GUI), and we can't believe you misconfigured it.
Guilty as charged. Our intent in including command-line Snort was to provide a comparison in efficacy (not ease of use) between commercial and open source tools. On this score, Snort did quite well.
There are indeed numerous GUIs for Snort, including Acid, Demarc, Puresecure, Snare, Razorback and SnortSnarf. There's also Sourcefire, a commercial IDS appliance with Snort at its core.
We run Vendor X's IDS on our network, and it works fine. Your test was unrealistic.
While we're happy that you're pleased with your IDS, our test bed was as real as it gets. We put the IDS sensors on a live segment of an ISP carrying an average of 9M to 12M bit/sec of traffic from corporate customers. That's not a huge traffic volume; plenty of corporate networks use considerably more bandwidth. Further, the users were corporate customers, not a bunch of residential users downloading MP3 files. We also used a mix of popular operating systems on our "sacrificial lamb" machines.
"Real world" is a tricky and dangerous concept when it comes to testing. There's no such thing as a one-size-fits-all definition of "reality" that will satisfy all test cases.
We read your article with interest. We offer a widget that is absolutely guaranteed to avoid the kind of problems you encountered.
We'd like to hear more from you and validate your claim in future tests.
What should our next test look like? We plan to test these products "in the wild" in 2003 and welcome your input regarding test methodology. We'd like to hear from you - love letters or not.
Newman is president of Network Test in Westlake Village, Calif., an independent benchmarking and network design consultancy. He can be reached at dnewman@networktest.com.