Let's say you spot a house that's on fire, and you call the fire department. How long should it take the fire department to answer the phone? A few seconds? A minute? Or suppose you find a ticking bomb and call the police. How long should you have to wait on hold before you can report it? A minute? Two? Five?
How about a week?
Late last month, two security researchers proposed that when a software vendor is informed of a security hole, the vendor should have seven days just to acknowledge being notified. Not to fix it or even discuss it, just to say, "I got your message."
And then, according to the "best practices" proposal submitted to the Internet Engineering Task Force (IETF), the vendor should fix the problem within 30 days. That is, unless the vendor gets a "grace period extension" from the person reporting the problem.
Let's translate that into plain English: Under these best practices, vendors can spend a week pretending they haven't heard anything about a problem, then take as much extra time beyond 30 days as they can get by arm-twisting, cajoling, browbeating or threatening whoever reported the problem.
Sure, that person can always go public with the information and risk being stomped into a greasy spot on the pavement by a platoon of the vendor's lawyers.
Best practice? It sure sounds good for vendors but not for anyone else.
Steve Christey of The Mitre Corp. and Chris Wysopal of @stake Inc. should know better. They're certainly not naive. Their proposal spells out the varying motives and options that both vendors and bug reporters may have (read it yourself at www.ietf.org/internet-drafts/draft-christey-wysopal-vuln-disclosure-00.txt).
So why give vendors a week just to check their mail? Especially considering that Wysopal's own company, @stake, has a policy on dealing with security holes that gives vendors just two business days to respond and two weeks to fix the problem?
Maybe Christey and Wysopal thought they needed a looser policy to make it more palatable as an IETF standard.
If that's what they were thinking, they weren't thinking it through.
A best practices document from the IETF won't be treated as the lower limit of what a vendor should do. It will be treated as the upper limit of what a vendor is required to do. That means this standard should be as tough as possible, not watered down.
Even then, a best practices standard can't be used to force a vendor to do anything. But it can be pointed to by a vendor's lawyers if they claim that a bug-finder damaged the vendor's business by going public instead of granting endless extensions.
So because the IETF is a recognized standards body, a lame set of so-called best practices actually gives vendors one more thing to hide behind if they want to drag their feet and dodge responsibility or even threaten the people who find bugs.
If that's what this best practices standard does, we're better off without it.
Too many software vendors already treat a security hole mainly as a public relations problem. They dodge, they deny, they delay and all the while, IT people are the ones facing the consequences.
We need vendors to treat a security hole like a burning house or a time bomb that could go off at any second. In other words, like a real threat.
But that will happen only through real pressure from customers and security experts and, yes, organizations like the IETF.
So if the IETF wants to define how to deal with security holes, let's just be sure it's lighting a fire under the people who should fix them and not letting the house burn down.
Frank Hayes, Computerworld US's senior news columnist, has covered IT for more than 20 years. Contact him at firstname.lastname@example.org.