I just read a report that concluded: "Most of the significant cyber-incidents . . . have had at their root cause defective and readily exploitable software code." This comes at a time when escalating threats, regulatory creep, increased software re-use and dramatic cost increases associated with a seemingly endless cycle of patches and vulnerabilities have made software quality, or surety, a critical issue.
Surety is the degree of confidence that a system attains its desired properties. Surety levels correspond to risk. High-surety systems are used for applications whose failure could lead to deaths or catastrophic financial loss. Medium-surety systems are used for applications whose failure could cause severe but survivable damages or losses. Low-surety systems are used for applications whose failure could cause lost productivity or minor damages.
Most commercial off-the-shelf (COTS) products are available only as low-surety offerings. Even most security software - such as domain controllers, automated patch management systems and most identity management systems - is low surety, because the market values low cost, flexibility, backward compatibility, features and schedule over security.
Vendors can do better. Well-known vulnerabilities such as buffer overflows can be largely eliminated using commercially available code-scanning services for as little as US$1 per line of code. Customers should demand at least this level of diligence from vendors. As an industry, we need to put more effort into researching and documenting secure development practices, training practitioners, changing licence terms to favour users over vendors and imposing consequences on vendors for software failures.
But let's face it: market economics aren't going to change altogether. And in the end, software surety can only get so far in the face of ever-increasing system complexity. Ten years from now, serious software quality issues will remain.
Practice basic security by following ISO 17799 and other management control standards. Demand higher software quality from vendors that write applications for higher-surety environments, or whose widespread operating system and security infrastructure software aggregates risk. Deploy a layered defence to protect against inevitable failures. If you avoid the worst consequences, address complexity and balance security with other business needs by working through the issues systematically, your organization will come out OK.
Daniel Blum is senior vice president and research director with Burton Group