Brian Valentine, senior vice president of Microsoft Corp.'s Windows division, also serves as an executive sponsor of his firm's Trustworthy Computing security initiative. Valentine is responsible for making sure internal engineering process changes happen throughout the company. He also deals with Microsoft's enterprise corporate customers on security-related matters.
Computerworld recently spoke with Valentine about Microsoft's progress with its Trustworthy Computing initiative, which officially launched earlier this year. The initiative involved not only training and code review, but also the addition of accountability. Valentine said every line of Windows code now has an identified owner who is responsible for the security review of that code. Security milestones also have been added to the internal product development process, so that work cannot proceed until the checks are done.
Q: Can you update us on what's happened so far with Microsoft's Trustworthy Computing initiative?
A: First, it was mandatory training during the month of January for every Windows engineer, actually every Windows employee.
Q: How many?
A: We've trained almost 10,000 people now inside of Microsoft. ... Through the Windows process, we did 8,000. I said that, including myself, every employee in the Windows division at Microsoft will be trained in this mandatory training on security by the end of January or you're no longer an employee in the Windows division.
Q: Can you describe the training?
A: Oh, it's a full day, pretty intensive training session around -- first of all -- where common coding mistakes are made. ... It was developed by a person named Michael Howard, who's been my key kind of security penetration testing person within Microsoft. And he actually wrote a book called Writing Secure Code, and the course is really developed from the book.
There's a section on simple coding mistakes and how not to make them. ... Then there's a whole [section on] complex coding mistakes, which really go back to design issues and those type of things, and how to look at designs and do threat analysis of those designs. ...
Another one is how to change engineering [processes] to actually design for security upfront and then manage security as you develop your products. So what kind of milestones do you need and what kind of analysis do you need at what points in time? ... And then what should testers, the quality folks, think about in doing automated testing and automated things around testing for it? So it's a pretty intensive, internal engineering training course.
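The interview doesn't reproduce any of the course material, but the "simple coding mistakes" such training typically targets are epitomized by unchecked string copies in C. As a hypothetical illustration (the function name and buffer sizes here are invented, not taken from the course):

```c
#include <string.h>

/* Hypothetical illustration of a "simple coding mistake":
 * strcpy(dst, src) copies until the NUL terminator with no bound,
 * so untrusted input longer than dst silently overruns the buffer.
 * The safe pattern caps the copy and guarantees NUL termination. */
void bounded_copy(char *dst, size_t dstsize, const char *src) {
    if (dstsize == 0)
        return;
    strncpy(dst, src, dstsize - 1);  /* never write past dst */
    dst[dstsize - 1] = '\0';         /* strncpy may omit the NUL */
}
```

The point of the reviews Valentine describes is to catch the unbounded version of this call before it ships, not after.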
Q: Do you think a single day of training is enough?
A: Absolutely not. That gets you started. ... Everything we learn and what people discovered in doing that security push that wasn't in the training, we're now going back for a second round of mandatory refresher course. ... As people popped up and started asking questions and learn[ing] things, we collected all of that and now built [it] into kind of the second refresher course.
And then there'll be a mandatory day of training every year from now until probably eternity because the issue doesn't go away.
Q: One of the first major vulnerabilities discovered in Windows XP related to the Universal Plug and Play (UPnP) feature. Several security experts questioned whether that feature should have been enabled in Windows. Do you think that would have happened today?
A: Oh, absolutely not. ... I do believe something like the UPnP thing should not have happened with the appropriate education and training. And that goes to training and culture and everything else.
I also have the capability as a manager of a complex project like Windows that now every single line of code has an owner in Windows, an identified owner, that it was their responsibility during a security push to review that code, review that design. So if something comes up in the future, I can now go back to the owner and say, 'It was your job to find this problem. Why didn't you find it?' Not just to fire the person. I mean, if they blatantly didn't review it and said they reviewed it, then there are conditions where that may be the case. But that's not the intention, right? The intention is that we can capture the knowledge of why we missed that issue and then roll it into, scale it to training. ...
I can use it as a positive reinforcement thing, too. I can reward people for doing good jobs because I now have ownership there too. ...
Then we can take and capture all that knowledge and not only share it within Microsoft. ... We've always intended it to be an industry enabling thing, so our research guys have developed some really cool code analysis tools that can scan for buffer overflows. They can scan for certain security issues in the code.
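Valentine doesn't say how those research tools work internally. The simplest form of such scanning, though, is lexical: flag calls to APIs that are frequent overflow sources. A toy sketch of that idea (all names here are invented; real analyzers parse the code and track buffer sizes rather than matching strings):

```c
#include <stddef.h>
#include <string.h>

/* Toy sketch, not Microsoft's analyzer: scan one source line for
 * calls to library functions that are common buffer-overflow
 * sources. Substring matching only -- enough to show the idea. */
static const char *risky_calls[] = { "strcpy(", "strcat(", "gets(", "sprintf(" };

/* Returns the first risky call found on the line, or NULL if none. */
const char *flag_risky_call(const char *line) {
    for (size_t i = 0; i < sizeof risky_calls / sizeof risky_calls[0]; i++)
        if (strstr(line, risky_calls[i]) != NULL)
            return risky_calls[i];
    return NULL;
}
```

Note that the bounded variants (`strncpy`, `strncat`) pass the scan, which is exactly the kind of coarse signal an automated sweep over millions of lines of code can surface for a human owner to review.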
Q: In addition to mandatory training and assigning ownership of code, are there any other things you've done?
A: Microsoft has a definition of an internal product development process, where there are design phases and milestones of development and betas and alphas and things like that. We've now worked security milestones into that design process. ... As you're developing a product, you get to a certain milestone, and you can't go past that milestone unless you've signed off on, 'I have done the security code analysis at this level. I have done the threat analysis and the threat modeling at this level.' So we've worked security into every milestone along the release. ...
I mean, Microsoft is a very crisis-oriented company. We do our best in crisis. That's just the type of environment we have. That's just what we like to do. If it's fun and easy and boring, we don't typically do so well with it. But if it's crisis mode, like when Bill [Gates] in '94 sort of said, 'We need to go after the Internet,' I mean, that's a crisis to the company.
Security I think is a very big crisis to the industry at this point in time. ... We are serious about this, as far as every product group.