http://www.businessweek.com/technology/content/may2003/tc20030520_4335_tc047.htm

By Alex Salkever
MAY 20, 2003

Mike Nash takes offense when people (like me) bash Redmond's software, because it's up to him to make it safe. Here's his defense.

Looking for a tough job? Try Mike Nash's. He's the executive in charge of Microsoft's "Trustworthy Computing" initiative, the effort to build more secure software. Nash has the unenviable task of taking on Redmond bashers who decry Microsoft's lax security. And he has to convince the world that, yes, Virginia, Microsoft products are secure.

It may be ugly, but Nash has taken this one on with relish, leading brown-bag breakfast lectures on security topics at the Microsoft campus and, by his account, infusing the new religion of security into every corner and cubicle of the software-development process. When I wrote a piece in late April criticizing Microsoft's latest operating system as suffering from the cardinal sins of complexity and code bloat (both of which promote vulnerability), Nash called to give me his take on why Microsoft (MSFT) products are a lot more secure these days and probably should be getting more credit (see BW Online, 4/29/03, "For Windows, Less Fat Means Fewer Bugs"). It was an interesting conversation, and he made some good points. So in the interest of fairness, this week's column will probe Nash's point of view on this hot topic.

CAN'T KNOW IT ALL. The argument that code bloat means less secure software is based on simple mathematics. Complexity rises very quickly with each additional element added to any system or equation. With complexity comes vulnerability, because mere mortals -- even the brainiacs at Microsoft -- have a hard time wrapping their minds around millions of lines of code. In fact, no one disputes that it would be impossible right now for any human being to have an effective working knowledge of all the parts of a complex operating system, be it Windows, Unix, Linux, or Apple's OS X.

The way society and business have so far conquered complexity is through automation and process controls. Engineers don't have to use slide rules to calculate the tensile strength required of cable strands on suspension bridges, because sophisticated computer programs do it for them. Likewise, space flight is possible only through rigorous processes that harness hundreds of individual minds into a disciplined whole. Nash argues that Microsoft is now using both approaches to mitigate the complexity risks inherent in big software. And he says Microsoft is already producing much safer products because of new automated tools to spot bugs and better processes to emphasize software security.

FLAW FINDERS. Here's a thumbnail sketch of his points. Over the last year, Microsoft has begun more extensive use of programs designed to automatically detect bugs in code. One is called PREfix, and the other is called PREfast. Both were developed by the folks at Microsoft Research, and both use complex algorithms to search for software constructions that are likely to result in flaws. PREfix was first used to find problems in Windows 2000 code, but Nash says it has since been greatly improved and used more proactively in recent software-development efforts. PREfast is a smaller, less complicated program that software developers can use to pinpoint obvious bugs. Microsoft developers employ it to vet their code, and Nash says outside developers are also starting to use PREfast.
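Microsoft hasn't published how PREfix and PREfast work internally, so the following is only a minimal, hypothetical C sketch of the kind of construction such analyzers are built to flag -- an unbounded copy of untrusted input into a fixed-size buffer -- alongside a bounded version that would pass. The function and constant names (greet_unsafe, greet_safe, NAME_MAX_LEN) are invented for illustration and are not Microsoft code.

    /* Illustrative only: the sort of defect an automated code analyzer
     * is designed to catch, not actual Windows source. */
    #include <stdio.h>
    #include <string.h>

    #define NAME_MAX_LEN 64

    /* Risky: strcpy() performs no bounds check, so a name longer than
     * 63 characters overruns the stack buffer -- a classic automated
     * finding. */
    void greet_unsafe(const char *name)
    {
        char buf[NAME_MAX_LEN];
        strcpy(buf, name);               /* flagged: possible buffer overrun */
        printf("Hello, %s\n", buf);
    }

    /* Safer variant: the copy is explicitly bounded and terminated. */
    void greet_safe(const char *name)
    {
        char buf[NAME_MAX_LEN];
        strncpy(buf, name, sizeof(buf) - 1);
        buf[sizeof(buf) - 1] = '\0';
        printf("Hello, %s\n", buf);
    }

    int main(void)
    {
        greet_unsafe("world");   /* harmless here, but the pattern is the problem */
        greet_safe("world");
        return 0;
    }

The point of automating this kind of check is scale: a tool can apply the same rule to millions of lines of code, which, as noted above, no individual reviewer could ever read end to end.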
As for process, Microsoft's big initiative to train all its software engineers in secure software design got wide publicity. The company halted all code development in early 2002 for several weeks to put everyone through the training. Microsoft also changed its code-checking process. Rather than having dedicated security engineers come in after the fact and clean up the code -- a process that can be disruptive -- Microsoft chose to designate a lead security person for each component of the Windows source code.

GIVE SERVER A CHANCE. "From a process perspective, we have provided accountability to all parts of the operating system. We created a specific accountability system so that we know every component of source code has a developer responsible for the security quality of that source," says Nash.

Neither concept is entirely new. Researchers have been working on automated bug-finding tools for years, although they're particularly hard to build because distinguishing bad code from good code is often a matter of taste and opinion, and closing one type of vulnerability can open another. Sun Microsystems (SUNW), IBM (IBM), and other big software shops have long followed similarly rigorous security practices, and they've been rewarded with reputations for relatively strong security. Nash argues, however, that the people who bash Microsoft should give Windows Server 2003 a chance, because it's the first generation to reap the full benefits of Redmond's new tools and processes.

MULTILAYER SECURITY. Microsoft has made one giant step by eliminating one of its biggest security bugaboos: It now ships its latest operating system with many key capabilities turned off by default. That means Joe User booting up his system for the first time won't automatically launch Windows' built-in Web server and open his machine to the Internet at large without realizing it.

More important, Nash claims that the Windows development team has built in precisely the type of multilayer security that hardcore security engineers have long advocated. That's no small feat. Nash points to a vulnerability reported in Microsoft Security Bulletin MS03-007: a malicious hacker could send an extremely long URL to a publicly accessible portion of the Windows 2000 OS, crash the system, and possibly take it over. Microsoft issued a fix for that flaw some time ago. But as an academic exercise, Nash and his team examined Windows Server 2003 with regard to the vulnerability described in MS03-007. They found that even if the flaw had been present, Microsoft's secure-development efforts had broken the chain of weaknesses that allowed MS03-007 to escalate to a much more serious level. That's because in Windows Server 2003, the input fields for URL commands in key programs used to share and develop Web sites accept just 16 kilobytes, much shorter than the 64 kilobytes required to exploit the MS03-007 vulnerability. And the piece of software that allowed a hacker to take over the whole OS after exploiting MS03-007 runs in a much more protected manner in Windows Server 2003. Translation? It's much harder to get the keys to the OS's inner sanctum, even if you can break in the front door. "It comes down to a realization that we need to do everything possible, at multiple levels, to make vulnerabilities go away so that even if they do exist, they don't cause a problem," says Nash.
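To make the layering idea concrete, here is a rough, hypothetical C sketch of the defense-in-depth reasoning Nash describes. It is not Windows Server 2003 source: the 16-kilobyte and 64-kilobyte figures come from the article, and names such as accept_url and URL_FIELD_LIMIT are invented. The outer layer caps the URL field well below the size the attack needs, so the oversized input is rejected before it can ever reach a vulnerable parser.

    /* Illustrative only: a sketch of defense in depth, not actual
     * Windows code. Outer layer: cap the input field. Inner layer
     * (described in a comment): run the parser with fewer privileges. */
    #include <stdio.h>
    #include <string.h>

    #define URL_FIELD_LIMIT  (16 * 1024)   /* outer layer: field capped at 16 KB */
    #define EXPLOIT_MIN_LEN  (64 * 1024)   /* length the hypothetical attack needs */

    /* First layer: reject oversized URLs before they reach any parser. */
    int accept_url(const char *url)
    {
        if (strlen(url) > (size_t)URL_FIELD_LIMIT) {
            fprintf(stderr, "request rejected: URL exceeds %d bytes\n",
                    URL_FIELD_LIMIT);
            return -1;
        }
        /* Second layer (not shown): the component that actually parses the
         * URL would run with reduced privileges, so even a bug here would
         * not hand over the whole operating system. */
        return 0;
    }

    int main(void)
    {
        static char long_url[EXPLOIT_MIN_LEN + 1];
        memset(long_url, 'A', EXPLOIT_MIN_LEN);
        long_url[EXPLOIT_MIN_LEN] = '\0';

        accept_url("http://example.com/index.html");  /* accepted */
        accept_url(long_url);                         /* rejected at the door */
        return 0;
    }

Even if the inner parsing code still contained the overflow, the 64-kilobyte input never reaches it; and if it somehow did, a lower-privileged component couldn't surrender the keys to the whole OS.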
EXPOSED KERNEL? Of course, that's Nash's version. Some security experts who often critique Microsoft products have pointed out that Redmond's decision to move the Web-server portion of Windows into the OS's "kernel" could be a recipe for disaster. The kernel is the inner core of the OS. As such, it controls the system's most basic functions, so hackers who gain access to the kernel by any means can usually exert significant control over the entire system. The security advocates worry that putting into the kernel a part of the OS that must interface with the public Internet exposes at least one piece of the kernel to every hacker on the planet.

Further, Microsoft remains in the security doghouse for its slow response in fixing some security holes. That may or may not be fair. Open-source advocates love to claim that many vulnerabilities reported in Linux and other open-source software are quickly tackled by the coder community and often fixed in a matter of hours. In many cases, that's true. But Jonathan Schwartz, the executive vice-president of Sun's software group, has told me that on several occasions when he reported bugs to a prominent Linux vendor, it told him it didn't know when it would be able to get a fix out. Schwartz is hardly a disinterested observer, but I did see the e-mail exchanges, and, well, he certainly made his point.

Worse than claims of tardy responses to some bugs is the bad reputation Microsoft has earned for security patches that break systems. As a result, many Windows systems administrators take a wait-and-see attitude before applying a patch, on the logical theory that a running system, even if it's vulnerable, is better than a crashed one that brings a company to a halt.

BAD BUG COUNT. On balance, however, Nash has outlined extremely positive steps. But he says don't take his word for it. He points out that if you tally serious vulnerabilities, Microsoft fares better by some measures than either Linux or Sun. Take CERT advisories. CERT is a federally funded computer-security research center at Carnegie Mellon University in Pittsburgh, and it functions as an information clearinghouse and early-warning system for the computer-security field. CERT advisories are used to call attention to significant threats to Internet infrastructure. Microsoft operating systems had 5 CERT advisories in 2002. Sun's Solaris OS had 12 during the same period, and Red Hat's Linux distribution -- the leading Linux variety -- also had 12. For all software products grouped by vendor, Microsoft had 7 and Sun had 13. All open-source software together had 22. "You can find this on the CERT site," says Nash. "It's all there."

One can certainly dislike Microsoft for its strong-arm marketing tactics. But it's getting harder to fault Gates & Co. for lack of effort on the security side. I'm still not convinced that so much code in one place won't breed problematic vulnerabilities, but I do believe Microsoft is burning the midnight oil trying to prevent them.