Microsoft hits anniversary

Bill Gates recently marked the one-year anniversary of Microsoft Corp.’s “trustworthy computing” initiative by sending out an e-mail commending his company’s progress to date.

The e-mail came just weeks before the SQL Slammer worm appeared, exploiting a vulnerability in Microsoft’s SQL Server 2000 database software.

The trustworthy computing initiative was launched by Microsoft just over a year ago as a recognition that it needed to do a better job in creating more-secure, less-buggy software. It was initiated through an internal e-mail sent out by Gates asking Microsoft employees to make security a priority when developing products.

“Trustworthy computing is intended to be a long-term initiative that will take a decade to realize,” said Rick Miller, a Microsoft spokesperson based in Redmond, Wash.

When people pick up the phone, they have no doubt that they’ll get a dial tone, Miller said, and that’s the same kind of reliability that Microsoft wants to bring to computing.

“Building security into software, particularly after the software has been constructed, is not an overnight process,” said David Freund, an analyst with Nashua, N.H.-based consulting firm Illuminata.

It’ll take a lot of work, and Microsoft has produced many millions of lines of code that it has to go back and re-examine, he said. “That said, the company has made some good efforts in this area.”

The company has shown a willingness to notify users of security problems as they arise and to get patches out the door, Freund said. Subsequent releases have also tended to improve in stability and reliability, he said.

The problem is that there will always be a juggling act between making applications secure and making them easy to access and use. “The design centre for most of Microsoft’s existence has been ease of use,” Freund added.

Another problem, which is by no means unique to Microsoft, is that companies have been rushing to get products out the door as quickly as possible. This was especially true during the dot-com bubble days, Freund said. Now both vendors and users alike are slowing down.

Some IT managers said they think Microsoft’s progress should be judged based on the number of vulnerabilities they see in future releases. But many customers may continue to use older products that haven’t been the focal point of Microsoft’s security push.

“In the short term, I’m resigned to an increasing cycle of patches and updates to existing systems that my already-overwhelmed technicians have to implement,” said Paul Lanham, senior vice-president and chief technology officer at Jones Apparel Group Inc. in Bristol, Pa.

Marc Maiffret, co-founder and chief hacking officer of eEye Digital Security Inc. in Aliso Viejo, Calif., said Microsoft should be devoting more attention to ridding its current products of vulnerabilities. “It seems like they’re much more worried about tomorrow, which they should be. But I think today is even more important,” he said.

Microsoft’s Windows Server 2003 (formerly .Net Server) was originally slated to ship alongside the desktop OS, but the company has delayed its release, which Freund said shows Microsoft is committed to its trustworthy computing initiative.

Microsoft’s new approach to development consists of four main tenets – creating software that is secure by design, secure by default, secure by deployment and secure in communication, Miller said.

Secure by design means recognizing that the company needs to do a better job of building secure products from the ground up. Secure by default means that whereas in the past the company shipped software with most of its functionality turned on, it now ships software in a locked-down state, in the most secure form possible. It will now be up to sys admins to turn functions on rather than off, Miller said.

Although the company hopes to eliminate as many errors as possible in the design phase, it recognizes that patches will always be needed. The secure by deployment part of the initiative means it will work on making sure patches are of good quality and readily available.

Code Red, Nimda and the new Slammer worm could all have been avoided if people had applied the patches that were out there, Miller said.

“That’s not in any way pushing the blame on system admins. We need to do a better job producing more quality and more seamless patches. But nevertheless, if you keep your system patched, then you’re not going to be vulnerable to attacks like this,” he said.

Microsoft recently got hit by the Slammer worm because it failed to patch some of its own internal-facing servers.

In terms of communication, the company said it is trying to keep its customers clearly informed of security problems as they arise.

Miller said the company began its trustworthy computing initiative only a year ago because, historically, computers were isolated.

“[Applications] were built more for a standalone environment and so you had a computer that was running, and there wasn’t a need for security,” he said.

– With files from IDG News Service
