
The IT community is obsessed with finding bugs in software and systems, and rightly so given the large number of network and data breaches reported this year alone.

But stamping out bugs before they get written is just as important. That’s why the Institute of Electrical and Electronics Engineers (IEEE) has started the Computer Society Center for Secure Design, which is aimed at identifying common design flaws for software architects, not developers.

Their first effort is a list of the top 10 security design flaws and how to avoid them, which can be downloaded here.

Among those supporting the centre are Twitter, Google, Hewlett-Packard, EMC, RSA and Cigital, a Virginia consulting firm that specializes in software security.

According to a podcast with Cigital CTO Gary McGraw, one of the centre’s founders, these and other companies were invited to discuss the potential of the idea with a proviso: Anyone who showed up had to bring software from their own company with a flaw.

Note, McGraw said, that they weren’t to bring vulnerabilities they had found. “Bugs are pretty easy to find; they’re also easy to fix. Flaws, on the other hand, are architectural problems at the whiteboard level that you might not even find by looking.”

Google, for example, architecturally changed an API its developers regularly use to eliminate a cross-site scripting problem. “That’s the sort of stuff we’re looking to do — eradicate entire swaths of bugs to make it impossible for developers to create them in the first place. And to work on harder problems like how should you authenticate users, and what’s the relationship between authentication and authorization, and how do you use cryptography correctly?”
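The idea can be sketched generically (this is a hypothetical illustration in Python, not Google’s actual API): if the rendering function every developer calls escapes HTML by itself, callers cannot forget to do it, and a whole class of cross-site scripting bugs becomes impossible to write.

```python
import html

def render_greeting(user_input: str) -> str:
    """Build an HTML fragment. Escaping happens inside the API,
    so no caller can forget it and introduce an XSS bug."""
    return "<p>Hello, {}!</p>".format(html.escape(user_input))

# A script-injection attempt comes out inert:
print(render_greeting("<script>alert(1)</script>"))
# → <p>Hello, &lt;script&gt;alert(1)&lt;/script&gt;!</p>
```

The design choice is the point: safety lives in the API, not in the discipline of each individual developer.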

The design centre is one of the IEEE’s cybersecurity initiatives. Another is a workshop in November in New Orleans to look at the possibility of creating a software security “building code” for a narrow group of products: medical devices. Municipalities impose a code for physical safety of buildings and homes, McGraw noted. Perhaps a similar software building code is needed.

According to a workshop outline, the code might specify that modules written in a language that permits buffer overflows be subject to particular inspection or testing requirements, while modules written in type-safe languages might require a lesser degree of testing but a stronger inspection of components that translate the source language to executable form.
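The distinction the outline draws can be illustrated with a small sketch in a type-safe language (Python here, purely as an example): the runtime bounds-checks every buffer access, so an out-of-range write raises an error instead of silently overwriting adjacent memory the way a C buffer overflow would.

```python
def write_byte(buf: bytearray, index: int, value: int) -> None:
    # In a type-safe language the runtime checks the bounds of
    # every access; an out-of-range index raises an error rather
    # than corrupting whatever sits next to the buffer in memory.
    buf[index] = value

buf = bytearray(8)
write_byte(buf, 3, 0x41)       # fine: index within the 8-byte buffer

try:
    write_byte(buf, 12, 0x41)  # the "overflow": index past the end
except IndexError as exc:
    print("caught:", exc)      # C's strcpy would have corrupted memory here
```

That guarantee is why such modules might warrant lighter testing, with scrutiny shifting to the compiler and runtime that enforce it.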

 

Howard Solomon
Currently a freelance writer, I'm the former editor of ITWorldCanada.com and Computing Canada. An IT journalist since 1997, I've written for several of ITWC's sister publications including ITBusiness.ca and Computer Dealer News. Before that I was a staff reporter at the Calgary Herald and the Brampton (Ont.) Daily Times. I can be reached at hsolomon [@] soloreporter.com
