10 Recommendations for Avoiding Software Security Design Flaws

The new IEEE Center for Secure Design issues a report from the experts

7 November 2014

It seems like not a week goes by without a major company reporting a massive computer security breach. About half of all security breaches are enabled by bugs in the software's implementation: the overall design appears perfectly fine, but some aspect of its implementation fails, causing security vulnerabilities. Flaws in the software's architecture and design cause the other half.

The security industry has focused mostly on finding and eradicating bugs, but attackers can also exploit design flaws, which can be less noticeable and just as harmful. Bugs are easier to discover because well-maintained lists catalog the most common ones developers introduce, but there has been little reference material on how to avoid security design flaws. That is, until now.

The new IEEE Center for Secure Design (CSD) recently released a report of the top 10 most widely and frequently occurring software security design flaws and recommendations for how to avoid them. The software security experts called upon come from member organizations that formed the center, including Athens University of Economics and Business, Cigital, EMC, Google, Harvard, Twitter, and the University of Washington.

The CSD grew out of a workshop with security experts held in April by the new IEEE Cybersecurity Initiative, which the IEEE Computer Society established in January. The initiative will tackle various aspects of security, including computer security education and a building code for security-critical software.

IEEE Senior Member Gary McGraw, chief technology officer at the software security consulting company Cigital, says, “We believe there has been too much focus on common bugs and not enough on secure design and the avoidance of flaws. This is worrying because design flaws account for 50 percent of software security issues.”

“The Center for Secure Design will play a key role in refocusing software security on some of the most challenging design problems in security,” adds IEEE Computer Society member Neil Daswani, the security engineering technical lead at Twitter. “By putting focus on security design and not just on implementation bugs in code, the CSD does even the most advanced companies a huge service.”

BUGS VERSUS FLAWS

Bugs and flaws are both, of course, defects. A bug is generally caused by a failure in implementation; the overall design could appear perfectly fine, but some aspect of its implementation fails, resulting in security vulnerabilities. For example, a programmer makes a coding error that introduces a vulnerability into the software's source code.

“It is important to realize that many classes of bugs are actually manifestations of design-level issues,” said Christoph Kern, an information security engineer with Google, in a podcast about the new center. “If you look at classes of bugs in the bigger picture, they can happen all over the place in a given large piece of program.”

A flaw, on the other hand, is a defect in the design of a system that creates a vulnerability that can be attacked, and the only way to fix it is to change the architecture and reimplement the program according to the corrected design. Flaws are often much more subtle than an off-by-one error in an array reference or the use of an incorrect system call. A flaw is certainly instantiated in software code, but it is also present (or absent) at the design level, according to Kern.
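The off-by-one error mentioned above can be made concrete. In the C sketch below (illustrative only; the `bounded_copy` function and `BUF_LEN` constant are invented for this example), the bug is a one-character mistake in an otherwise sound design, and the fix is equally local:

```c
#include <string.h>

#define BUF_LEN 8

/* A classic off-by-one bug would look like this:
 *
 *     for (i = 0; i <= BUF_LEN; i++)   // <= writes past the buffer
 *
 * The design (copy a bounded string) is fine; one comparison
 * operator in the implementation is wrong. */

/* Corrected implementation: never writes past dst[BUF_LEN - 1]
 * and always null-terminates. */
static void bounded_copy(char *dst, const char *src)
{
    size_t i;
    for (i = 0; i < BUF_LEN - 1 && src[i] != '\0'; i++)
        dst[i] = src[i];
    dst[i] = '\0';   /* room for the terminator is always reserved */
}
```

A design flaw, by contrast, cannot be repaired by correcting one such line; the surrounding architecture has to change.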

COMPILING THE LIST

Daswani, Kern, and McGraw were among the attendees at the April workshop, which brought together software security experts from industry, academia, and government to address the problem of secure design. A secure design yields a system that supports and enforces authentication, authorization, confidentiality, data integrity, accountability, and availability, even when it is under attack.
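Authorization illustrates how a flaw differs from a bug. In the hedged C sketch below (the `Request` type, handler names, and `lookup_is_admin` stand-in are all hypothetical), no single line of the flawed handler is miscoded, yet the design trusts a client-supplied privilege flag; fixing it means redesigning where authorization is decided, not patching a line:

```c
/* Hypothetical request type: in the flawed design, the client itself
 * tells the server whether the user is an administrator. */
typedef struct { int user_id; int is_admin; } Request;

/* Flawed design: nothing here is "buggy," but authorization rests on
 * data an attacker controls, so any client can claim admin rights. */
int may_delete_flawed(Request r)
{
    return r.is_admin;
}

/* Stand-in for a server-side account database. */
static int lookup_is_admin(int user_id)
{
    return user_id == 1;
}

/* Repaired design: the server derives privileges from its own
 * records and ignores anything the client claims about itself. */
int may_delete(Request r)
{
    return lookup_is_admin(r.user_id);
}
```

The repair changes the system's trust boundary rather than any one statement, which is what makes flaws costlier to fix than bugs.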

The experts were asked to develop a list of what they believed were the top flaws in the design of a software system. Many of the defects they cataloged have been well known for decades but continue to persist.

“I think having this list gives users at least a starting point to ask the right questions,” said Kern in the podcast. “Certainly, for smaller organizations without a mature in-house security program, it gives them something to start with.”

According to Kathy Clark-Fisher, manager of the cybersecurity initiative, this practical document can be used by software design architects in their daily work to review their design and check whether they are on the right path.

“We wanted to put out a positive document,” she says. “Instead of telling architects what not to do, we wanted to give them guidance on what to think about doing.”

She noted that a senior systems architect at Symantec, which makes security, storage, and backup and availability software, found the report so compelling and practical that he turned it into an instructor-led training program for his staff.

“That’s exactly the kind of goal we hope to achieve not only for the center but also for the overall initiative,” she says. “If organizations design more secure systems, they can significantly reduce the number and impact of security breaches.”
