The secret to secure code--stop repeating old mistakes

Summary: The game of leapfrog between code defenders and attackers in cyberspace will continue to be waged as long as code is written. But the majority of security holes could be eliminated.

TOPICS: Security

The game of leapfrog between code defenders and attackers in cyberspace will continue to be waged as long as code is written. But the majority of security holes could be eliminated. It's not an issue of technological prowess. The problem is that most companies and programmers don't have the basic skills or knowledge necessary to build security into their products from the ground up.

This was the distilled takeaway from a few hours spent last week with the Fortify Technical Advisory Board, a who's who of cybersecurity experts. Fortify primarily develops and sells static code analysis tools for programmers. The advisory board, pictured below, was unanimous in pinning the majority of security woes on poor programming practices and a lack of education and awareness.

"Every time we move into something new, we repeat the old mistakes," said Avi Rubin, professor of computer science at Johns Hopkins University. "We saw it with RFID and voting systems, and now the Web. Windows 3.1 and other older systems are safer because modern viruses won't run on them."

Li Gong, the primary architect for the Java security model while he was at Sun and most recently the managing director for Windows Live China, echoed Rubin's comments.

"The industry is currently defaulting to a small number of platforms--Windows, Java and a few others. Once a platform is built, it is hard to make it more secure. You only get one or two chances to make it more secure, especially once it ships. Because it's built in layers, you have to solve the security problems at each layer. The problem is that people repeat the same mistakes every time they create something new, such as with the Web or AJAX. They forget about lessons learned in the past."

  • Back row: David Wagner, University of California, Berkeley, professor and a top software security researcher; Brian Chess, chief scientist at Fortify; Bill Pugh, University of Maryland professor of computer science and founder of the open source tool FindBugs; Marcus Ranum, inventor of the proxy firewall and expert on security system design and implementation; Li Gong, primary architect for the Java security model at Sun and most recently managing director for Windows Live China.
  • Middle row: Matt Bishop, University of California, Davis, professor of computer science and author of "Computer Security: Art and Science"; Avi Rubin, professor of computer science at Johns Hopkins University and author of "Brave New Ballot: The Battle to Safeguard Democracy in the Age of Electronic Voting"
  • Front row: Fred Schneider, Cornell University professor of computer science, director of Cornell's Information Assurance Institute and member of the technical advisory boards of Intel and Microsoft; Gary McGraw, CTO of Cigital and author of "Software Security", "Exploiting Software" and "Building Secure Software"; Greg Morrisett, professor of computer science at Harvard University; Roger Thornton, CTO of Fortify.

"When Google started to put stuff on people's desktops, they lost control of security," said Gary McGraw, CTO of Cigital and author of several popular books on security. "The really interesting aspect of Web 2.0 has to do with where the trust boundary ought to be drawn. Some of the stuff is outside the trust boundary."

A lack of expert security architects on development teams was also cited as contributing to flawed code being released into the world. "It's a systemic issue, a whole mentality problem," Gong said. "You don't have amateur architects building bridges," Rubin added.

"You won't solve the security problem by adding a few more security experts while all other programmers remain ignorant," advised David Wagner, a top software security researcher and computer science professor at the University of California, Berkeley.

"Two things are going on. There is the issue of security and the issue of good coding practices. They are interlinked. Everyone has to use best practices--the chain is only as strong as its weakest link," said Matt Bishop, professor of computer science at the University of California, Davis.

Bill Pugh, professor of computer science at the University of Maryland, said, "Most programmers are ignorant of security." Students study textbooks, but a static analysis tool can provide a more pointed lesson, he added, noting that the textbooks need to be updated. "By the time someone graduates from college it's too late," Gong claimed. Learning security practices on the fly is not the best way to create secure software.

The experts also discussed security as a holistic discipline. "There is a misconception that security is an operating system, language or network problem. Security affects the whole system. These days we are getting better at the lower layers, but now [attackers] are hitting more at the application layer, and you can't fix that by upgrading to Vista," said Wagner.

As the bottom layers have become more secure, applications have become more interesting targets to attacker code.

Marcus Ranum, inventor of the proxy firewall, referenced Larry Wall, the creator of Perl, on programmers: "He likes to say that there are two defining characteristics of great programmers--laziness and hubris. That is what got us to where we are. People are putting their lives on the line with code that is crap. The only honest part of 'software engineering' is the word 'engineering.' You have to get the end customer to understand that there is a safety issue."

Fortify's Brian Chess noted that software developers make predictable mistakes and that tools can look for problems and take away the "low hanging fruit" from unsophisticated hackers.
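To make the idea of a "predictable mistake" concrete, here is an illustrative sketch in Java (the language FindBugs, mentioned above, analyzes) of one well-known pattern such tools flag: comparing strings with == instead of equals(). The class and method names here are invented for the example, not drawn from any real codebase.

```java
// One classic predictable mistake that static analysis tools flag:
// comparing strings with == (reference identity) instead of equals() (contents).
public class StringCompareBug {
    // Buggy version: == asks "are these the same object?", not "do they match?"
    static boolean isAdminBuggy(String role) {
        return role == "admin";          // flagged by tools like FindBugs
    }

    // Fixed version: equals() compares contents; literal-first is also null-safe.
    static boolean isAdminFixed(String role) {
        return "admin".equals(role);
    }

    public static void main(String[] args) {
        String role = new String("admin");       // a distinct object, equal contents
        System.out.println(isAdminBuggy(role));  // false -- the latent bug
        System.out.println(isAdminFixed(role));  // true
    }
}
```

The bug is "predictable" in exactly the sense Chess describes: it compiles, it often appears to work (string literals are interned), and it fails only when a string arrives from user input or I/O, which is precisely when an attacker controls it.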

However, a huge potential downside is that the only people who have time to look in obscure parts of code are those who want to do harm, Rubin said. 

"Cigital and Fortify have a joint customer, and it uses Fortify to look at millions of lines of code. They keep track of what they find and then teach programmers. It's cheaper to instruct than to have to find bugs. Software security initiatives, like those at Microsoft, Qualcomm, and many banks, have to close the loop and have a way of measuring success," McGraw added.

Despite the weekly array of security breaches identified in Windows products, the security experts gave Microsoft kudos for its security initiatives. "The temptation is to beat up on Microsoft, but the company is not unique and in many ways is one of the best to integrate security into its development process," said Wagner, who is well known for discovering serious flaws in popular software.

The SCADA (Supervisory Control And Data Acquisition) systems that run energy, water and other critical infrastructure, however, didn't get kudos from the security experts. "Every time I look at the way a SCADA system is constructed, I fall on the floor laughing with my feet kicking or beat my head against the wall," said Ranum. "The SCADA systems are worse than you can imagine. Five years ago, a publicly accessible site was controlling a console for pumping systems in the Middle East. It was frightening beyond belief. The answer is don't plug that into the Internet or your own internal network."

Referring to software development in general, Fred Schneider concluded: "This is the software correctness problem. If you can't get software correct, then you are fielding something that can be abused. In the end the software correctness problem is having to articulate what you want the system to do--no one will ever formalize that and the buck will stop there. There will be problems as long as there is software."



  • Thank you for such

    a great and well documented article.

  • hardware

    The article appears to be "pinning the majority of security woes on poor programming practices and a lack of education and awareness."

    I disagree with this conclusion. Any solution that simply expects all programmers to start writing bullet-proof code when they were unable to do so the day before is bound to fail. These experts need to stand back from the problem a bit further and expand the universe of solutions to include hardware. Most of those bits do NOT need to change every month. Most people could use MS Word from five years back and be perfectly happy. So why not physically protect those bits via hardware? If your only solution is to fight software with software, you can never win. The bad guys are better paid.

  • And impatience

    There are three pillars of programming, not two - laziness, hubris and *impatience*.

    Also, I think Randall Schwartz came up with that line, not Larry Wall. It was in the original O'Reilly Perl book.

    Correct me if I'm wrong...
  • To paraphrase "The Graduate"

    Mr. McGuire: I want to say one word to you. Just one word.
    Benjamin: Yes, sir.
    Mr. McGuire: Are you listening?
    Benjamin: Yes, I am.
    Mr. McGuire: Components.

    The days of large monolithic tightly coupled applications with a plethora of bolt-on interfaces will die in favor of a Service Oriented Architecture composed of Loosely Coupled Component Services. The SOA will impose and control access, and even if a programmer does omit something from his new component, it likely won't work in the SOA because it will violate a security policy. Even if it does manage to present a security hole, it will only be into the data and methods of THAT component service.
    • RE: To paraphrase "The Graduate"

      When do you expect this to happen? Who do you propose that I trust with our confidential company data? Is there a company you have in mind?

  • CODE?

    Remember license and register are for the government only! For this illegal EULA to be in every software program, it would have to be that software is written right in the website. I link to the site and click on the stuff that I want to be in this software program. I choose a font type, skin, then compile the program. Even the HELP section gets auto compiled. I could then download the program. This is done in the Federal Internet system. The hackers put virus like EULA in right there. The software writing section in the link is then blocked by the hacker.
  • The natural state of a programmer

    is to find linear A-to-B solutions to problems. The panelists say when programmers move to a new technology they forget their security best practices. To a programmer, security practices are like cruft: things that enhance the software and make it better, which a programmer can feel proud of. But when they're learning a new software technology--a new language or tool--while trying to deliver something by a deadline, the programmer can feel so pressured to revert to their natural state that they just throw security practices out the window. They'll probably feel there's no time to learn how to work with the new technology and use it securely at the same time. Learning the extra techniques takes getting comfortable with the technology first, and the only way that can really happen is to do a project or two in it.

    I think the only way software security will become pervasive is if liability is associated with insecure commercial software. That would make companies, and by extension developers, pay attention to this issue more.

    Speaking for myself, I want to write good software. I don't want people having problems with it. Even in my case though, if the software company I work for doesn't care about the issue, I'm going to be forced to skimp on it, or they'll find someone else who will. I don't suggest liability as a way to make programmers pay attention to this. I suggest it as a way to make the companies that employ them do so.

    Educating customers will help as well. If they are not educated, they're likely to be confused about the issue, and that may disadvantage companies that comply with the law.
    Mark Miller
    • The natural state of a programmer

      Company liability is inherent in a free market economy. But of course that all goes out the window when you're dealing with a monopoly. Funny how that keeps working against the public interest.

  • Why possible?

    You know, I was asking someone just the other day: if a program can be written in a variety of different programming languages, and new languages are being created or modified all the time, why do we keep seeing the same tired old flaws like buffer overflows over and over again? Why hasn't someone designed a language in which they simply aren't possible? Or have they, but it just hasn't caught on? You'd think people would take a step back from the programmers, who clearly have not learned from the mistakes of their predecessors, and look at the actual mechanics of the languages they're working in...
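Such languages do in fact exist: memory-safe languages like Java check every array index at runtime, so the classic C-style buffer overflow becomes a catchable exception instead of silent memory corruption. A minimal sketch (the class and method names are invented for the example):

```java
// In a memory-safe language the runtime checks every array index, so an
// off-by-one write past the end of a buffer cannot corrupt adjacent memory.
public class BoundsCheckDemo {
    static String writePastEnd() {
        int[] buf = new int[4];
        try {
            for (int i = 0; i <= buf.length; i++) {  // off-by-one: i == 4 is out of range
                buf[i] = i;
            }
            return "corrupted silently";             // what C would allow
        } catch (ArrayIndexOutOfBoundsException e) {
            return "overflow caught at index 4";     // what Java does instead
        }
    }

    public static void main(String[] args) {
        System.out.println(writePastEnd());
    }
}
```

Safety at this layer is not free, of course: the bounds checks cost some performance, and as the panelists note above, eliminating one class of bug just moves attackers up to the application layer.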