Chris Wysopal, Veracode: U.S. Government worst at data security

One of America's most respected security researchers discloses that the U.S. Government's data security practices are shockingly negligent. This is what CISPA supporters don't want us to know.
Written by Violet Blue, Contributor

As CISPA (the Cyber Intelligence Sharing and Protection Act) is poised to reach the Senate, ZDNet tells you what CISPA's proponents don't want you to know: one of the most highly respected security researchers in the U.S. recently disclosed findings about the U.S. Government's own shockingly negligent data security practices.

In light of Chris Wysopal's data, it appears that the only things the widely opposed CISPA will protect are big-business data deals and cybersecurity-shilling government contractors.

So while CISPA's supporters don't want you to peer behind the curtain, and have no problem leveraging the Boston bombings to push CISPA, let's find out what they don't want you to know about how they actually protect American data from cyberattacks.


In 1998, Chris Wysopal was one of seven notorious hackers who testified to the U.S. Senate that they could bring down the Internet in 30 minutes.

In 2011, Wysopal was invited back to provide hands-on information about hacking and cybersecurity to the very same committee he had testified to in 1998: the Homeland Security and Governmental Affairs Committee.

Wysopal explained, "These committees get a lot of people coming in from Symantec and McAfee and telling them what they think the solutions are." Because he hoped to see "intelligent legislation come out of the Senate," Wysopal said, "If I can show these people how these attacks work from a vendor-neutral perspective, I thought that was a great opportunity."


After Wysopal ran his demonstrations, a Senate staffer said to the room that "maybe we should just outlaw them [hacking tools]."

It must have really bothered him, then, to discover through Veracode's most recent risk assessment findings that the U.S. government is taking more dangerous risks with our data than any other industry, period.

I'm not talking about Veracode's new State of Software Security Report (though it's an eye-opener: Veracode found that 70% of applications failed to comply with enterprise security policies on first submission - so really, there's no need to outlaw hacking tools when companies just leave the damn door unlocked).

For me, some of his observations about the U.S. Government, cybersecurity and sacrificial information sharing make CISPA's possible passage seem even more like a wholly, terribly misguided disaster.

Mr. Wysopal told ZDNet today in a comment about CISPA, "If you see any cybersecurity legislation and they don't talk about holding software vendors accountable for minimum security robustness in their products, then you know who is winning at the lobbying game."

Chris Wysopal recently presented at the global hacking conference Hack In The Box in Kuala Lumpur, Malaysia, showing an audience composed of bleeding-edge security researchers, global hacker underground denizens, corporate headhunters, and members of law enforcement just how the U.S. Government is taking dangerous risks with citizen data.

I spoke with Mr. Wysopal in Malaysia right after his incendiary Hack In The Box Malaysia 2012 talk and asked him just how vulnerable U.S. Government data is at this time.

Violet Blue: You're the CTO of Veracode, whose recent risk assessment findings supported your talk's conclusion that the U.S. government is the worst at data security compared to other industries. Can you explain?

Chris Wysopal: The data came from testing Veracode does for its customers. We tested all applications and compared them by industry vertical. For instance, we'd expect financial services to be doing a good job, because if they don't, real money gets lost. We discovered the worst sector was government.

The U.S. government ... is not doing as good a job as, say, the financial services industry.

If you said, let's look at ten financial services web apps, we could pick ten banks at random and start looking for cross-site scripting - and if you then picked ten government websites, you'd find more of it with the government.

However the U.S. government is building or procuring the software applications it uses to do basically everything it does with our data as U.S. citizens, it's not doing as good a job as, say, the financial services industry. This shows up in one very common class of vulnerability: cross-site scripting. The data showed that whatever they're doing, the government is not requiring better coding.
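For readers who haven't run into the bug class Wysopal keeps citing, here's a minimal sketch of reflected cross-site scripting. The Flask app and the /search endpoints are hypothetical, purely for illustration - they have nothing to do with Veracode's test data:

```python
# A minimal sketch of reflected XSS, assuming Flask; the /search
# endpoints are hypothetical and purely illustrative.
from flask import Flask, request
from markupsafe import escape

app = Flask(__name__)

@app.route("/search")
def search_vulnerable():
    # VULNERABLE: the query parameter is echoed into the page verbatim,
    # so a URL like /search?q=<script>steal(document.cookie)</script>
    # runs attacker-supplied JavaScript in the visitor's browser.
    q = request.args.get("q", "")
    return f"<h1>Results for {q}</h1>"

@app.route("/search-safe")
def search_safe():
    # FIXED: HTML-escaping the untrusted input before rendering turns
    # <script> into &lt;script&gt;, which displays as harmless text.
    q = request.args.get("q", "")
    return f"<h1>Results for {escape(q)}</h1>"
```

The whole class boils down to untrusted input flowing into HTML output without encoding - which is why automated scanners can find it at scale across industry verticals.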

VB: How does that feel considering your history of telling the U.S. Government to get its act together with understanding cybersecurity?

CW: It's frustrating, because certain parts of the government know more than anyone else about these problems. Think about the NSA, or other intelligence agencies, or the military. They know very well that this problem exists, but they don't do anything about it in the more civilian sectors of government, the sectors that handle the day-to-day with our data, like the Veterans Administration. And it's these organizations that have a lot of interactions with citizens over the web, like the IRS.

And they're doing a worse job with data security than if you were dealing with your bank down the street.

VB: Something else you pointed out in your talk was that you're also seeing actual security software performing very badly.

Security companies ... don't translate that understanding into making their products better.

CW: My joke is that the cobbler's children have no shoes - right? I find that even though many security companies understand the many vulnerabilities that are in software - they have a real understanding of it - they don't translate that understanding into making their products better. They don't train the developers, and they don't give the developers enough time to do testing, you know, the remediation they would need to do to make a more secure product.

So they're aware. But awareness is only the beginning; you need to put it into practice. The security companies are the ones most on the awareness side, and the least on putting it into practice. It's really hypocritical if you think about it.

VB: What do you see as the impact on the security ecosystem for this behavior?

CW: It's bad because it makes consumers rely on security products to protect them from vulnerabilities - and while the products *do* do that, they end up introducing new vulnerabilities that wouldn't be there if you hadn't installed the security product in the first place.

VB: But security products should be the best, right? Isn't this a reputation-based economy? Or does market saturation exacerbate the problem? I'm just trying to understand why this continues, and what could possibly course-correct it.

CW: This is why we want to expose the data in this manner. It shows where the vulnerabilities lie. I think that when we point out different categories, businesses buying software can say, wait a minute, Veracode says that security software is the worst in these categories. Security software maker: maybe you're just average, like Veracode's data is saying - so now we want you to test it. I think Ralph Nader exemplifies this approach, because of the way he changed the auto industry in the 1960s. Nader said, "cars are unsafe." That's when people started to think of some cars as safer than other cars.

People can start to say, "I want to buy the safer software, and if someone is giving me crap software, I want to hold them accountable and point it out and say, you better fix that." At this point in time people think 'all software has bugs' and 'all vendors should do better.' But we're using real data to raise awareness and change this mindset.

Software is at the point where it's equivalent to car safety.

An individual consumer like you and me buying software has no power at all compared to a large software company. But a top-five global bank has power over a $2 million contract with a software vendor that might have only $100 million in revenue. When there's buyer leverage, it can force vendors to be better. I think we can start to chip away at this by educating enterprise buyers that they should be holding their vendors accountable.

Software is at the point where it's equivalent to car safety. I'd like to solve this before everything we interact with is part of the internet. Software now controls financial transactions, it mediates communications, and it's starting to control physical systems in industrial places that can be pretty compartmentalized. Software is becoming things we're wearing, things we rely on to be physically safe. Cars that are autonomous, self-driving, can be hacked. So we have to raise awareness now. I started doing this in the late 1990s when I testified in front of the Senate.

VB: What is the answer?

CW: I think a start would be something where the root causes of incidents had to be disclosed, the way the NTSB does it. That would be step one. It's like, you can't just hide your dirty laundry. You have to allow people who want to learn to learn from it. So once there's awareness of issues, you become negligent if you don't put that information into action.

That way, someone who is doing the exact same thing somewhere else can't act like they didn't know about it. If you're in this industry, it's negligent to ignore this information, because the information was there.

So when the government talks about wanting to do information sharing, what they mean by "information sharing" is malware information sharing. They want to share signatures for attacks: once someone fingerprints an attack, a tool, or traffic, other people who want to detect it can have that information. (Though there are privacy implications in how we're gathering this information.) But why not share vulnerability information after the problem is cleaned up?

VB: But then how do you get around the problem of people misconstruing information sharing for advocacy? Where they say, you can't publish this because that's equivalent to telling people to go do it, or handing out so-called "loaded guns."

CW: Look - the bad guys already have that information.

Maybe there are a few bad guys who didn't know the information, who weren't connected, but there are more people in the dark who are capable of using the information to solve the issues than there are people with bad intentions who are "in the know." This goes back to the L0pht, the grey hat mentality, of like, yes - we're going to put up password cracking tools, because attackers already have them and are also building their own. So by making this easier to use and available, we're giving more help to the average sysadmin than we're giving to the attacker. That's my mentality around full disclosure.
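For context on those tools, here's a minimal sketch of what a dictionary-style password cracker actually does. The hash and wordlist below are illustrative; real tools like the L0pht's targeted LM/NTLM hashes and added mangling rules and far larger dictionaries:

```python
# A minimal sketch of dictionary-style password cracking, purely
# illustrative; real tools support more hash formats (LM/NTLM in
# L0phtCrack's case), mangling rules, and vastly larger wordlists.
import hashlib

def crack_md5(target_hash: str, wordlist: list[str]) -> str | None:
    """Return the candidate whose MD5 digest matches target_hash, if any."""
    for candidate in wordlist:
        if hashlib.md5(candidate.encode()).hexdigest() == target_hash:
            return candidate
    return None

# md5("letmein") - a perennial entry on weak-password lists.
leaked = "0d107d09f5bbe40cade3de5c71e9e9b7"
print(crack_md5(leaked, ["password", "123456", "letmein"]))  # -> letmein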

Read Veracode's new State of Software Security Report Volume 5.

Photo: Chris Wysopal onstage during Hack In The Box. Credit: Violet Blue.
