NSA on Heartbleed: 'We're not legally allowed to lie to you'

In an exclusive interview with ZDNet's David Gewirtz, a senior NSA official explains why the agency regards security and civil liberties as more than a simple balancing act: "You have to have them both."
Written by David Gewirtz, Senior Contributing Editor

ZDNet recently had the opportunity to sit down with the NSA to discuss how the agency approaches the difficult challenge of both protecting our security and supporting the American ideals of openness and transparency. The recent Heartbleed bug brought these issues to light in a particularly relevant way.

This article contains a transcribed version of that interview. Other than a few housekeeping clean-ups, the interview is verbatim.

Before we start, I also want to point out that I was permitted to ask whatever questions I wanted, and Neal Ziring, Technical Director in NSA's Information Assurance Directorate, was willing to answer them.

Here are some tidbits from the full interview:

  • "Vulnerabilities are very precious currency in cyberspace."
  • "...have we looked at OpenSSL? Of course we have. We, like a lot of other folks in the community, did not spot this particular vulnerability."
  • "Did we find it? No. We did not. A lot of other people didn't find it, either."
  • "We have people that are spread thin, looking at a lot of different pieces, and we just didn't catch this one."
  • "With a super-widespread vulnerability like Heartbleed, it's going to be in the national interest of the U.S. government to act to eliminate that vulnerability."
  • "What we try to do is understand those systems to a very deep level so that we can understand where they're strong, where they're weak..."
  • "I'll be frank with you. If it comes down to protecting against a terrorist or protecting against his privacy, I'm afraid that his privacy is not going to be the number one thing on people's minds."
  • "The only way to improve the trust is for us to be out there and transparent."

And now, the full interview, in its entirety...

ZDNet: Let's start out with some background. Tell us about yourself.

Neal: I've been at NSA since 1988, and I worked mostly in evaluation of security products, crypto products, things like that. I worked a little bit on mobile code security, executable content security as you sometimes call it at the DOD and IC levels. I worked a lot on router security, so if you go on NSA.gov, you can see some of the products I worked on, the guidance available on security, and I worked with NIST on the Security Content Automation Protocol.

"The folks that are here on the inside get to see the intelligence that they're delivering every day. They get to see a soldier go home who they gave intelligence to his platoon so it wasn't ambushed."

Then I spent four years working over in our Technology Directorate as a security architect for some large government systems including some that went out to the field like Iraq and Afghanistan. Then I came back to my home, IAD Directorate, to be the Technical Director. You can think of it like a senior technical advisor position to our director.

Then, through all of that, like the last decade or so, I've been working a lot in our academic outreach efforts. We have a large program for designating Centers of Academic Excellence in information assurance and cybersecurity education, and I work on that, especially with the high-level one called "CAE-R," which is for research universities.

ZDNet: What's your specific role here at NSA?

Neal: I said I'm from the Information Assurance Directorate. As you know, NSA has these two missions, a signals intelligence mission and an information assurance mission. I've spent most of my career in that information assurance mission, so my answers are going to tend to be from that viewpoint, although I understand the other mission.

ZDNet: What exactly is information assurance?

Neal: Information assurance, most people today would call it cybersecurity, although it's actually a little bit broader than that, but it's the art and science of being able to use our information with confidence in military systems, other national security systems, and being able to ensure that it is confidential when it needs to be confidential, that it has integrity when it needs that, that we have freedom to maneuver in cyber space.

Our motto for the Information Assurance Directorate is "Confidence in Cyberspace." We try to provide that for a very wide spectrum of customers. Under National Security Directive 42, our primary customer set are the national security systems, which are all your military and intelligence systems, certain other government systems that have to do with maintaining the national security of the U.S., but we actually do a lot beyond that in working with other parts of government and assisting government agencies at their request, for example, DHS with critical infrastructure, so we have a pretty broad mandate, and we've been operating in that mode since NSA was founded in the early 1950s.

ZDNet: Moving on from that to NSA's global mission, where do you see the NSA's global mission with regard to digital systems and data?

Neal: NSA's global mission from the intelligence side is to continue to produce actionable intelligence to the United States for the decision makers, the war fighters.

That mission hasn't gone away. It really hasn't changed. We still face a lot of threats in the world, and it's very important that the President, the military commanders, etc. have reliable intelligence about all those issues, so that hasn't changed.

Similarly, our information assurance global mission hasn't changed. We have to continue to provide products and services that protect national security systems wherever they may be. We have people deployed to the Middle East, working with folks on their networks out there right now. IAD [Information Assurance Directorate] does.

Those missions haven't gone away. They're complex. They keep getting more complex as our adversaries gain new tradecraft, but we're continuing to prosecute those and to partner with the folks we need to partner with all the time.

"...and certainly, NSA does have folks that look for vulnerabilities. We have to."

ZDNet: NSA is in a damned-if-you-do and damned-if-you-don't place with Heartbleed. If you did know about Heartbleed and didn't help get it fixed to protect us all, you're irresponsible spies. But if you didn't know about it and it's been there all along, in a very well-known piece of software, you're incompetent. So, which is it? Irresponsible or incompetent?

Neal: Yeah. This is a very tricky question, and certainly, NSA does have folks that look for vulnerabilities. We have to.

Vulnerabilities are very precious currency in cyberspace, and have we looked at OpenSSL? Of course we have.

We, like a lot of other folks in the community, did not spot this particular vulnerability. It's also very interesting to note that automated code scanners … I've talked to SMEs [Subject Matter Experts] both inside and outside NSA about this, didn't catch it either.

If you look at the structure of the OpenSSL code involved, there are a number of reasons why that could have been the case. Nobody knows exactly. The various companies are scrambling to rectify that at the moment, at least a few of them that I've talked to.
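(Editor's note: For readers unfamiliar with the bug itself, Heartbleed, CVE-2014-0160, was a missing bounds check in OpenSSL's TLS heartbeat handler: the code echoed back as many bytes as the request *claimed* to contain rather than as many as it actually carried, leaking adjacent process memory. The Python sketch below simulates that pattern. The names and buffer layout are illustrative only; OpenSSL's actual code is written in C.)

```python
# Simulation of the Heartbleed pattern. This is NOT OpenSSL's code; it just
# models the flaw of trusting a length field carried inside the message.
SECRET = b"-----PRIVATE KEY-----"  # stands in for adjacent heap contents

def handle_heartbeat(message: bytes, check_bounds: bool) -> bytes:
    """Message layout: 2-byte big-endian claimed payload length + payload."""
    claimed = int.from_bytes(message[:2], "big")
    payload = message[2:]
    # Simulated process memory: the payload sits next to other data.
    memory = payload + SECRET
    if check_bounds and claimed > len(payload):
        return b""               # the fix: silently discard malformed requests
    return memory[:claimed]      # the bug: echo however much was claimed

# An honest request echoes its own payload back:
assert handle_heartbeat(b"\x00\x04ping", check_bounds=False) == b"ping"
# A request claiming 25 bytes leaks the adjacent "secret" from the buggy path,
# while the bounds-checked path returns nothing:
assert SECRET in handle_heartbeat(b"\x00\x19ping", check_bounds=False)
assert handle_heartbeat(b"\x00\x19ping", check_bounds=True) == b""
```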

Did we find it? No. We did not. A lot of other people didn't find it, either. There's a lot of software out there. We have people that are spread thin, looking at a lot of different pieces, and we just didn't catch this one.

It's important to note that we focus on the technologies, the systems that we think have the most impact for national security. We have to prioritize on that basis.


By the way, I'm doing more updates on Twitter and Facebook than ever before. Be sure to follow me on Twitter at @DavidGewirtz and on Facebook at Facebook.com/DavidGewirtz.

ZDNet: We know that decrypting enemy traffic can have a profoundly positive impact on protecting the nation. Ultra from World War II was certainly one case in point. If you were to have known about the Heartbleed flaw and were able to get substantially valuable intelligence take as a result of that knowledge, would you not have to deny that knowledge anyway? Which is better? A huge intelligence take or avoiding bad public relations?

"There's no shortage of cyber security challenges out there today."

Neal: No. We're not legally allowed to lie to you. I am telling you, we did not know about it, and that's the case. Now, in this sort of vulnerability, if we know about it before others, then we do have a choice to make, the U.S. government has a choice to make, and ultimately, that choice is guided by the national interest.

What is going to create the greatest overall benefit to the national security of the United States, using a vulnerability to gain critical intelligence or fixing it to bolster the security of the U.S. systems? We are also biased towards the defense, towards ensuring that the U.S. risk is mitigated.

With a super-widespread vulnerability like Heartbleed, it's going to be in the national interest of the U.S. government to act to eliminate that vulnerability, to get it fixed as the only sure-fire means of mitigating it, and now, we do that all the time.

Now, a thing I'd like to refer you to that discusses this whole topic in some detail is a blog entry that was posted at the White House blog something like half an hour ago.

(Editor's note: This interview was conducted on Monday, and shortly after, ZDNet posted coverage of that document. You can read more here.)

It's under Michael Daniel's blog, and it goes into some detail on this matter. You can read it, and it gives, near the latter half of the blog posting, some of the considerations that would go into how the nation should make this kind of decision, and it is a national thing.

You'll see in the blog entry, there's an inter-agency process for making these kinds of decisions.

You have to think about if the vulnerability is left unpatched, how much risk does that impose on the U.S. systems? How much harm could an adversary nation or criminal do if they had knowledge of that vulnerability? How likely is it that we would be able to detect if someone is exploiting it and mitigate it through other means?

On the intelligence side, how important is the intelligence that we might be able to gain through exploitation of that vulnerability? That's something you have to think about very carefully. What if there's no other way we could get it, that piece of critical intelligence? You can see this in Michael Daniel's blog entry.

Another big consideration is whether the vulnerability's risks can be mitigated in some other fashion such as by blocking at a network level or upgrading to the latest version of the software, things of that nature. All those things get taken into account.
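(Editor's note: "Blocking at a network level" refers to measures like the intrusion-detection signatures deployed while hosts were being patched, which flag any TLS heartbeat record whose claimed payload length cannot fit inside the record. Below is a simplified Python sketch of such a detector; the byte offsets follow RFC 6520's message layout, but the function itself is illustrative, not taken from any particular IDS product.)

```python
def is_suspicious_heartbeat(record: bytes) -> bool:
    """Flag TLS heartbeat records whose claimed payload over-runs the record.

    Layout assumed: content_type(1) version(2) record_length(2),
    then heartbeat_type(1) payload_length(2) payload... (RFC 6520).
    """
    if len(record) < 8 or record[0] != 0x18:  # 0x18 = heartbeat content type
        return False                           # not a heartbeat record
    record_len = int.from_bytes(record[3:5], "big")
    claimed = int.from_bytes(record[6:8], "big")
    # RFC 6520 requires the payload plus at least 16 bytes of padding
    # (plus the 3-byte message header) to fit inside the record.
    return claimed + 3 + 16 > record_len

# A classic exploit probe: the record carries 7 bytes but claims 0x4000 of payload.
evil = bytes([0x18, 0x03, 0x02, 0x00, 0x07, 0x01, 0x40, 0x00]) + b"ping"
assert is_suspicious_heartbeat(evil)
# A well-formed 4-byte ping with 16 bytes of padding passes:
benign = bytes([0x18, 0x03, 0x02, 0x00, 0x17, 0x01, 0x00, 0x04]) + b"ping" + bytes(16)
assert not is_suspicious_heartbeat(benign)
```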

ZDNet: Does NSA have any specifically defined responsibility to dig into operating systems and encryption systems and find vulnerabilities? If so, how big an operation is that? Where did something like OpenSSL fit into that responsibility?

Neal: It is an aspect of the protective role. What NSD-42 does say... I can't quote it verbatim from memory... But it basically designates the national manager as protecting and defending national security systems. An essential part of that is understanding the vulnerabilities in those systems. It follows pretty directly from protecting and defending that you need to go out there and look for vulnerabilities in the elements of those systems.

Now, in the old days, this is kind of a key point I wanted to bring up. The systems that were used for national security, for example, for military communications, and those systems that were used by the commercial world in the private sector, were pretty much disjoint. You had your military COMSEC Type 1. That's the stuff I worked on early in my career. Then you had the commercial side, and they were entirely separate. That's no longer the case.

The national security systems still use a lot of the Type 1 COMSEC, but they use a lot of commercial stuff, commercial operating systems, commercial products, commercial encryption...

ZDNet: I'm guessing you include open source in that as well?

Neal: Certainly, yes. OpenSSL is open source, but there's a lot of other open source out there that folks use. For example, the Android operating system, which is used on a lot of mobile devices is an open source operating system. Linux is an open source operating system. We use plenty of that as well.

We have to understand that there are vulnerabilities in all these things. What we try to do is understand those systems to a very deep level so that we can understand where they're strong, where they're weak, how they can be correctly employed, how they can be integrated with other systems. We write guidance for the rest of the national security community about these things.

A lot of that guidance is unclassified, and we made the decision years ago that when that was the case, that we would simply post it to NSA.gov where it could benefit not only our national security customers but anyone else who might benefit from it, and if you go there today, you'll find dozens and dozens of technical documents ranging from little one-page sheets up to multi-hundred page detailed guidance that we've written internally and published. That's a big part of our job, and on Heartbleed, we had guidance issued and posted in under 24 hours.


ZDNet: How big an operation is that to manage those vulnerabilities?

"We are not hiding from the public. We are out there. We're staying out there."

Neal: I can't talk about the number of people specifically engaged in any particular mission function. I can say that we have several hundred people in IAD who are specifically designated to analyze, understand and create guidance for commercial products.

We also have a bunch of operational teams that try to apply this knowledge in the field.

We have several different teams of this kind, but I'll just mention two. One is the Blue Team that go out and work cooperatively with network owners all over the world to understand the vulnerabilities in their networks and fix them, and then we have the Red Team that basically act as adversarial emulation, and they go out and, under controlled conditions, will actually attack U.S. systems in order to reveal their vulnerabilities so that those vulnerabilities can be fixed.

It's an interesting bit of trivia that when the news about Heartbleed broke, our Red Team was working on an annual educational cyberexercise that we sponsor with the military academies where they have to defend and our guys attack, and in just over 24 hours, the cadets at the military academies were being hit with Heartbleed, within the controlled bounds of the exercise, because we thought it would be an excellent educational opportunity for them to see how they dealt with a brand new vulnerability, and it was, we think, a good educational success.

ZDNet: I'm sure they loved you.

Neal: They loved the Red Team coming after them, all right. They know what they're getting into. That's why they're in military academies.

They stumbled at first, but they recovered very quickly, the cadets. They know what they're doing, and within a few hours after that, they had the mitigations installed and up, and they were no longer vulnerable to it.

ZDNet: If you do find a vulnerability, how do you make the decision between exploiting them or helping to get them fixed? Who makes that decision and at what level?

Neal: I'm going to refer you to that blog entry again. As the blog entry says, there is a government-wide inter-agency process for that, and that process is used. It is executed at a classified level because a lot of this has to do with intelligence sources and methods, and every agency will get to have its say. For example, DOD and DHS will get to put their equities on the table and come to a decision about that for the government. It is biased towards the defense.

There was always collaboration on that. This is trying to make it more formal, more structured. Make sure with greater certainty that everybody has their say.

ZDNet: We're dealing with this sort of balance between privacy and security. We have all of these threats, and yet we also have America and a need for privacy and security. How do we balance those things?

Neal: It's not a simple balance. We don't like, within NSA, to say that we're attempting to balance security and civil liberties. When we talk about that internally, we usually draw a rail car on two rails, and you can't have only one rail. You have to have them both.

Now, I'll be frank with you. If it comes down to protecting against a terrorist or protecting against his privacy, I'm afraid that his privacy is not going to be the number one thing on people's minds. His privacy to plan his terrorist act. As a member of the intelligence community, I want to find out about those so that they can be stopped, but that's the intelligence game.

"It's not a simple balance. We don't like, within NSA, to say that we're attempting to balance security and civil liberties.... It's not security versus privacy. It's always both."

The folks who are not the subject of a valid foreign intelligence need, we go to enormous effort within NSA to ensure that their civil liberties and privacy are not threatened, that we are internally transparent with our overseers within the Executive Branch, with our overseers in the Legislative Branch, our overseers in the Judicial Branch, and I've had enough interaction myself with the Legislative Branch side to know that they do not give us a free ride.

They are adamant about knowing what we are up to and ensuring that we keep them informed, and the same is true of the FISA Court, although I have not had occasion to interact with them directly myself. It's not security versus privacy. It's always both.

A lot of that is due to our former director, General Alexander, and I watched him say it enough times, and I believe in it. That's what we do here.


ZDNet: NSA has had a really tough year in the spotlight, but you've still got a really critically important job to do. What can the agency do to regain the trust of Americans and, frankly, those around the world?

Neal: I'll answer part of that…

[Neal deferred to Vanee Vines, a representative from NSA's Public Affairs Office.]

We are not hiding from the public. We are out there. We're staying out there. For example, the RSA Conference was a couple months ago. There was a lot of hoopla around that, people boycotting and so forth. We had a booth reserved on the RSA exhibit floor. We had speakers on panels, and we went. We were out there.

The only way to improve the trust is for us to be out there and transparent. We have extensive outreach efforts with academia that I'm personally involved with, and with industry, and we are continuing those. In fact, in some cases, we've tried to step them up. Now, in some cases, it's a little uphill, but we're working on it.

[Vines echoed his comments, adding that agency officials, aiming to be as transparent as possible, are increasingly engaging in media interviews, public conferences, and events in university settings. She also pointed to a May column by Geoffrey R. Stone, a University of Chicago professor who served on the President’s Review Group.]

Neal: Something else I think was very interesting, and you may have alluded to it briefly is that [former NSA Director] General Alexander brought in several different groups of leading academics, both from legal and compliance viewpoints, from computer science, from privacy, from big data sciences, and I had the privilege to participate in two of those sessions, and in answering their questions and helping them to understand where we're coming from.

"The only way to improve the trust is for us to be out there and transparent. We have extensive outreach efforts with academia that I'm personally involved with, and with industry, and we are continuing those."

I think efforts like that, reaching out to these folks who think deeply about these issues and then have an opportunity to form their own opinions, I think is very important. That's all part of the whole transparency thing.

ZDNet: You folks are working really hard. This is not an easy job to do, what the people in your agency do. How's the morale there? How are people feeling who are working there? Do they feel attacked or are they able to focus on what they're doing? What's the general mood?

Neal: NSA is actually [in] a pretty good place in terms of morale, not to say parking couldn't be better, but the folks that are here on the inside get to see the intelligence that they're delivering every day. They get to see a soldier go home who they gave intelligence to his platoon so it wasn't ambushed, stuff like that, so that's a big boost to morale.

Now, we had some problems over the summer with the budget and sequestration and the furlough and all that. That wasn't easy, but we made it through it. I think that the morale around here is really based on how well we do the mission and when we're doing the mission and we're on game like we are now; we're doing great things, then morale's usually pretty good.

ZDNet: The ZDNet audience consists of some of the more influential IT leaders both in America and across the world. Is there anything you'd like specifically to tell them that I haven't mentioned or asked?

Neal: That's a tricky one. I think something that the tech community should know, and I've said this to some folks that I've talked to at RSA, is we, [Information Assurance Directorate], we, NSA, are still working for the national security of the United States.

That means helping commercial products to get more secure. That means helping standards to be more secure. We haven't backed down from any of those things. We're still working with NIST. We're still working with a bunch of other standards bodies and industry groups. I can't name them all. There are a lot of them.

The industries should know that we're going to continue to do that because from my viewpoint, our dependence on stuff that they create is tremendous. It's got to be strong. We're not out to weaken those products, which then our guys are going to use. They need to be good, and we're incorporating them into our systems in new ways.

You may have noticed what we're doing in the commercial solutions for classified initiative. We're saying that well-tested commercial products integrated and architected in the right ways are strong enough to protect classified information. We're using this stuff along with everyone else.

ZDNet: Let me sneak in one last question since you're doing the university outreach you mentioned earlier. I am always asked how people start a career in cybersecurity. Do you have any advice?

Neal: I get that question, too, sometimes, especially if I'm doing any kind of outreach for young undergraduates, folks like that. There are a couple of ways. The most obvious way is to get a good degree in a STEM discipline, some form of engineering or mathematics, IT networking. My own educational background's computer science.

Of course, I encourage them to go to one of our designated Centers of Academic Excellence because we know about the curriculum at those places, know that it's solid, and that they have good faculty. The main thing is to have a love of solving problems, and get the technical foundation to be able to solve them and then move into solving them.

Especially at schools these days. There are so many opportunities. A lot of them have cybersecurity clubs or participate in competitions such as Capture the Flag or the Collegiate Cyber Defense Competition. There are even places that have degrees specifically in information assurance, cybersecurity where they have it as a concentration or focus within an engineering degree.

That's my advice. Go after solving problems. Get a good technical degree underneath, and then bring your skills to government or to industry. There are lots of places where one can have that kind of career, and there's no shortage of cyber security challenges out there today.
