
Q&A: Adrian Gardner, CIO, NASA Goddard Space Flight Center

NASA is pursuing more powerful ways to deliver data across its global networks of scientists and engineers -- and may even add a Watson-like supercomputer to the International Space Station.
Written by Joe McKendrick, Contributing Writer

In recent years, the National Aeronautics and Space Administration has been actively pursuing more powerful and cost-effective ways to deliver its vast amounts of data across its networks of scientists and engineers, and to share its discoveries with the rest of the world. That's why the space agency has been aggressively adopting the cloud model – both within its own data centers and with outside partners – to increase its computing power.

NASA provided the genesis for the open-source cloud platform OpenStack, which originated as the agency's Nebula cloud project and is now hosted by Rackspace.

I recently had the opportunity to talk with Adrian Gardner, chief information officer of NASA's Goddard Space Flight Center, who has also been active in the Society for Information Management's cloud research. He provides background on NASA's move to the cloud:

SP: How is cloud changing the corporate culture of NASA? Are you achieving more flexibility?

Gardner: It's huge for us, in a number of ways. It will change the business model for compute. The Agency, as well as Goddard Space Flight Center, is experimenting with virtual desktops, where we would actually host an engineering desktop in the cloud. In the field, engineers would have the option to carry laptops, or we could give them the ability to run their virtualized desktops as an application on their iPads. This gives us greater agility.

I also see it as a game changer for the collaborative nature of NASA, with regard to working with foreign nationals. Virtualization would allow us to provide and limit access to a variety of NASA resources in record time. Once a decision has been made to grant access, it would no longer take days or months to reach NASA data; it would be a matter of hours or minutes.

SP: Is the International Space Station in line to be part of NASA's cloud network?

Gardner: I don’t think it will be part of the cloud network, but we have talked about doing things like moving [an IBM Watson supercomputer] up to the space station to do real-time problem analysis and decision support. From an innovation standpoint, I think there are some places where a cloud infrastructure could play a role. We’re talking about a miniature “Watson” on the International Space Station and a larger “Watson” at the control center, and the two would work in tandem to extract information from different sources, including cloud-based sources.

SP: Is NASA's cloud computing effort more internal, with its own private cloud, or external?

Gardner: We're looking at both internal and external cloud options, including both private and public cloud service providers. Three years ago, we initiated a project called Nebula, a cloud platform intended to focus on computational science. We also discussed using the infrastructure for email, calendaring, and the like, but our real aim was NASA's computational science requirements.

As we started out in that endeavor, we began to think about our own internal ability to keep pace with the private-sector cloud service providers and their development activities – the Microsofts and Googles of the world. We entered into what we called a Space Act Agreement with Rackspace, a cloud service provider that had its own cloud platform. They actually adopted our software stack as their capability, and we then open-sourced that capability. It now takes the form of OpenStack, an open-source cloud platform. That, of course, was our initial approach.

We're now looking at ways to leverage a cloud broker. It would allow us to migrate or transition seamlessly between a public cloud and a private cloud. The industry is changing so rapidly... When we first began talking about a broker of clouds, there were only one or two companies that could actually provide that capability; now, six to eight months later, there are more than ten.
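To illustrate what a broker buys, here is a minimal Python sketch, with hypothetical class names and interface rather than NASA's or any vendor's actual API. Workloads are launched through a single broker, so the public or private back end behind them can be swapped without callers changing their code:

    from abc import ABC, abstractmethod

    class CloudProvider(ABC):
        """Hypothetical minimal interface a broker would require of each cloud."""

        @abstractmethod
        def launch(self, workload: str) -> str:
            """Start a workload and return a provider-specific handle."""

    class PublicCloud(CloudProvider):
        def launch(self, workload: str) -> str:
            return f"public-cloud handle for {workload}"

    class PrivateCloud(CloudProvider):
        def launch(self, workload: str) -> str:
            return f"private-cloud handle for {workload}"

    class CloudBroker:
        """Single entry point: swapping the back end is invisible to callers."""

        def __init__(self, backend: CloudProvider):
            self._backend = backend

        def migrate_to(self, backend: CloudProvider) -> None:
            # In this sketch, moving between clouds is one assignment; in
            # practice it would also involve moving data and machine images.
            self._backend = backend

        def launch(self, workload: str) -> str:
            return self._backend.launch(workload)

    broker = CloudBroker(PublicCloud())
    print(broker.launch("image-processing"))  # runs on the public cloud
    broker.migrate_to(PrivateCloud())
    print(broker.launch("image-processing"))  # same call, now on the private cloud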

SP: NASA has its own supercomputers. How will that fit with a cloud strategy?

Gardner: We actually own one of the two supercomputing capabilities within NASA. Instead of a discussion about cloud versus supercomputing, we're now having a discussion of how to utilize cloud within the existing computational landscape – all the way from a desktop, through a mobile device, into a physical data center, into a virtualized data center, into cloud, and finally into supercomputing. There’s a whole spectrum of compute that is available to our scientists and engineers.

Now, the question becomes: what attributes would drive me into one environment versus another? Attributes such as security or privacy may drive us out of the cloud space and into a physical data center; however, if we have a scientist or engineer who has to wait up to three months in the queue for access to the supercomputer, for a job that may take a week to complete, the cloud environment would be an alternative solution, and we could potentially decrease the wait time. We would like to create this kind of agility across the Agency for all of our scientists and engineers to leverage.
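That attribute-driven triage can be pictured as a simple decision function. The sketch below only illustrates the logic Gardner describes; the attribute names and thresholds are assumptions, not NASA policy:

    def choose_environment(sensitive: bool,
                           queue_wait_days: float,
                           job_runtime_days: float) -> str:
        """Pick a compute environment from a workload's attributes (illustrative)."""
        if sensitive:
            # Security or privacy concerns push the job into a hardened,
            # physical data center rather than a shared cloud.
            return "physical data center"
        if queue_wait_days > job_runtime_days:
            # Waiting in the supercomputer queue would take longer than the
            # job itself, so cloud capacity is the faster path.
            return "cloud"
        return "supercomputer"

    # Gardner's example: a roughly one-week job facing a three-month queue.
    print(choose_environment(sensitive=False,
                             queue_wait_days=90,
                             job_runtime_days=7))   # -> cloud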

SP: Is your cloud effort part of the government's Shared Services initiative launched last year?

Gardner: Absolutely. We’ve already begun conversations across multiple agencies to discuss how we would purchase or develop the capabilities that would allow us to create a cloud broker as a shared service among many agencies. Also, from the standpoint of our capabilities in general, there are folks who have come to NASA and asked to leverage our cloud capabilities. That discussion has always centered on the supercomputing capabilities we currently have. We’re definitely starting to self-organize around how we would provide services, not only within the NASA geography, but across many government agencies.

SP: NASA has a technically oriented workforce, with a lot of smart people. It seems natural for NASA to host a federal government cloud.

Gardner: It is. We’ve really focused on the competencies, the things that are necessities for our agency; but when we step back 100 feet or so and start to question whether those capabilities also meet the requirements of other agencies and organizations, we see that there is an opportunity now to think about these requirements in the larger context of shared services.

SP: What is the ultimate capacity of your cloud? Is it limitless?

Gardner: I wouldn’t say that it’s limitless, because there are always going to be challenges. You always have to think about data sensitivity and security. There are limits we must impose as we look at cloud, to ensure that we are making informed decisions about what goes into the cloud and what has to be hosted in other – and probably more hardened – kinds of infrastructure. I think it will provide huge opportunities for a place like NASA. We definitely have some sensitive data and products that we would probably not host in the open, or in a public cloud; however, for the most part, we are probably one of the agencies best positioned to take advantage of the cloud capabilities available in the open marketplace, including open source.

This post was originally published on Smartplanet.com
