NEW YORK — Werner Vogels, CTO of Amazon, is the front man for Amazon Web Services and a technologist who preaches the public cloud but recognizes hybrid is the enterprise reality.
Following his AWS Summit 2014 keynote, Vogels and I talked a bit of shop. The interesting thing to note here is how the AWS messaging has evolved over time. AWS, led by Vogels, used to frame the debate as real cloud vs. fake cloud. Today, AWS recognizes the hybrid cloud reality, but is obviously betting that the balance will tip toward the public cloud side of the equation.
Like its parent Amazon, AWS is customer first. Any new product or service has to be pitched in a way that is geared toward customers. Werner yaps to chief information officers constantly. It's part of his job. CIOs may like the all-cloud vision, but can't simply throw out what they have. Some functions may have to stay on-premise.
Here's a look at a few highlights of our conversation, which covered hybrid cloud, technical debt, mobility, OpenStack and big data.
On AWS's mobility push. Vogels said the cloud and mobility are inevitably intertwined. He also said that more content and data are being consumed on devices. "Younger businesses are mobile first," he said. AWS' role is to take the complexity out of the development process — via central ID management or virtual workspaces — and enable innovation and agility. "CIOs have been giving us feedback. BYOD (bring your own device) is important, but they don't want to manage devices. They want to manage virtualized workspaces and a fully provisioned environment," said Vogels.
In other words, AWS is going where other mobility players are headed. Mobility is more about collaboration and identity management than about the devices themselves. Device management equates to table stakes. It's also worth noting that AWS — along with Google and Microsoft — is going to make life hell for Box and Dropbox on pricing on the doc sharing and collaboration front.
Hybrid data centers. "Hybrid is important to us," said Vogels. "Obviously, we're the public cloud, but the reality is that in the enterprise there will be things on premise." Indeed, the biggest question is how hybrid will be defined in the future. Is it 90 percent on premise vs. 10 percent public cloud? Vice versa? Or something in the middle? Door No. 3 is the correct answer, but good luck defining the middle.
Vogels noted that companies like News Corp. are consolidating datacenters from 40 to six via AWS and that's their definition of hybrid. AWS' plan is to offer a bevy of tools such as virtual private networks and direct connections as well as identity federation to connect to on-premise data centers and let the chips fall. AWS' move to offer VMware management integration highlights how the company wants to plug into the hybrid world. See: AWS vs. VMware: Is a cloud collision inevitable?
A talk by Yinal Ozkan, Principal Solution Architect at AWS, highlighted the nuances. Use cases for AWS abound and range from offloading storage and analytics to the cloud to disaster recovery. For instance, Samsung's Smart Hub TV software runs on AWS, but financial transactions are handled by the on-premise infrastructure, said Vogels. Why? Multiple pieces of Samsung's business relied on that infrastructure for transactions and it was too hard to move. In addition, banks will run customer-facing operations in the cloud, but transactions will run on financial institutions' datacenters.
So how does the hybrid datacenter shift to be more public? Gradually. Vogels said that high performance computing (HPC) is a key area where the cloud comes into play. Companies in industries such as oil and gas and entertainment have invested in HPC systems, but the on-premises resources may be booked for months.
"On premise HPC is so expensive that they are loaded up 100 percent of the time," said Vogels, who added that extra work will have to move to the cloud. External events — and the compute resources needed to analyze them — often drive the move to the cloud.
Legacy infrastructure and technical debt. AWS and Amazon both have technical debt — legacy infrastructure that can't simply be jettisoned — but Vogels said the key is building an architecture that doesn't lock you in.
Vogels said that Amazon has assumed that software running today won't be enough in two years. Software has to be capable of evolving over time. "That means we are not locked into previous generations of our systems," said Vogels. "Of course we have technical debt, but can change our systems and operate at scale. We're in a less disadvantaged situation than our customers are."
It's worth noting that Amazon is a hybrid shop to some degree. Amazon’s retail business mostly runs on AWS, but its cache of product data runs on-premise, explained Vogels. That product information is served up on hardware specifically designed for that use case. "We're leaving that on premise, but developing the next generation for the cloud," said Vogels.
AWS phases out legacy infrastructure based on what customers want. For instance, AWS is on its second generation of instance types and demand is phasing out the older versions. Vogels did say that old systems will remain in other capacities. The average life span of an on-premise HPC system is five to eight years, but researchers are complaining at the two-year mark because they don't have the latest processors. "We can phase those HPC systems into general usage and have a faster refresh cycle," said Vogels.
When it comes to legacy infrastructure, enterprises are looking to retool architecture and future proof rather than lift and shift infrastructure to the cloud, said Vogels.
Big data, MapReduce and Hadoop. Google recently noted that MapReduce was played out and could only go so far. Vogels agrees.
Ultimately, "MapReduce sinks into the lower layer," said Vogels. To use Hadoop and MapReduce, custom analytics work is critical. MapReduce will be part of the equation, but not the entire picture. The popularity of Amazon's Redshift service revolves around fast and simple analytics that MapReduce can't quite provide, he said. There will be plenty of applications for MapReduce, which has a big developer community, but in the end, it's one asset in the big data mix.