I spent much of last week in the depths of a Barcelona conference center at TechEd Europe, catching up with Microsoft staff and talking about the future of Microsoft's enterprise business. There's a lot to report back on, particularly around how Redmond is managing the transition from a predominantly on-premises model to one focused on the cloud.
That makes this TechEd a fascinating event, with a company presenting far-reaching changes in its client and server offerings to an inherently conservative audience of IT professionals. It's not that Microsoft's customers don't want to change - like you they're fascinated by technology and its promises - but on the whole they're working for businesses that move relatively slowly, and that can't take the risks associated with a wholesale transition to the cloud.
So it wasn't surprising that most of the TechEd announcements, including more details about the flagship Cloud Platform System, were focused on Microsoft's hybrid cloud. The hybrid cloud is a key facet of Microsoft's cloud strategy - and where it aims to differentiate itself from its main competitors Amazon and Google.
By letting organizations bridge on-premises infrastructure and the cloud, taking advantage of cloud-specific capabilities to enhance their on-premises services, Microsoft believes it can keep its existing customer base while slowly helping them use cloud services where possible.
That's where services like the new Azure Batch come in. It lets businesses use the cloud to handle complex data processing tasks without needing significant code rewrites, and speeds up processing tasks that could otherwise take days.
The Australian arm of Towers Watson, a global financial services risk management company, has been using the service for a while, to speed up stochastic analysis and to handle much larger problem sets. It's probably an ideal customer, one with a well-defined mathematical modelling problem and a large library of existing code that can be used to handle solutions across a massive compute fabric. The demo I saw involved 1,500 tasks running across 1,000 cores, and that was a relatively small problem.
While the analysis I saw ran in minutes, Towers Watson is using the service to solve problems in hours that used to take weeks to run. The service increases reliability and, with its PaaS scheduler automatically restarting failed jobs, increases speed, allowing analysts to run many more scenarios without having to invest in additional infrastructure.
Azure Batch is an interesting service, one that competes not just with other cloud services, but also with on-premises high performance computing platforms based on GPU processing. However, its key differentiator is that it's a truly general-purpose platform, with no need to change your programming model - or write new code for a new platform, as you have to do with OpenCL or CUDA. You can also take advantage of the different Azure instance models to get the best balance of speed and cost for your problem.
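That job-and-task model is easier to see in miniature. Here's a minimal local sketch of the fan-out pattern that Azure Batch automates at cloud scale: independent stochastic tasks, each identified by a seed, scheduled across a pool of workers and then aggregated. The Monte Carlo function and task counts are invented for illustration - this is the shape of the workload, not Towers Watson's actual model or the Batch API itself.

```python
import concurrent.futures
import random
import statistics

def run_scenario(seed, n_paths=10_000):
    """One independent stochastic task: a toy Monte Carlo estimate.

    In Azure Batch, each call like this would become a task scheduled
    onto a pool of cloud VMs rather than a local worker process.
    """
    rng = random.Random(seed)
    # Toy model: average terminal value of a 10-step random walk.
    total = 0.0
    for _ in range(n_paths):
        total += sum(rng.gauss(0, 1) for _ in range(10))
    return total / n_paths

def run_batch(n_tasks=16):
    # Fan the independent tasks out across local workers. Batch's PaaS
    # scheduler plays this role across thousands of cores, and also
    # re-queues any task whose node fails.
    with concurrent.futures.ProcessPoolExecutor() as pool:
        results = list(pool.map(run_scenario, range(n_tasks)))
    return statistics.mean(results)

if __name__ == "__main__":
    print(f"aggregate estimate: {run_batch():+.4f}")
```

Because each task is seeded and self-contained, a failed task can simply be rerun - which is what makes the automatic-restart behavior of the Batch scheduler cheap to provide.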
That's just one hybrid cloud scenario. Others take things in a different direction, as shown by the new Office 365 APIs, which let you embed elements of the Office 365 service in your apps. Microsoft CEO Satya Nadella has described it as "Microsoft's most strategic API", bringing Office to more than just the familiar Windows desktop. New Android and iOS APIs bring key Office 365 features to third party apps, including users and groups, files, mail, calendar, and contacts.
Azure's growth and metamorphosis is a key element of this hybrid model, as it continues to blend the IaaS and PaaS models - using the Azure VM Agent as a tool for injecting functionality into VMs on creation. Talking to Azure CTO Mark Russinovich at TechEd, it's clear that the old boundaries are going to erode as more and more functionality pushes into VMs through the agent.
The Azure VM Agent is managed via the Azure PaaS services, and can best be thought of as an extension bootstrap, letting you install runtime agents as VMs are initialized. While the initial selection of runtime tools is limited, the opening up of the Azure marketplace to services and tools that can be installed by the agent is an interesting move. Russinovich pointed out that it can lead to installing several different features in a VM at the same time - finally removing the fine line between IaaS and PaaS.
Things get more interesting with the inclusion of Docker containers on the Azure platform. As Russinovich emphasizes, containers are orthogonal to the agent model, providing packaging for applications and code. He suggests that containers will provide an effective deployment model for micro-services, allowing rapid scale-out of services by installing an appropriately configured container in each new server instance. As containers are agnostic to their content, they can deliver either IaaS bits or, more interestingly, a PaaS runtime with a PaaS micro-service. As Russinovich says, "The thing is we've been thinking about this deeply in Azure for a while. We're now seeing it emerge and come out more rapidly."
The Windows version of Docker is going to be native, and while not all the details have been locked down yet, it's under active development. Russinovich notes that the key question is: "How far can we take application compatibility? Applications can be complex, so we need to understand which services need to be virtualized and which can be virtualized easily, a per container view." The easiest to deal with are fully isolated apps; the hardest are those that rely on Windows Services.
One element of Docker that's interesting to Russinovich is its support for stacked virtualization. You start with a base image, then layer additional images, each with its own virtual file system, on top - letting you compose multiple containers and images through references back to the base image. That means you can use differential images to quickly customize existing images without requiring large amounts of storage. The result is a way of quickly deploying composable services by layering on top of a base set of functionality.
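The layering model is worth a moment's unpacking. A container's file system is the union of its read-only layers plus a thin writable layer on top, so two containers sharing the same base pay the storage cost of the base only once. The sketch below models that union with Python's `ChainMap` - purely an illustration of the semantics, not Docker's actual union-filesystem implementation, and the paths and layer contents are invented.

```python
from collections import ChainMap

# Each layer maps paths to file contents. Lower layers are read-only
# and shared between images; only the differences are stored per image.
base_os = {"/bin/sh": "shell", "/etc/hosts": "localhost"}
runtime_layer = {"/usr/lib/runtime": "PaaS runtime bits"}   # differential layer
service_layer = {"/app/service": "micro-service code"}      # differential layer

# A container's view is the union of its layers, topmost layer first:
# lookups fall through to lower layers, just as in a union filesystem.
container_fs = ChainMap(service_layer, runtime_layer, base_os)
print(container_fs["/bin/sh"])   # resolved from the shared base layer
print(sorted(container_fs))      # the composed, layered file system

# A second container reuses the same base and runtime layers, so its
# extra storage cost is only its own thin differential layer.
other_fs = ChainMap({"/app/other": "another service"}, runtime_layer, base_os)
print("/app/service" in other_fs)
```

That fall-through lookup is why "differential images" are cheap: customizing an existing image only ever adds a new top layer.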
Russinovich said: "Azure is driving this, working with the Windows team, implementing in Server vNext. We're working with client, with server, with Azure together." That's where the hybrid aspect of containers comes to the fore, as part of what Microsoft is calling a "converging consistency effort with Azure and Windows Server". With future versions of Windows Server you'll be able to use agents, containers, and other key Azure features on premises as well as in the cloud.
The blending of the cloud with the now traditional world of PCs, servers, phones, and tablets wasn't the only thing Microsoft talked about in Barcelona. Building on its Azure Machine Learning announcements from earlier in the year, it unveiled a series of related services focused on the growing Internet of Things market.
We're used to thinking of thousands, or millions, of devices on our networks. As we connect our sensor meshes to the internet, we're suddenly looking at tens, even hundreds of millions of devices, all delivering data to our networks and servers. That's where cloud-based big data solutions come into play, applying their hyperscale lessons to the massive streams of data those sensors will deliver. Azure's new Data Factory and Stream Analytics are PaaS tools that take those streams of data, and help deliver usable results quickly and in familiar applications.
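To give a flavor of what that stream processing looks like: a common pattern is to group an unbounded stream of sensor readings into fixed, non-overlapping "tumbling" windows and aggregate each one. The sketch below is a one-shot, local version of that idea with invented data - services like Stream Analytics express the same aggregation declaratively and run it continuously over live event streams.

```python
from collections import defaultdict

def tumbling_window_averages(readings, window_seconds=60):
    """Group (timestamp, device_id, value) readings into fixed,
    non-overlapping windows and average each device's values per window."""
    buckets = defaultdict(list)
    for ts, device, value in readings:
        # Every reading falls into exactly one window, keyed by its start.
        window_start = ts - (ts % window_seconds)
        buckets[(window_start, device)].append(value)
    return {key: sum(vals) / len(vals) for key, vals in sorted(buckets.items())}

# Simulated sensor stream: (unix timestamp, device id, temperature).
stream = [
    (0, "meter-1", 20.0),
    (30, "meter-1", 22.0),
    (45, "meter-2", 5.0),
    (61, "meter-1", 25.0),  # falls into the next 60-second window
]
print(tumbling_window_averages(stream))
# → {(0, 'meter-1'): 21.0, (0, 'meter-2'): 5.0, (60, 'meter-1'): 25.0}
```

The hyperscale part is everything this sketch leaves out: partitioning the stream across machines, handling late and out-of-order events, and keeping the aggregation running forever - which is exactly what a managed PaaS service takes off your hands.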
TechEd Europe 2014 clearly marks a watershed in the evolution of Microsoft. It may not be taking its customers on a journey to the cloud as quickly as Adobe, but it's well on the way. With a substantial enterprise install base, a rapid change would be a significant risk, so a slow, steady move to offering value-add hybrid cloud solutions like Azure Batch makes a lot of sense.
With Microsoft's cloud you can go all-in if you want (and the folk in Redmond would dearly love you to make that jump), or you can stay with your on-premises hardware and software as long as you want, using the cloud where it makes sense: for backup, for disaster recovery, for cross-platform coding, for large-scale computation, and above all, for the Internet of Things. Interesting times, all round.