The technology industry's headlong pursuit of cloud computing has saddled it with so many misconceptions that it is sometimes difficult for customers to make informed business choices. ZDNet UK has looked at the most common myths, and debunks five of them here.
Myth 1: Cloud equals SaaS, grid and utility computing
The term 'cloud computing' has been hijacked by anyone wanting to make a service sound hip and interesting. Jumping on the latest bandwagon is a favorite pastime in the technology industry, but in this case it is creating confusion among customers, who are unsure what they should be asking for or what they're likely to get for their money.
So to clarify: cloud computing is a form of outsourcing by which vendors supply computing services to lots of customers over the internet. These services can range from applications, such as customer relationship management, to infrastructure, such as storage and the provision of development platforms.
The services are provided by massively scalable datacenters running hundreds of thousands of CPUs as a single compute engine, using virtualization technology. That approach means workloads are distributed across multiple machines — which can also be located in multiple datacenters — and capacity can be allocated or scaled back according to a customer's needs.
Moreover, because applications are multi-tenant in nature — a single instance of a package serves many customers at once, and several such instances can run on the same machine — system resources can be shared among a large pool of users, which reduces costs.
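In code terms, the multi-tenant idea boils down to one running instance serving many customers while keeping their data logically separate. A minimal sketch in Python — the class, methods and tenant names are invented purely for illustration, not any vendor's actual design:

```python
# Minimal sketch of multi-tenancy: one application instance serves
# several customers ("tenants"), sharing the process and its resources
# while keeping each tenant's data logically isolated.
# All names here are illustrative, not a real vendor's API.

class MultiTenantStore:
    def __init__(self):
        self._data = {}  # one shared structure for all tenants

    def put(self, tenant, key, value):
        self._data.setdefault(tenant, {})[key] = value

    def get(self, tenant, key):
        # each tenant can only see its own namespace
        return self._data.get(tenant, {}).get(key)

store = MultiTenantStore()            # a single shared instance
store.put("acme", "plan", "gold")
store.put("globex", "plan", "silver")

print(store.get("acme", "plan"))      # each tenant sees only its own data
print(store.get("globex", "plan"))
```

The cost saving comes from that sharing: many customers amortize the same hardware and software instance rather than each running their own copy.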
Software-as-a-service (SaaS) is one of cloud's most recognized manifestations, but the two are not synonymous: not all SaaS offerings are massively scalable. This means that, while some SaaS offerings are cloud, cloud is not simply SaaS.
Grid computing, on the other hand, comprises a networked cluster of heterogeneous, loosely coupled machines. These machines work together to solve a single, generally scientific or technical problem that is either CPU-intensive or requires access to large amounts of data.
The clusters may be located either within a single organization or form part of a larger-scale public collaboration between many organizations, but each machine runs software that apportions pieces of a program to be processed. This means grid does not provide customers with an outsourced service but rather a way of clustering disparate machines to harness idle compute power.
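The scatter/gather pattern behind grid computing can be shown in miniature: split one heavy problem into chunks, process each piece independently, then combine the partial results. In the sketch below the "machines" are merely worker threads on one computer (and CPython's GIL limits true parallelism here); a real grid ships each chunk to a different machine over the network. The problem and chunk sizes are invented for illustration:

```python
# Grid computing in miniature: apportion pieces of one CPU-intensive
# problem (counting primes below 100,000) to a pool of workers and
# combine the partial results. A real grid distributes the chunks
# across many loosely coupled machines rather than local threads.
from concurrent.futures import ThreadPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) -- the deliberately heavy piece of work."""
    lo, hi = bounds
    def is_prime(n):
        return n >= 2 and all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(1 for n in range(lo, hi) if is_prime(n))

# Apportion the range 0..100,000 into four chunks, one per worker.
chunks = [(i, i + 25_000) for i in range(0, 100_000, 25_000)]
with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(count_primes, chunks))

print(total)  # the combined answer
```

The key property is that the chunks are independent, so idle compute power anywhere in the grid can pick one up.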
Utility computing, meanwhile, refers to the packaging of computing resources as metered or subscription services in a similar fashion to those provided by water or gas companies. As a result, cloud can be purchased in a utility computing format, but the one is not predicated on the other.
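The metering analogy translates directly into how such services are billed: usage is measured per unit and summed, just as a water or gas meter would. As a rough illustration — the resource names and rates below are invented, not any provider's actual tariff:

```python
# Sketch of utility-style pricing: metered resource use billed per
# unit. Rates are made up for illustration; real providers publish
# their own tariffs.

RATES = {
    "cpu_hours":   0.10,   # $ per CPU-hour
    "storage_gb":  0.15,   # $ per GB-month stored
    "transfer_gb": 0.12,   # $ per GB transferred out
}

def monthly_bill(usage):
    """Sum metered usage x unit rate, as a utility meter would."""
    return sum(RATES[resource] * amount for resource, amount in usage.items())

usage = {"cpu_hours": 720, "storage_gb": 50, "transfer_gb": 100}
print(f"${monthly_bill(usage):.2f}")
```

The point of the myth is that this billing format is optional: cloud services can be sold this way, but a metered tariff does not by itself make something cloud, and vice versa.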
Myth 2: Cloud computing will take over the world
According to some pundits, it would appear the end of the IT department is imminent. Apparently, IT directors will be throwing out every scrap of kit as soon as possible and rushing to the new model.
But to inject a note of reality, such a cavalier, rip-and-replace approach has never come to pass before, no matter what the hype-mongers have said, and it is unlikely to happen now. Compare the situation with that of client-server in the 1990s — despite commentators widely proclaiming the death of the mainframe, it still has its place and is alive and kicking in many IT departments to this day.
Tom Austin, head of software research at analyst firm Gartner, believes that people in the 1990s "had a fundamental misunderstanding of how business, technology and economics work, and anyone saying that cloud is going to kill the IT department is engaging in the same logical errors".
While he acknowledges the importance of the model and says it would be "a fatal mistake" to ignore the advantages it offers in boosting staff productivity and cutting costs, Austin also indicates that the cloud is likely to act as a complement to in-house systems rather than a replacement for them. "You can't ignore it, but also don't fall in love with it because, like everything, it has its place. The death of the IT department is greatly exaggerated," he says.
This means that, rather than going for a blanket approach either way, IT directors will need to understand what cloud services make most competitive sense.
Because most IT departments are overwhelmed with work, much of which is routine and mundane rather than strategic and innovative, Jonathan Yarmis, vice president of disruptive technology at AMR Research, recommends outsourcing "low-value things" or "the commodity elements of business operations" to cloud specialists that can do a better job more cheaply. Such activities include commodity application access, email or content archiving.
Since most organizations are likely to adopt some elements of cloud computing over the next five years, the role of the IT director is expected to change accordingly: they will become orchestrators of both internal and external services rather than gatekeepers.
Myth 3: You can use competing cloud services
The market for cloud services is still only in its infancy, and adoption has consisted of limited or departmental trials. One of the inhibitors to further uptake is the piecemeal nature of the provider market.
Whatever cloud purists may claim, says Jonathan Yarmis of AMR Research: "For users looking at this stuff today, the last thing they want is to deal with 23 vendors each providing a small portion of the cloud and then being left to integrate everything."
Apart from the likely management and support headaches such an approach would entail, the problem is that providers may, for example, use a range of back-end databases that "end up with weird burdens when you put them together". Another issue is that back-end applications may be upgraded at different times, leading to system and device incompatibilities. Therefore, Yarmis says: "People want one head on a plate."
However, as the market matures, it is likely that such challenges will sort themselves out. On the one hand, as vendor consolidation inevitably takes place, offerings will broaden out and become less niche. On the other, large service providers such as IBM are likely to spot the advantage of becoming aggregators for their own and third-party services, sorting out many of the integration and management issues in the process.
Myth 4: Flick a switch and your IT shifts to the cloud
The reason why cloud services are comparatively cheap is they are not customized for individual clients, which allows providers to take advantage of economies of scale to reduce costs. While this situation may change as the market becomes more sophisticated, organizations will experience problems finding bespoke or vertical market-specific services at the moment.
Another challenge is that many IT departments still have some way to go in moving to an IT services-based approach themselves or towards simplifying their infrastructure to such an extent that elements can be outsourced to cloud providers in a straightforward fashion.
In a storage context, for example, Jon Collins, service director at consultancy Freeform Dynamics, explains that organizations have been struggling to introduce the concept of tiered infrastructure for years, which is the principle behind information lifecycle management.
The problem is that too few firms have worked out "which data should go where from a business point of view". This failing means moving everything to the cloud would not provide any real benefits, as the fundamentals have not been sorted out. It would merely "create a new set of network dependencies because the data is no longer in the same datacenter", Collins says. He adds that such a move would, moreover, be a "massive business risk" for all but the smallest organizations, because such services are "untested, unready, overhyped and don't come with any guarantees" or service-level agreements.
Another concern is security. While vendors will tell you that they have the resources to make their datacenters more secure than you ever could and that resilience is better because data is distributed and backed up in geographically dispersed locations, that is not entirely the point.
For some organizations, including those in the public sector, distributing data around the world can cause problems for risk management and regulatory compliance. Some enterprises may also have an issue with sharing the same equipment with rivals, even if application instances are quite separate.
Other risks to be managed include sorting out intellectual-property rights, establishing adequate security controls, including appropriate staff-vetting procedures, and working out what happens if suppliers go out of business.
Myth 5: Switching cloud vendors is easy
Although technically there are ways of switching data between one cloud provider and another, few vendors have come up with formalized procedures or guarantees about how they will do it, which is a concern if customers decide a given service is not right for them.
Another challenge is simply the amount of traffic the internet can handle, as evidenced by complaints from ISPs about the amount of bandwidth being consumed by the BBC's iPlayer video service.
"The internet is not an infinite resource so as soon as you want to do a big movement of data, things get very sticky. Just moving a couple of gigabytes onto a USB stick can take 10 minutes, but doing it over the internet will take much longer even with a high-bandwidth connection," Jon Collins of Freeform Dynamics says.
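Collins's point holds up to back-of-envelope arithmetic. The sketch below estimates how long a bulk data move takes at a sustained line rate; the archive size and connection speeds are illustrative figures, not drawn from any particular provider:

```python
# Back-of-envelope check: hours needed to move bulk data over the
# internet at a sustained line rate. Sizes and speeds are
# illustrative examples only.

def transfer_hours(gigabytes, megabits_per_second):
    """Hours to move `gigabytes` of data at a given sustained rate."""
    bits = gigabytes * 8 * 1000 ** 3            # decimal GB -> bits
    seconds = bits / (megabits_per_second * 1_000_000)
    return seconds / 3600

# Moving a hypothetical 2 TB archive:
for label, mbps in [("8 Mbps ADSL", 8), ("100 Mbps leased line", 100)]:
    print(f"{label}: {transfer_hours(2000, mbps):.0f} hours")
```

At 8 Mbps a 2 TB archive takes roughly three weeks of continuous transfer, and even a 100 Mbps line needs nearly two days — which is why shipping physical disks can beat the network for large migrations.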
As a result, he points out, one cloud provider offering an archiving service is forced to load storage disks onto trucks and aircraft rather than transfer data online because there is simply not enough capacity to do so. But this scenario also raises the question of whether there is sufficient bandwidth to cope at the moment with a large number of organizations accessing IT services over the internet.
Collins thinks not. "Realistically, very few organizations would bet their entire company on the web and are unlikely to do so in the next 10 years," he concludes.