'The cloud is just someone else's computer', runs the joke. But if you're saying that, the joke is on you, because it means you don't understand what the cloud actually is.
So many people misuse the word 'cloud' that you can be forgiven for thinking 'in the cloud' means 'over an internet connection'.
It doesn't. 'The cloud' means something very specific.
The simplest definition of cloud is a data centre that's full of identical hardware that no-one ever touches except to unpack it on day one and throw it away when it fails; in between, every deployment, update, investigation, and management process is automated.
The unit of compute and storage in cloud isn't a server or even a cluster; it's a stamp, because you 'stamp' them out as identical units.
With Azure, for example, the smallest stamp size is 800 to 1,000 of what you can call servers (20 of which run the management software). Some cloud infrastructures distribute storage throughout the same boxes that house the CPUs, while others cluster it in storage arrays. Still others put a battery in every box instead of having a UPS.
The network routers in a stamp might be more 'servers' running networking software, or they might be dedicated routers controlled by the same management system as the rest of the stamp. It's a completely different level of abstraction from the way we think about servers, from how the stamp is deployed and managed, to how workloads run in the cloud infrastructure.
You don't look at the firmware level of the RAID card; you deploy the same setup to 100,000 servers and if one of them doesn't work, you throw it away (eventually, anyway -- a failed motherboard or hard drive could sit in the stamp for a month or even a year with the management system routing around it).
You do that with commodity hardware rather than fault-tolerant, redundant hardware because that makes it cheap enough to do at scale, and because doing it at scale means you don't need expensive hardware to achieve redundancy: you can do it in software.
Software-defined would be a better term than cloud for much of this: software-defined networking, software-defined storage, software-defined computing.
The term 'cloud' comes from the network diagrams of old, where architects drew one to show an abstraction: either a distant data centre, or the fact that you didn't have to care what the network protocol was going to be. It's used to refer both to the infrastructure of identical systems managed at scale by automation, and to the 'someone else does the setup and management' services that run on that infrastructure, where 'someone' is an automated system as well.
What about the few people working on that cloud service who do anything other than writing software and automating systems? They've been through much more careful security screening than the people working with the computers you own, unless you're in the habit of taking fingerprints at job interviews and running background checks.
If you're using a PaaS service like storage, or an 'as a service' option that gives you software like Hadoop or Jenkins, or a data warehouse that would otherwise be complex and time-consuming to set up, all at the click of a checkbox, tell me how much you care about the computer running it not being something you own and have to manage.
The 'someone else's computer' crack can mean a few things. It can mean that someone distrusts the cloud because they can't fondle the switches and caress the levers and change the settings themselves, and they don't know enough about hyperscale cloud to understand why that's a good thing. Or it can mean that someone knows very well what cloud means but is fed up with dealing with people who don't, and wants to remind them that the cloud is made up of computers, that the laws of physics still apply (maybe you need to care about network latency and whether your storage and your computing are in the same place), and that if regulation matters to you, you need to pick a cloud service that meets those regulations.
It's just that saying 'it's full of servers' does nothing to educate people about how to get the most out of cloud by designing apps and workloads to use cloud patterns and take advantage of cloud strengths. In fact, if you think of cloud as 'just servers', you're far more likely to miss out on the benefits of cloud.
The VM image you run 'in the cloud' might be special and unique, if you're using IaaS as a cheap way of doing virtualisation -- but don't mistake that for using a cloud service. The cloud service is the IaaS platform that rolls out your image whenever you need it, migrates it to another server if there's a problem, turns it back off when you don't need it, and counts up what resources you used and are going to get charged for.
If whatever is running on your VM has a single point of failure, or a flaw like a log file that fills up every day, putting it on a cloud IaaS server won't fix that any more than hosting it in a third-party data centre with good old-fashioned virtualisation would. But you can use the time and money you save not administering the physical server it used to run on to fix whatever the real problem is.
So all of this means the cloud is 'a hyperscale, automated computer farm run by someone who's better at automation and security than you, who can buy electricity and servers and network connectivity more cheaply than you because they buy so much of it, and whose benefits you only get in full if you design for them'. Not that catchy, so I can see why it hasn't taken off as a tagline.
It's not just cloud that gets this treatment: the latest 'I don't understand this new technology' confusion is about 'serverless' computing. How can it be serverless, people joke, when there's obviously a server running it?
Serverless computing is like driverless cars: there's still a driver, just not one you can see. You don't need to ask if the driver of the driverless car is hungry or tired or drunk or needs to stop for a bathroom break. If we ever let driverless cars drive on our roads, it will be because we don't have to care about the driver, just the fact that they will take us where we want to go. Serverless computing -- where you write and run code on a cloud service without caring about the hardware or operating system of the computer it runs on -- might not be the perfect term, but it's a good enough description.
You don't care about the server your AWS Lambda or Azure Functions code runs on; you don't care if it takes a five-minute break between serving a website and running a Monte Carlo simulation, or if it's a brand new server. You don't care if it's running Red Hat or Ubuntu or Windows Server, or how much RAM it has, or when it was last patched.
You just care that Amazon and Azure and the other serverless providers are so efficient at deploying images to the servers in their cloud and monitoring them to make sure they're healthy, that it's not worth charging you for the work of making the image available -- just the computing you do on it, using an interface that abstracts away everything about the server that you don't need.
This is software-defined computing that's a step beyond virtualisation; the CPU that runs the computation is a fungible resource that's allocated by software, at whatever level of abstraction you want. If you want a server to run your VM, you can have that. If you want to run PHP, Node or .NET code without caring where or how it runs, you can do that. If you want to stream sensor data from IoT devices or index a website or do any other arbitrary computation, you don't need a server to do that -- you just need the compute power.
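That 'just write the code' model can be sketched with a minimal, Lambda-style function in Python. This is a hedged illustration, not any provider's canonical example: the handler signature follows the shape AWS Lambda uses for Python functions, but the event payload and field names here are hypothetical.

```python
# A minimal sketch of a serverless function. The platform invokes handler()
# with the trigger payload ('event') and runtime metadata ('context'); you
# never pick, patch, or even see the server it runs on, and you pay only
# for the time the function actually executes.
import json

def handler(event, context):
    # 'event' here is a hypothetical payload; real events depend on the
    # trigger (an HTTP request, a queue message, an IoT reading, and so on).
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```

Locally you could call `handler({"name": "cloud"}, None)` to exercise the same logic; in the cloud, the platform decides where, when, and on what hardware it runs.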
Again, 'don't-care-about-the-server computing' is more accurate and less catchy. So, feel free to keep misunderstanding and mocking the term. Just don't be upset when your competitors take advantage of it to save money, build better systems, and get happier customers. The joke's on them for not caring about the server, right?