Computing's low-cost, Cloud-centric future is not Science Fiction

Summary: The future of personal computing is cloud-centric, built on inexpensive, power-efficient and nearly disposable mobile, desktop and set-top devices.

[Image: Blade Runner cityscape]

In early 2009 I wrote an article called "I've seen the future of computing: It's a screen." It was an almost Sci-Fi sort of piece, projecting what I thought the personal computing experience might resemble ten years into the future, in 2019, based on the latest industry trends at the time. It was the second such piece; the first I wrote in 2008.

In May 2011, I wrote another speculative piece about what I thought personal computers would be like in the year 2019.

Late last year, I imagined another speculative, futuristic scene, portraying the shift toward e-commerce and the decline of brick-and-mortar retail shopping.

Futurist thought exercises such as these are always fun, but with any sort of long-range prediction there are inevitably things that are easy to miss and get so wrong that you fall flat on your face. Futurism never gets everything right, but sometimes it can also be dead-on, flat-out uncanny in its accuracy.

Excellent examples of both the complete misses and the "holy crap, they were right!" predictions can be found in classic Sci-Fi movies like Stanley Kubrick's 2001: A Space Odyssey (1968) and Ridley Scott's Blade Runner (1982).

In 2001, Kubrick is way ahead of his time in his depictions of manned and commercial space travel and the colonization of the moon, as well as true artificial intelligence, things which are probably at least several decades away. Still, the technologies to accomplish such feats are definitely within our reach if the world's governments can cooperate and establish clear goals to achieve them.  

But in 2001, Kubrick also shows working tablet computers and personal video conferencing, technologies which have only recently become commonplace. In 1968, when the film was first released, the forerunner to the Internet, ARPANET, was still being developed at the US Department of Defense, so the concept of a worldwide connected computer network accessible to the average Joe, let alone to the military or academia, was not yet part of the common Sci-Fi vernacular.

Scott's Blade Runner, like Kubrick's 2001, is also very much ahead of its time. Dystopian futurism is one of the film's huge themes, depicting flying cars, giant overpopulated and polluted cities with towering 100-story buildings, and genetic engineering gone out of control. Thirty years after its release, it is still considered a Sci-Fi masterpiece.

We don't have flying cars yet. However, personal computers and mobile devices are everywhere in 2012, something we see a lot of in Blade Runner. And yet the very concept of the PC, and an entire industry dedicated to it, was brand new when the screenplay was being developed and filmed.

And while we don't have the ability to create genetically engineered, custom-made "replicants" of animals and human beings as it is depicted in the movie, life sciences have made tremendous advancements since the film was first released, such as the ability to map the human genome and create genetically modified organisms for commercial use. The cloning of plants and some animals is now common practice in modern agribusiness.

The massive influence of globalization, Japanese culture and Japanese multinational corporations on American society is a huge part of Blade Runner's futurism, and something we very much take for granted today.

Little did we know in 1982, when the film was released, that not only would Japan surpass the United States as a producer of electronic goods, but that within 30 years it would eventually cede its position as the world's most dominant producer of technology to China, Korea and other Asian as well as South American countries.

My 2009 "It's a Screen" article describes a future centered around the Cloud, server-based computing, virtualization, low-power and low-cost ARM-based devices. The iPhone's App Store at the time was a less than a year old and Android was only beginning to gain market traction with the recent release of the Motorola Droid on Verizon. The iPad was a year away and certainly, there were no Android tablets.

In 2009, the idea of consumer-driven, Cloud-based content consumption and data storage was still very much in its infancy. At the end of 2012, we have our choice of Cloud content and storage providers: Google, Apple, Amazon and now, with the launch of Windows 8, Windows Phone 8, Windows RT and the Windows Store, Microsoft.

In 2009 the Chrome browser itself was less than a year old, and the idea of a Chromebook was probably just a twinkle in Google's eye.

This week, Google and Samsung released a new $250 version of the Chromebook. There's not much new or even innovative about this particular device, especially from a UX standpoint -- it's the same Chrome OS we've seen before, except that it now runs on very inexpensive, ARM-based hardware.

So it's understandable that a lot of prominent tech industry bloggers have received this with a yawn, especially since we're only a few days away from the likely introduction of the iPad Mini, as well as Microsoft's Windows 8 OS and Windows RT tablets, not to mention another Google Android event rumored to be the launch of several new Nexus-branded Android devices.

Granted, there's a lot more sexiness and pure "wow" factor in these devices than in the new Chromebook.

But that doesn't make the launch of the new Chromebook any less important than a newer, smaller iPad or the next generation of Microsoft's OS.

The significance of this particular product release isn't about UX and apps. It's about bringing the cost of computing down to affordable, almost disposable levels, and leveraging the Cloud to do the heavy lifting we've previously relied on much more powerful personal computers to do.

For the past 30 years, the driving force behind personal computing has been Moore's Law -- the quest to double transistor density, and with it performance, roughly every 18 to 24 months.

While the price of personal computers has dropped substantially over the last 30 years, and they have become exponentially more powerful, they are still expensive and inaccessible to many people, particularly in poor urban and rural population centers in the United States and other parts of the Western world, as well as in developing countries.

They are also prone to quick obsolescence, require regular software maintenance, can be expensive to repair, and require considerable support infrastructure. And given the state of the current world economy, the quality, longevity and reliability of computer components have also declined substantially. To use the old adage, they definitely "don't make them like they used to."

So what the Chromebook truly represents is moving away from Moore's Law as the key metric in the development of personal computing.

Instead of doubling processor speed every year being the semiconductor industry's prime goal, we're going to see processor and component integration increase year over year, moving away from expensive laptops and desktops with many components requiring complex and expensive manufacturing processes, toward single-chip systems that are much more power efficient, cheaper to produce, and retail for under $300.

That Samsung can produce the new Chromebook using its own processor foundries, its own RAM and flash storage, and its own display technology is a massive vertical integration achievement in and of itself, and should not be ignored.

It's something that even Apple, which is a veritable master of supply chain management, still does not have the capability to do.

So what will computing's future really look like in 2019? Who will be the leaders and what will be the predominant platform? I'm not going to say that it's going to be Apple, Google, Microsoft, or some other player which may emerge. From a platform standpoint, the playing field is still wide open. It would not surprise me at all if every single one of those companies carves out a distinctive niche for their respective ecosystems, acting as feudal lords over their Cloud estates.

What I can tell you is that the rules have changed, and the balance of power has shifted -- to empower the user, in a far cheaper, much more ubiquitous and much more centralized way than we ever imagined.

Topics: Cloud, Apple, Google, Microsoft, Mobile OS, Smartphones, Tablets, PCs

About

Jason Perlow, Sr. Technology Editor at ZDNet, is a technologist with over two decades of experience integrating large heterogeneous multi-vendor computing environments in Fortune 500 companies. Jason is currently a Partner Technology Strategist with Microsoft Corp. His expressed views do not necessarily represent those of his employer.

Talkback

16 comments
  • The difference between movies and reality is....

    that in movies, voice input works flawlessly.
    Userama
  • Quite the opposite!

    "What I can tell you is that the rules have changed, and the balance of power has shifted -- to empower the user, in a far cheaper, much more ubiquitous and much more centralized way than we ever imagined."

    I disagree completely: the user is losing power as we move to tablets and cloud computing:

    - devices like the iPad and Surface remove hardware choice and induce vendor lock in and all the attendant restrictions of global corporates

    - dependence on the cloud services of the same global corporates will, like the monopoly effect, tend to make it extremely difficult for a customer to switch

    - public cloud dependence will allow corporates to control pricing, retaining cost benefits due to technology advances instead of passing them on to customers (see for example MSFT's subscription for Office 2013)

    While I am totally in favour of cloud principles ... the incumbent corporates are attempting something that has never been achieved in the last 5 industrial revolutions: in previous cycles the incumbent corporations either went out of business or were marginalised. The likes of MSFT and APPL will arrange things to ensure not only their survival but extend their profit ... when the cost of computing should be dropping by orders of magnitude.

    I'd say JP has it completely wrong.
    jacksonjohn
    • Total agreement

      I am in agreement with johnfenjackson. Could not have said it better myself.
      JP is on a cloud somewhere !!!
      shearer@...
    • "corporates"?

      "incumbent"?

      Forced verbiage aside, I agree with you
      HypnoToad72
  • The future of video conferencing & my opinions on why it's not popular yet.

    Speaking of future tech, in order for video conferencing to really become mainstream, one key requirement still must be met.

    People need to interact with the person they are talking with. For the most part, that involves direct eye contact. In my experience, direct interactive eye contact doesn't occur on a desktop monitor. (web cams are placed in a computer setup where it is convenient for the design engineer to do so, not where it would do the most functional good.)

    Here is what needs to be done to fix that. Somehow, the video camera needs to be directly "behind" the eyes of the person being displayed on the monitor screen and not viewable to the other person or persons speaking to the display video image. There have been a few patent ideas discussed that utilize "transparent" display monitor tech to accomplish that goal.

    As for how that tech can be modified or optimized to allow for multiple locations on a display monitor where the video conferencing "window" might be placed - well - I haven't a clue how that could be accomplished yet.
    kenosha77a
  • We were using cloud (of a sort) computing in the 1970s

    on mainframe computers with time-sharing operating systems, where input was from terminals connected over dedicated lines (for example, an IBM 360 series mainframe with 3270 terminals). The "cloud" in those days was on-site disk storage hooked to the mainframe, and was secure against data loss if regular backups were being performed. We were more or less locked in to a single vendor for the mainframe, terminals, and disk drives (although some mainframe clones, as well as compatible terminals and disk drives, existed).
    oldnuke69
  • The future is in devices that work anytime, anywhere.

    The future is in devices that work anytime, anywhere.

    And don't break when you lose your wireless connection.

    Which is actually how it's turning out.

    Most of the apps on my iPhone work that way - they cache everything and run locally as needed, and sync when there's a connection.

    So - why would I give up the ability to work anytime, anywhere?
    CobraA1
    • So having a personal life or privacy

      Means nothing to you?
      Happy neo-slavery.
      HypnoToad72
  • Cloud is nothing more than Internet servers rebranded

    Rebranded why? Because calling it a set top box system would infuriate the Internet users.
    ggibson1
  • Still overpriced and useless

    The Chromebook at any price is still an underpowered, almost useless laptop. It still won't sell, because people can buy a true laptop with a 320 GB hard disk, a full OS and the ability to run a lot more software than is available to a browser. And for the same price. And don't quote me the $1000-plus price of some laptops. There are many available for under $300.

    You say, "The significance of this particlular product release isn't about UX and Apps. It's about bringing down the cost of of computing down to affordable, almost disposable levels..."

    First off, the word is spelled "particular." Secondly, the chromebook may be part of decreasing hardware, but it is not a sign of decreasing costs for users. What all this rush to the cloud is really doing, is changing the way we pay for software. No longer will you be able to make a one time purchase of a software package then use it for as long as you want and have equipment to do so. Instead, the software and server companies want you to pay them a monthly fee to use the software. That's a monthly fee forever, please note. It has been demonstrated in many articles that SaaS will result in considerably increased revenue for the software companies. Who pays that increase? We do.

    What if everyone who purchased Windows XP and is still using it had been forced to pay a monthly fee? By today, they would have paid vastly more money than they did. That's the future that MS, Adobe, Apple, etc. want for you.

    Hardware may be getting cheaper and more powerful, but users will still be paying for their computer usage. Factor in the additional costs of bandwidth, slower processing through the internet, vendor lock-in, downtime when you can't use your clouds, etc., and the future doesn't look as rosy as you paint it.

    Have a nice day,

    Doc
    Doc.Savage
    • Well said

      The new normal of "freedom" is increasingly interesting
      HypnoToad72
  • Moving computing away from the user works in some situations...

    ... but not all.

    I think you are probably right that the future of business computing is, ironically, back in the data center that business users escaped from with their desktop computers.

    In case anyone has forgotten, this is really a move "backwards into the future." When computers were big, slow, expensive machines, they were shared by many remote users who all used comparatively dumb terminals. The height of that technology was the IBM 3270 terminal, which downloaded a form description from the mainframe, displayed it and let the user fill out the fields, and then uploaded it when done (sound familiar to anyone?).

    The fact of the matter is that businesses are not democracies, and the democratization of computing in them has caused no end of administrative grief for those responsible for keeping the business infrastructure running. The big difference between the old mainframe and the current "cloud" is that the internet has allowed the administration of that big machine to be totally off-loaded to an external service provider. Whether that is a good thing for companies, I think, still has to be proven. It simplifies life at the cost of dependence on external providers and less control over security.

    Such remote computing works well for business applications because their latency requirements are minimal and their bandwidth requirements are tolerable. They also frankly require only a fraction of the power of a modern host, making host-sharing economically advantageous once again.

    This is not true, however, for all kinds of applications. Media production and game playing are two examples of application spaces where much greater CPU power is required, acting on much bigger data sets with much tighter latency requirements.

    The rub here is that game playing has driven the purchase of personal computing since the days of DOS and Flight Simulator 1. While someone may be willing to work with a "work only" device at the office, it is doubtful such devices will gain much traction in the rest of people's lives.
    jeffpk
  • The source of the ideas is . . .

    Arthur C. Clarke wrote 2001: A Space Odyssey.
    Philip K. Dick wrote Blade Runner.
    Kubrick and Scott are very talented visualizers, whereas Clarke and Dick were arguably visionaries.
    Credit where credit is due.
    stantonreg
  • Engagedots CRM

    There is always demand for cloud computing. It has reached its peak.
    Engagedots CRM
  • why still the "screen"?

    I don't really understand why you believe people will still need a personal "screen." Why not smart glass everywhere and the internet of things, where all I carry around is something like a memory stick to plug in anywhere I want to interface? This scenario seems just as likely.
    jhuik
    • Why not a transplant in your brain!

      Mentally turn it on and off and be able to see everything in your head like a heads-up display, i.e. overlaying your normal vision or replacing it, depending on the situation!

      Where will it end?
      martin_js