
5 ways technology progressed us in 10 years: The story so far

The role of technology infrastructure in our lives and work isn’t so obvious until you remember what a phone call, a home movie, and a drive across the country were like way, way back in 2010.
Written by Scott Fulton III, Contributor

A decade ago at this time, I was the managing editor of another Web publication, which shall remain nameless but which had originally specialized in "beta news," or news about software in the middle stages of its development. It had become a publication of record about software platforms -- a beat that began with operating systems, grew to encompass the Web, and had just then begun to include the emerging concept of the cloud.

I can think of dozens of truly significant stories I've covered in the intervening years, for the myriad of publishers I've served. And, like many others are undeniably doing now, I could arbitrarily choose five, ten, twelve, or some other SEO-significant numeral, as the most important of them all. Yet as a journalist, I've never been an advocate for dispensing history in pre-digested pellets. I loathe enumeration for its own sake. As I've demonstrated for a few years now with ZDNet Scale, what truly matters is a better understanding of where we are in the context of where we've been.

Where we were ten years ago is an inordinately, shockingly, unfathomably different world from the one we inhabit now. No longer are the platforms that support our work residing on our desks. They're in the data centers and hyperscale facilities connected to us by wireless ether and fiber optic cable.

Today, we are facing cultural questions and unresolved societal issues whose pertinence no longer relegates them to the back pages and the "tech sections." A trip ten years back through my assignment sheets may as well be the unearthing of a diary chronicling a geographical expedition to an uncharted continent. What strikes me immediately is how certain technological stories that even I concluded would eventually become historically relevant ended up as foreign to the present dialog as a dissertation on phrenology is to the modern study of human psychology.

I look back at this point in our lives in 2009 -- a year which still, even after ten years, seems like a date on a science fiction calendar -- and I can fathom five ways in which we live and work in a much different place today.

Ask what your server can do for you


A frame from the first 2010 demonstration of the stand-alone iPhone app called Siri, prior to its acquisition by Apple.

Throughout the "double-aught" years, I was often asked -- usually rhetorically, without any expectation of an affirmative or negative response -- didn't I think it grand how much technology had changed our world? It hasn't changed enough, I would respond, before asking my listeners to carefully consider what followed. Then I would bellow into an open space of the room, "Computer, judging from my speech patterns, what is the likelihood that I've caught a cold?"

Folks would take a moment to ponder what it was I intended to demonstrate, besides maybe my unabated appreciation of Star Trek. After a few moments' silence, I would explain that this was my version of the scene in the movie "Oh, God!" where John Denver calls God to the witness stand. Take a moment to think, I said, about all the technologies that would enable all of us to hear an answer to that question -- technologies that have yet to be considered, let alone created. Didn't you expect, just for a fraction of a moment, a disembodied voice to respond?

As 2020 nears, we actually have made significant progress toward a ubiquitous system of interactive information delivery. The servers, of course, are there. Siri, Alexa, and Google's appropriately named "Google" are all vying to respond to my question, with varying degrees of accuracy, although Microsoft's Cortana may have joined OS/2, "Clippy," and "Bob" in the annals of history by next spring.

So much more than voice recognition is required for anyone to build an industry around responses to arbitrarily bellowed questions. IBM deserves enormous credit for re-elevating artificial intelligence in the public consciousness. But as we're still learning, the proper purpose for this AI is to decompose my question into a service request -- one that can be dispatched to an application capable of comparing voice patterns to those it's already learned, to detect evidence that can be linked to infection.

What's happening here is the creation of a kind of semantic ecosystem, one of its most important components being service discovery. Once the translator has interpreted what I want, how does it locate the service that responds to my request, and how does it provide that service with the information it needs to do its job? These are questions that are being addressed one service at a time -- for instance, whenever I ask Waze to steer my car around a potential weather hazard.

Think of how the first circuit-switched telephone networks parsed the audible pulses over a phone line, to determine the correct destination for a call. Now imagine the relative complexity of a service discovery system that parses the correct provider for a lexical request, in an environment where Waze or Android Auto can't make assumptions that such a request is about traffic. This is the scale of the problem being solved now, in leaps and bounds, by the developers of infrastructure platforms and AI.
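
For the technically curious, here is a minimal sketch -- in Python, with every name, intent label, and handler invented purely for illustration -- of the shape of that problem: the assistant classifies an utterance into a service request, then consults a registry to find whichever provider has announced it can fulfill that intent.

    # Hypothetical sketch: routing a spoken request to a registered back-end service.
    from dataclasses import dataclass
    from typing import Callable, Dict

    @dataclass
    class ServiceRequest:
        intent: str       # e.g. "health.symptom_check" or "traffic.reroute"
        payload: dict     # parameters extracted from the utterance

    # A toy service registry: intent name -> handler that can fulfill it.
    REGISTRY: Dict[str, Callable[[ServiceRequest], str]] = {}

    def register(intent: str):
        """Decorator by which a service announces itself as the provider for an intent."""
        def wrap(handler: Callable[[ServiceRequest], str]):
            REGISTRY[intent] = handler
            return handler
        return wrap

    @register("health.symptom_check")
    def symptom_check(req: ServiceRequest) -> str:
        # A real service would compare voice features against patterns it has learned.
        return f"Analyzing speech sample {req.payload['sample_id']} for signs of a cold..."

    @register("traffic.reroute")
    def reroute(req: ServiceRequest) -> str:
        return f"Routing around the hazard near {req.payload['location']}."

    def classify(utterance: str) -> ServiceRequest:
        """Stand-in for the AI that decomposes a question into a service request."""
        if "cold" in utterance or "speech patterns" in utterance:
            return ServiceRequest("health.symptom_check", {"sample_id": "mic-001"})
        return ServiceRequest("traffic.reroute", {"location": "I-70 westbound"})

    def dispatch(utterance: str) -> str:
        req = classify(utterance)
        handler = REGISTRY.get(req.intent)
        return handler(req) if handler else "No service has registered for that request."

    if __name__ == "__main__":
        print(dispatch("Computer, judging from my speech patterns, "
                       "what is the likelihood that I've caught a cold?"))

None of this is any vendor's actual plumbing. It only illustrates why the registry, and the contract between the translator and the provider, are where the hard engineering now lives.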

The workload-centered cloud takes hold

In 2012, I suggested that "the cloud" may become a more appropriate metaphor for providing services than "the Web," if that system could adequately resolve the question of location.  "When Internet users don't have to double-click the blue 'e' to get to their pages or type 'facebook login' into Google to find what they want," I wrote for the old ReadWriteWeb (which also remains nameless today), "any service whose entire viability relies upon Search Engine Optimization as their fountains of revenue will be endangered."


Cloud Native Computing Foundation Director Dan Kohn explains the role of Docker in history, at a 2017 Open Source Summit.

Scott Fulton

I don't bestow the name "revolution" onto an event unless it causes the revolving of something important, and the turning under of old systems like dead crops. The system called Docker was created as a means of automatically deploying workloads to a cloud-based infrastructure platform (originally called "dotCloud") without having to wrap a virtual machine (VM) around each one first. Containerization was a genuine revolution. It refocused our attention away from the configurations of servers and hosts, and toward the requirements of workloads.

Mobile apps can respond to our voice commands so much more precisely and meaningfully now because containerized microservices on the back end are fulfilling the mission of workload distribution -- a mission conceived as early as the 1960s, in the era of timesharing systems, but one that couldn't be achieved in the absence of adequate network bandwidth. MS-DOS, Windows, and Linux attained positions of dominance in their respective marketplaces, but not without brutal market battles that left competitors not only defeated but shamed. Kubernetes has achieved its lofty position in cloud platforms in what has, by comparison, been a cakewalk.
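
To make "workload-centered" concrete, here is a minimal sketch using the official Kubernetes Python client. Everything in it describes what the workload needs -- an image, three replicas, a quarter of a CPU -- and nothing in it names a server. The reachable cluster, the local kubeconfig, and the "demo" and nginx placeholders are assumptions for illustration, not anyone's production configuration.

    # A hedged sketch, not production code: describing a workload declaratively
    # with the official Kubernetes Python client ("pip install kubernetes").
    from kubernetes import client, config

    config.load_kube_config()      # assumes a local kubeconfig pointing at a cluster
    apps = client.AppsV1Api()

    container = client.V1Container(
        name="demo",               # placeholder names throughout
        image="nginx:1.25",
        resources=client.V1ResourceRequirements(
            requests={"cpu": "250m", "memory": "128Mi"},
        ),
    )

    deployment = client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name="demo"),
        spec=client.V1DeploymentSpec(
            replicas=3,            # what the workload needs...
            selector=client.V1LabelSelector(match_labels={"app": "demo"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "demo"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )

    # ...and not a word about which machine will run it; the orchestrator decides.
    apps.create_namespaced_deployment(namespace="default", body=deployment)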

Discs vacate the bargain bin for the museum

A dozen years ago, as a managing editor, I had the privilege of deciding what stories fell under my own "beat." Naturally, I chose topics that I believed would have great historical significance in the years to come, for which my byline would be brought back to mind whenever the topic was discussed.

A Samsung Blu-ray Disc player on display at CES 2011.

The earth-shaking dispute over whether Blu-ray Disc or HD DVD would reign supreme in the video distribution market occupied far too many sleepless nights. I was enthused beyond reason by the mechanisms by which laser beams were steered and bent through optical scanners, and by the resolutions of receiving sensors. If a revision to a format enabled the fenceposts between sectors to be scooted over to make room for 4 or 5 MB more content, I was ecstatic.  (That I remain married is proof that miracles exist.)

Corporate politics was often responsible for determining which format succeeded in a technology market, and the major studios -- Warner Bros., Universal, Disney, 20th Century Fox among them -- were spending billions to fund manufacturers' ability to promote one format over the other. I was in touch with the studios' PR representatives at least as often as I was with Microsoft, Google, and Mozilla.

One day, two of the guest analysts for a series of articles coincidentally offered me the same supposition: If the Internet bandwidth for delivering video straight to consoles should continue to improve (and why wouldn't it?), both Blu-ray and HD DVD could be rendered obsolete by the first streaming service to be trusted by consumers.

They were almost right. The bandwidth was already there; we just didn't realize it yet. What was missing was processing power: The ability for a cloud provider to drive hundreds of thousands of content delivery network (CDN) processes simultaneously. We imagined server farms with tens of thousands of servers processing high-definition video discs on bare metal.


Netflix engineer (now with Amazon AWS) Adrian Cockcroft, speaking at DockerCon 2015.

Scott Fulton

Microservices resolved this issue. In that sense, Netflix engineer Adrian Cockcroft -- now with Amazon -- may yet be remembered as being as responsible for this era of broadcasting as David Sarnoff was for network radio. Cockcroft implemented containerization on Amazon's cloud platform on a scale never before imagined. We talk about missing the forest for the trees; our obsession with discs was a case of all of us, myself very much included, missing an entire solar system for the hydrogen molecules.

The Web relocates to a lower level

"The problem with today's Internet is that it's dumb, boring, and isolated," declared Forrester Research CEO George F. Colony in 2001, in an article entitled, "The Death of the Web is Inevitable, According to Forrester Research."

Most of the Web's life has been spent dead. After an apparent resuscitation shortly after Colony's comment, it died again in 2005, as Bloor Research chronicled in a brief eulogy for ZDNet, citing the poor prospects for the BeOS operating system and network software maker Novell. Then on the eve of the tenth anniversary of the Web's apparent death, Wired Magazine notoriously declared the Web dead.

Human beings have far less object permanence than is generally advertised. Things we lose interest in, such as Taylor Swift, the Affordable Care Act, books (at the hand of e-books), e-books (at the hand of books), and Earth, must no longer be alive, we conclude, because we don't care about them anymore. If it's not top-of-mind, it's six feet under.

The Web, of course, is one application of the Internet, a digital transport mechanism that supports Netflix, Amazon, and according to charts, "others." At the very time of the Web's declared death in 2010, the Web browser had already evolved into the symbol of reconciliation, unity, and friendly competition in the technology and computing industry. For a few brief years, Google, Mozilla, Opera, Apple Safari, and even little old Microsoft pursued excellence in browser performance and efficiency. During that time, they succeeded in re-invigorating interest in the Web as a public resource. Remember the grand old days when the Web was so precious that alterations to it became the object of public protest?

A Web site blackout screen from the January 2012 protests against the Stop Online Piracy Act (SOPA).

The Web's transfer protocol now carries all of the world's cloud-based functionality. To render an API call is to leverage the Web. At about this time ten years ago, Google was among the developers declaring HTTP obsolete. Midway through the decade, essentially the same people made the same declaration. Then last year, that declaration was repeated.
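
To see how literally that statement should be taken, here is a sketch of a "cloud API call" in Python. The endpoint is a made-up placeholder, but the mechanics are the Web's own: an HTTPS request, a status code, a response body -- the same machinery a browser uses to fetch a page.

    # Hypothetical sketch: a "cloud API call" is just an HTTP request over the Web.
    import requests

    # Placeholder endpoint; real services differ only in URL, headers, and payload.
    response = requests.get(
        "https://api.example.com/v1/weather",
        params={"lat": 39.74, "lon": -104.99},
        headers={"Accept": "application/json"},
        timeout=10,
    )
    response.raise_for_status()
    print(response.json())     # The same verbs and status codes a browser uses.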

There's a pattern here, and perhaps you're catching on to it. Every time the Web "dies," it moves further down into the infrastructure of our networks. While an estimated majority of Web users, on mobile devices and PCs, use either Google Chrome or some browser whose engine is based on the open-source Chromium project, that fact doesn't really matter to them very much. In many cases now, it isn't even a choice they made. Many of the apps that folks have downloaded to their smartphones and tablets through app stores are, in effect, client-side browser scripts. They still borrow browser frameworks, and they still execute client-side JavaScript, but people don't think of them as "Chrome" or "Firefox."

"Buried" is not the word that describes the present-day Web.  "Embedded" is. The remnants of the browser have become stronger and more capable than their forebears ever were. And if we no longer perceive the media these new engines utilize as blogs or pages or sites, perhaps that's all for the better.

Open source finds its value proposition

This is the part of the article where my ZDNet colleague, Steven J. Vaughan-Nichols, gets to gloat. He was the Linux guy during the entire time I was waxing poetic about the wondrous benefits of Windows 7. Yes, I know, Steven, you told me so.


Employees of Red Hat and contributors to OpenShift gather in San Diego for an open source community event in November 2019.

Scott Fulton

Since the inception of the computing industry, the open-source ideal has been about sharing the creative process -- the very part that's usually considered the center of intellectual property, and the fountain from which all profit springs forth. Up until Red Hat, however, there wasn't an obvious way to build a long-term business plan around it. Even when Red Hat did build a successful organization around support as a premium option, the only other businesses that could replicate Red Hat's success to any degree were the ones producing their own competing Linux distributions.

To build an economy around a commodity whose core value is declared free at the outset, there needs to be a consumption route that leads through a turnstile of sorts, passage through which is tied to a profit center. Docker was perfectly suited for this new model. At its nucleus was a free assembler of virtualized functionality. The urgent need for such a tool was already a matter of record; its main function had been poorly addressed by its predecessors, if at all; and it prompted organizations to consume cloud-based resources, generating revenue for those resources' hosts.

If only Docker's backers had known the power of what they had soon enough to protect it, and to prepare themselves for the competitive onslaught. Kubernetes, initially a product of Google, usurped Docker's model and kicked it aside. Now, the Kubernetes orchestrator lies at the center of a burgeoning ecosystem that could very well remain a factor in our lives when I write my next decennial retrospective.

The lesson of open source's viability is not that zero is the best price for intellectual property, or that capitalism is somehow on its last legs, like the subject of a Wired Magazine cover. Open source can be a brilliant capitalistic scheme, in limited circumstances. It freely publishes the component people need to become its consumers, and it profits from that consumption rather than from direct compensation for its creation. Had Microsoft thought of it first, it would certainly have patented it.

But anyone who may think that the open-source ideal has been permanently vindicated by IBM's acquisition of Red Hat should ask themselves whether Java was preserved in perpetuity through Sun Microsystems' acquisition by Oracle... already a decade ago. Perhaps we should remember to check back in 2030. By that time, the Web is sure to have died at least twice.
