In 1968, Apollo 8 was originally designed to perform Lunar Module testing in low Earth orbit, but production failures were found. Instead, given that the Command/Service Module was ready for flight, engineers proposed flying a human around the moon -- and history was made.
In 1969, the Atomic Energy Commission, now the Nuclear Regulatory Commission, granted the first license for manufacturers to sell smoke detectors for home use. You may not realize it, but each smoke detector contains a very small amount of radioactive material, and thus it wasn't until the AEC approved home use that smoke detectors could be used to keep families safe.
According to the National Fire Protection Association, the risk of dying in a home fire is cut in half in homes with working smoke alarms.
Runner up: Apollo 11 moon landing.
The Pocketronic shares billing with the Sanyo ICC-0081 and the Sharp QT-8B for bringing pocket calculators to students, engineers, and scientists the world over. The innovation then, as it remains in today's portable devices, was the ability to produce low-power chips and reliable rechargeable batteries in a form factor that allowed for portability.
The 4004 ran at a tenth of a megahertz, but it was huge in terms of its impact. The 4004 was the first true microprocessor, and like much of modern technology, it didn't begin that way.
Intel (then just another company) was contracted by Japanese calculator maker Busicom to build a chip to reduce the cost of its calculators. Rather than merely a chip set designed for use in one machine, the 4004 turned out to be a very early general-purpose programmable chip, capable of much more than basic math. And the rest, as they say, is history.
Runner up: PhoneMate Model 400 answering machine, the first commercial answering machine.
In 1972, interactive games -- with vastly more primitive graphics -- existed only in laboratories, attached to giant computers. That is, until Pong. Pong was the first commercially successful video game, paving the way for an industry that's now bigger than movies and music combined.
1973 was the year the Xerox Alto, the first cell phone call, TCP, Ethernet, and fiber optics were created. Together, those technologies have informed the world we live in today. That makes 1973 a tough choice.
The year's award could go to the Alto, because it demonstrated the graphical UI we all know so well. But what about cell phones? Smartphones have beaten out the PC and Mac for digital dominance, going well beyond the Alto's UI innovations. TCP, Ethernet, and fiber optics are what make the Internet possible, and without the Internet, where would we be?
Maybe writing this ten years ago the answer would be different. But today, the mobile smartphone is the dominant technology, bar none. And it all started in 1973.
Although barcodes were patented back in 1952, commercial use didn't take place until 1974. That's because the laser technology necessary to do barcode scanning didn't exist until the 1970s.
The big day was June 26, 1974. It was about 8am on a drizzly, foggy Wednesday in Troy, Ohio when Marsh supermarket cashier Sharon Buchanan took a pack of Wrigley's Juicy Fruit Gum from Clyde Dawson and ran it through her newly installed barcode scanner. For the first time anywhere, a barcode had been scanned to determine a product price, and retailing and supply chains the world over would never be the same.
In some ways, the Altair 8800 was just another kit offered to geeky hobbyists. But it changed the world. The Altair 8800 was promoted on the cover of Popular Electronics and became the first commercially successful microprocessor-based computer.
If that were all, though, we might give the award to Betamax. After all, home video is a big business today. But the Altair didn't just usher in the personal computer business, it gave birth to Microsoft. It was Bill Gates and Paul Allen who wrote the original BASIC interpreter for the 8800, and then went off to create Microsoft. You know that story. We all know that story. And the Altair 8800 was the first chapter.
Runner up: Betamax.
For 1976, we're giving the award to a company, not a technology. It could be argued that a business is really code, its DNA driven by the personalities, values, and innovative insights of its founders.
That's certainly the case with Apple, which was founded by the late Steve Jobs, Steve Wozniak, and a rather unfortunate Ronald Wayne, who sold his founder's equity stake in the world's most valuable company for $800.
Runner up: The Apple I, which didn't change the world nearly as much as its company and founders.
Now, it's time for an Apple computer to make our list. The Apple II not only changed the face of personal computing, it ushered in other key transformations, like VisiCalc, the first commercial spreadsheet program.
While computerized mailing lists had existed for years on the Arpanet, it wasn't until 1978 that the precursor of our modern day forums and BBSs was created. It was during the Great Blizzard of 1978 that Ward Christensen and Randy Suess were stuck at home. Christensen had already created the canonical MODEM protocol for file transfer (you may know it today as XMODEM).
Bored, with nothing better to do, Christensen and Suess created CBBS, a dial-in forum system that became the first BBS. There was only one phone line, so each participant had to wait until a previous user hung up to gain access.
The importance of BBSs can both be mocked and celebrated. Without BBSs, we probably wouldn't have as many trolls as we do online. But BBSs were also the first social networks, a way for consumers and interested parties to gather together, magnify their influence, and share information.
It's hard to believe now, but the idea of private, personal music didn't exist before 1979. Either you played albums on your home stereo or blared the music from cassette tapes to the entire neighborhood on your boombox.
But in 1979, when Sony introduced the Walkman, you could finally listen to your choice of music, in private. This reduced family bickering, made it possible for workers to listen to their own music on the job (when appropriate, of course), and gave a lift to the entire music business.
Runner up: McDonald's Happy Meal, because -- behind pizza and Chinese food -- McDonald's fuels American innovation where it counts.
I'm giving the nod for 1980's innovation of the year to a failed project. ENQUIRE was a project developed in 1980 by Tim Berners-Lee while he was at CERN, and, in many ways, can be considered a concept prototype for the Web.
ENQUIRE was a bit more like a cross between HyperCard and a wiki, and required central maintenance. Even so, it was Berners-Lee's first run at the use of hypertext for group communication and information organization. Because of the centralized maintenance required, ENQUIRE wasn't really accessible to other users. The original ENQUIRE software disk has been lost to time.
If it weren't for how totally the web has transformed our world, we wouldn't have given 1980 to ENQUIRE. But even as an early prototype, it planted the seeds from which the web would grow, and that gives it amazing impact.
Runners up: Pac-Man, one of the most popular video games of all time; The Empire Strikes Back, arguably the best of the Star Wars series and the movie that kept the franchise alive; the Microsoft Z-80 SoftCard, a device that brought CP/M business programs to the Apple II; and the Sinclair ZX80, one of the first very-low-cost personal computers.
Which had the most impact on society: Two decades of MS-DOS compatible computing, or the first truly portable personal computer? Go ahead and argue that down in the comments.
Both the original IBM PC and MS-DOS were chosen as a pair, because those two products together created the incredibly vibrant desktop PC market that dominated computing well into the late 1990s -- and then spawned Windows, which dominated until the early 2010s.
Runner up: The Osborne 1. While the Osborne 1 was the first truly portable business computer, it ran CP/M, an older-generation OS compared to MS-DOS. It was also a pain to use, with two under-capacity floppies and constant swapping of disks. The Osborne 1 existed, and for some die-hard purchasers, it was mission critical, but it didn't demonstrate the ultimate utility of notebooks and laptops.
What happens when you introduce a product that's way cheaper and outperforms the market leaders? If you back it with good marketing and a smart production process, it takes the world by storm. That was the story of the Commodore 64, introduced to the world at $595, about a third of the cost of an Apple II at the time, and well less than an IBM PC.
At one point, the Guinness Book of World Records listed the C64 as the best-selling computer of all time. Key to the machine's success was better-than-expected graphics capability and a sound chip that made electronic music production possible for home computer buyers.
Earlier, we spoke about how the pairing of MS-DOS and the IBM PC created a dynasty. But it was two years later, when Lotus 1-2-3 was introduced, that what became known simply as "the PC" became unassailable.
Lotus 1-2-3 was the PC's killer app. It was much faster than VisiCalc, had better graphics and macros, and combined features that previously required users to leave one program, swap floppies, and launch another. VisiCalc was often touted as the reason businesses bought Apple IIs. But when Lotus 1-2-3 arrived, it knocked VisiCalc off its business-use pedestal, and the Apple II along with it.
Floppies were a pain, and when IBM introduced the PC XT, with a built-in hard drive, some of that pain went away. As you might imagine with an IBM machine, there were a lot of configuration options.
That was a huge expense for business, but the combination of Lotus 1-2-3 and the hard drive-based XT was so compelling, businesses by the thousands bought both. That's a killer app.
It was, for many years, a failed product. Built under a cloud of strife and abuse under Steve Jobs' sharp tongue and biting ridicule, the Macintosh held but a fraction of the exploding PC market.
But it changed everything. It took more than a decade, but the dominant computing UI, which we still use at work to this day, was the windows-and-mouse model created by Xerox and pioneered by Apple. Users of the 1984 Macintosh would recognize and be able to use the 2018 Macintosh, for the basics defined as far back as 1984 are still in use today.
Further, the UI pioneering and Jobs' brutal attention to detail gave birth to the modern smartphone, and that, too, changed everything.
It was the year of PageMaker and the laser printer. It was the year of 2400 baud modems and the Amiga 1000. It was the year of the Sony Discman and the first CDs. All these products had their impact on the future. But if there's one product we all identify with, and to this day seek to add to our retro collections, it's the Nintendo Entertainment System (NES).
The NES rescued the videogame industry. It introduced us to Mario. It brought console gaming back into our homes for good.
In 1986, the space shuttle Challenger exploded. Voyager 2 made it to the planet Uranus. The Soviet Union (still a thing back then) launched the Mir space station. All of these were technological milestones. But did they change us?
Not nearly as much as LISTSERV, the first automated email list management system. LISTSERV allowed bulk email sends, allowed users to subscribe and unsubscribe, and vastly extended the reach of conversations online. Eric Thomas took the original LISTSERV concept and automated its functions, thereby giving legs to the early dial-up forum concept of the BBS. LISTSERV made the world just a little bit smaller and brought us all a little bit closer.
It's hard to overstate the level of buzz Apple's HyperCard caused when it was first announced and demonstrated. For the first time, ordinary users were able to create astonishingly deep graphics-based applications. I started my first company around HyperCard, and for a time, before a new set of Apple managers forgot why it was created and relegated it to the tomb that was Claris, the power of user-created content took off.
HyperCard, though, was the seed for so much. It was the seed for the first wiki and, eventually, a more complete web prototype from Tim Berners-Lee. It was the seed for deeper multimedia apps on CD-ROM, and HyperCard stacks were the forerunners of today's smartphone apps. HyperCard ultimately failed, hung out to dry by an Apple that didn't then value user-created content. Even though HyperCard lived a life cut short, it changed the world.
Photoshop was not the first image manipulation tool, but it was the first to have the right combination of capabilities, extensibility, and marketing push. There are so many things Photoshop does and has made possible over the years that we could devote an entire series to it.
Photoshop, though, is our winner for 1988, because even now, 30 years after its introduction, most graphics professionals -- including your writer -- could not imagine a workflow that does not include Photoshop.
Tim Berners-Lee invented the World Wide Web in 1989. Even so, we're giving this year's nod to the launch of the first GPS satellite, and saving the web for 1990, when the first web browser was created.
GPS is transformative, impacting millions of people's lives every day. When my wife and I evacuated ahead of Hurricane Irma, we didn't turn to paper maps. Instead, we turned on our GPS and safely followed its comforting instructions from Florida all the way to Oregon. GPS keeps people on track, helps manage and track goods and services, and gets us all home safely. It's hard to imagine a time when we didn't have eyes in the sky, guiding us all.
Of all the technologies that changed our lives, perhaps the most profound of the last 50 years has been the web. But it wasn't the ability to hyperlink documents that made the most impact. Instead, it was the application that presented all that information to users, the browser.
The browser, in combination with the various web protocols, allowed access to the web from a wide variety of operating systems and devices. It allowed untrained users to click and browse from website to website. But even before there were public websites, there needed to be a browser.
That browser was initially called WorldWideWeb. Its name was later changed to Nexus to avoid confusion with the entity we now call the web, but which back then was the World Wide Web, or WWW. The web changed the world, but it was the browser that delivered those changes worldwide.
Runner up: Windows 3.0.
We're now into the 1990s and technology change is accelerating. The first website went online at CERN. In fact, so much happened that we have a few articles devoted to 1991 alone. But of all the innovations, of all the products launched, one stands out: Linux.
But it was the message sent out on August 25, 1991 to the Minix Usenet newsgroup that changed everything. Linus Torvalds typed, "I'm doing a (free) operating system (just a hobby, won't be big and professional..." Ah, Linus. You got so much right, but you got the scale of Linux's eventual impact so very wrong.
Linux took UNIX and blasted it out of existence. Instead of a very expensive-to-license operating system, Linux was free. It fired up open source. And today, Linux runs in everything, from light bulbs to cars, to almost all TVs and phones on the market.
Who would have thought that people would prefer typing over talking on their phones? While the SMS concept had existed for quite some time, it wasn't until December 3, 1992 that engineer Neil Papworth sent a message to Richard Jarvis' Vodafone Orbitel 901 handset. The message that precipitated billions of very sore thumbs was a simple "MERRY CHRISTMAS".
At the top of its usage curve, US cell phone customers sent 2.3 trillion SMS messages. But as this chart from Statista shows, SMS volume has been going down steadily as users migrate to app-based messaging from Apple, WhatsApp, and Facebook. Even so, SMS changed how we talk -- or rather, don't talk -- to each other.
By 1993, things were heating up for the World Wide Web, which was quickly becoming actually worldwide. While Mosaic wasn't the first browser, it was the first that could display inline images. For the time, it was very fast, and it quickly became popular.
Mosaic was created by Marc Andreessen and Eric Bina, grad students at the National Center for Supercomputing Applications (NCSA) at the University of Illinois Urbana-Champaign. Mosaic's team went on to create Netscape, which dominated the web (for a while, at least).
Runners up: Windows NT, Myst, DOOM, plus the first webcam to improve caffeine intake efficiency (technically, the Internet of Things was born here, as well, and, as it should be, it was all because of coffee).
At the time of its founding back in 1994, no one could have known that Amazon would become one of the world's most innovative companies. Then, it was a source for books.
Today, it's at the core of the cloud movement, has played a primary role in killing off retail (or at least beating retailers who weren't on their best game), has revolutionized digital books, transformed product availability and delivery, created an AI that lives in our homes, and has become a prime producer of top-tier original video content.
By 1995, Windows had been around for a full decade. But it was in 1995 that what became the dominant desktop environment for the next two decades would be introduced. While a new Windows 10 user or Mac OS user might not know how to use Windows 3.1 on sight, every modern desktop computing user would know how to use Windows 95.
Windows 95 was the first version of Windows to include IE, which would become the dominant browser for more than a decade. While network configuration in Windows 95 was still uncomfortable, with Windows 95, Microsoft finally had the foundation for what would become the modern desktop experience.
At the time, it was hard to believe a modem company would introduce the first successful handheld PDA. Now, of course, with handheld smartphones dominant, it's impossible to separate communications from personal devices.
1996 also gave birth to USB and CSS. These have had their impact on technology, but it was the small, portable, relatively inexpensive Pilot handheld that replaced personal organizers and was the first device since the watch that came with us everywhere.
A lot went on in 1997, but the single biggest event, arguably the one that changed all of technology, was the return of Steve Jobs to Apple.
You have to remember that in 1997, Apple was dying. It was always described as "the beleaguered Apple Computer" or "the troubled Apple Computer." No one would have expected Apple to utterly transform music and telephones, not to mention lead the digital mobile transformation we're experiencing now.
One more thing: It could be argued that other companies would have created mobile devices, but it was the force of Jobs' personality and his steadfastness of purpose that overcame the impenetrable blockades and old style of business practiced by mobile operators. Sure, we would have had smartphones. But smartphones would not be what they are, the dominant technology worldwide.
If you're not sure about the impact of Google on modern times, Google it. For the early years of the web, search engine wars dominated the news. Then came the Google algorithm, famous for surfacing much more relevant information.
Somehow, a page that was simple and barebones eclipsed all other advertising, determined what was relevant to... everything, and became the dominant information verb in our lives. Founded with the motto "Don't be evil," it's not at all clear whether Google will be our constant assistant and friend, or our ultimate undoing.
Apple has a habit of taking existing technologies and molding them into something irresistible to consumers. Along the way, Apple has often set the pace, effectively giving other companies "permission" to enter similar markets.
While neither the 1999 Airport Wi-Fi access point nor the easy-to-mock clamshell design of the Apple iBook were barnburners, they showcased one feature that has changed computing. Before the AirPort (and Wi-Fi), computers were always tethered. If you wanted to access a network, you had to plug in. But with the advent of Wi-Fi, we could take our machines anywhere in the home or office, without wires.
The AirPort showed it was possible, and the entire world followed.
It's not hard to see the impact AdWords had on the online advertising industry; one thing is for sure: nothing has been the same since. AdWords took the risk out of advertising -- at least mostly.
Instead of buying an ad for a period of time and paying the fee, advertisers could buy a certain level of performance in terms of click-throughs. But it was also up to the advertiser to properly construct their ads, with better-performing ads rising to the top. This is a huge business. By 2017, Google's ad revenue was nearly $100 billion.
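That "better-performing ads rising to the top" dynamic can be sketched in a few lines. This is a simplified illustration with hypothetical numbers and field names -- the real AdWords auction weighs many more quality signals -- but the core idea is ranking ads by bid multiplied by expected click-through rate, so a well-crafted ad can outrank a bigger budget:

```python
def rank_ads(ads):
    """Order ads by bid * expected click-through rate, best first.

    A toy model of performance-weighted ranking: spending more helps,
    but an ad that users actually click can beat a higher bid.
    """
    return sorted(ads, key=lambda ad: ad["bid"] * ad["ctr"], reverse=True)

ads = [
    {"name": "big-spender", "bid": 2.00, "ctr": 0.01},   # score 0.020
    {"name": "well-crafted", "bid": 0.80, "ctr": 0.04},  # score 0.032
]

# The cheaper but better-performing ad wins the top slot.
ranked = rank_ads(ads)
```

Here, "well-crafted" outranks "big-spender" despite bidding less than half as much -- exactly the incentive that pushed advertisers to construct better ads.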
We continue to look at products that laid the foundation for the modern world. Windows XP and OS X (now macOS) 10.0 were both released in 2001, and served as the foundation for our current desktop operating systems.
But it was the iPod that continued the tech world's inexorable move to a mobile-first environment. There had been many MP3 players before the iPod, and, in fact, Apple promoted its own music format. But the iPod was introduced with what was, for then, such shocking capacity that, for the first time, music lovers could carry their entire music collection with them wherever they went.
Tor, based originally on an onion routing project developed for the US Navy, is designed to keep communications secure, even at a level that may surpass VPNs. The idea of an onion router is that there are layers of encryption (like the layers of an onion) that would have to be peeled away to find out a user's identity. Since Tor relays traffic through a series of IP addresses, the destination never learns the originating IP address.
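The layering idea can be sketched in a few lines of toy code. Here, base64 encoding stands in for encryption (real Tor negotiates a separate key with each relay), but the structure is the same: the sender builds the onion from the inside out, and each relay can peel exactly one layer, learning only the next hop:

```python
import base64
import json

def wrap(message, route):
    """Build the onion from the inside out: each layer names the next
    hop and hides everything beneath it in an encoded blob."""
    payload = message
    for hop in reversed(route):
        layer = json.dumps({"next": hop, "data": payload})
        payload = base64.b64encode(layer.encode()).decode()
    return payload

def peel(onion):
    """A relay removes exactly one layer, learning only the next hop --
    never the original sender or the final message."""
    layer = json.loads(base64.b64decode(onion))
    return layer["next"], layer["data"]

onion = wrap("hello", ["relay-a", "relay-b", "exit"])
hop1, rest = peel(onion)   # reveals "relay-a"; the message is still hidden
hop2, rest = peel(rest)    # reveals "relay-b"
hop3, msg = peel(rest)     # only the final peel exposes "hello"
```

No single relay in the chain sees both who sent the message and what it says, which is the whole point of the onion.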
In a world where privacy is becoming ever more difficult to secure, where governments, terrorists, and criminals are actively spying on users everywhere, a tool to protect privacy becomes ever more important. Unfortunately, like many technologies, privacy provided to the innocent can also be used by bad guys. Even so, the non-profit Tor project exists to preserve and protect identities the world over.
Tor, itself, may not have changed the world as much as something like Android did. But Tor enabled the world changers to work safely and freely to change the world, and that's its ultimate contribution.
Most people think of the Android operating system as something Google developed, but that's not the whole story. Android was founded as a company, initially intended to build an operating system for digital cameras. At one point, the company was so close to closing down, it couldn't pay its rent.
That was then. This is now. Today, Android is the most successful (in terms of the number of users) operating system in history. It is, unfortunately, fragmented almost beyond recognition, and suffers from many security concerns and forks. Even so, Android is dominant numerically, and will likely remain so for years.
In addition to Facebook, the company Mark Zuckerberg founded in 2004 as TheFacebook owns Instagram, WhatsApp, and Facebook Messenger. Together, this juggernaut dominates messaging and social media to a degree never before seen.
Not only has Facebook transformed how people connect and communicate, it's also created its own vast walled garden, filled with details about nearly every human on the planet. How it uses that data, how it manipulates that data, and how it protects that data will be a problem for all of us for years to come.
In 2005, it was very difficult and expensive to distribute video. I did some videos for clients and the challenges and costs were enormous. All that changed when YouTube made internet video free for everyone.
According to CMO.com, consumers are 27 times (not percent, times!) more likely to click through a video ad than through a standard banner. That, alone, should rock you back and get your attention.
According to Google (which owns YouTube), more 18- to 49-year-olds watch YouTube video on mobile than watch any broadcast network. Google also says that in 2015, that same demographic group dropped TV watching by 4 percent while increasing YouTube watch time by 74 percent.
What can you say about Twitter in 140 characters? #TurnsOut #YouCanSayALot.
Although Twitter upped its character count to 280 last year, the micro-blogging service created a new way to reach a tremendous number of people, instantly. Perhaps nothing showcases Twitter's power more than Donald Trump's unexpected and improbable rise to President of the United States. By using Twitter, #TheDonald bypassed all the gatekeepers and built his own audience of dedicated fans.
Whether or not you think a direct connection to the brain of a president is a good idea for the republic, @realDonaldTrump disintermediates all the norms of presidential communication, and connects #MAGA fans to their leader.
The iPhone. It was rumored and anticipated for years, but when Steve Jobs finally held it up to show it, it still exceeded everyone's expectations. The thing was, it wasn't just the iPhone that blew the PC, music, landline, and cell phone markets apart. It was the apps, which took another year.
Once Apple introduced the App Store, and created a way for users to get at apps for a few bucks and the touch of a button, the last bit of friction between digital technology and digital technology use was gone -- and the world was forever changed.
The techie in me would like to give this year to Google Chrome, Windows Server 2008, or Hyper-V, because all were impressive, influential products. But the charter for this list is technologies that changed us, and Airbnb is impacting housing, hotels, towns, and cities the world over.
What seemed like a simple sharing-economy way of letting folks rent out rooms in their houses has become a worldwide phenomenon, causing civil governments all over the planet to rethink their approach to zoning and land use. It's not all good, with Airbnb blamed for rising rents and the reduction in the availability of rental properties. Even so, Airbnb gets our nod, because it's like nothing that has come before.
As our list of runners up for 2009 shows, a lot of innovation happened in 2009. But we're giving our nod to the first Fitbit, because it helped kick off the quantified-self movement with a device with no subscription fees and a full week of battery life.
Although Fitbit has a raft of competitors today, most notably the Apple Watch, the idea of gathering data on personal activity to help drive health and fitness has been gaining traction ever since that first Fitbit. With the graying of the population, the increased cost of healthcare, and the need for us all to get healthy, the quantified self may be a way for us to manage our way to better health.
The idea of a handheld, gesture-based tablet computer had been around for years. It wasn't until Apple, a company normally associated with high-ticket items, introduced the iPad that the consumer tablet market took off.
The original iPad came to market at an affordable $499 base price. It was simple, understandable, reliable, and -- for the time -- fast and responsive. Although the tablet market has mostly been consumed by larger form-factor phones and attacked by the no-setup-required Chromebooks, it's clear that the iPad and tablet computing helped break the dominance of the desktop PC, particularly among consumers.
In giving the nod to the Chromebook, two trends have to be considered for 2011: IoT and smart homes, and the breaking of the Microsoft and Windows hegemony. 2011 marked the release of the first Nest Thermostat, along with a lot of other smart home devices. Smart homes are growing as a trend, but they're not yet transformative.
On the other hand, Microsoft and Windows had a stranglehold on computing for more than two decades. The rise of the smartphone changed all that, but so did the Chromebook. Initially considered little more than an amusement because all it ran was the Chrome browser, the Chromebook has taken education by storm. Because of the growth of the cloud, the Chromebook is demonstrating that, really, you can do almost anything you need to do with a powerful browser and no native apps.
Runner up: Nest Learning Thermostat.
Ever since humanity discovered how to make tools, there have been makers. But the ability to add advanced computing power to projects was limited by the cost of entry. The Raspberry Pi changed all that. Here was a $25 device that could run Linux and be at the heart of a vast array of projects.
It's hard to believe it's been five years since the latest console generation was introduced. That said, games for the Xbox One and the PS4 have eclipsed those of previous generations, providing what has become almost a new golden age of video games.
This is also the first game generation to fully embrace 4K TVs and, in the more advanced models, better HDR image quality. Nintendo, which earlier made a big splash with the Wii, would sit out most of this generation after the failure of the Wii U. Nintendo stayed off the field until 2017, when it launched the Switch.
By late 2013, Microsoft was rapidly becoming a has-been in the minds of many users and analysts. Windows 8 was a total failure. Microsoft was late to the smartphone party and Windows Phone was a dismal failure. The acquisition of Nokia was insanely expensive and ultimately fruitless. Microsoft had lost all its luster.
But then came two events: Satya Nadella took over Microsoft from Steve Ballmer on February 4, 2014, and Windows 10 was announced on September 30, 2014. Prior to Nadella, Microsoft only had two leaders, Bill Gates from 1975 to 1999 and Steve Ballmer from 2000 to 2014. The computer industry of 2014 was a very, very different beast from that of 2000, and Ballmer seemed mired in old school thinking.
Since then, Microsoft has been firing on all cylinders. It has opened up with apps on competing devices. It has launched its own line of competitive computers. It has planted its flag in the cloud space with the hugely successful Office 365 and Azure offerings. It has even embraced Linux alongside of Windows. And, finally, Windows 10 is a clear success.
It's important to understand that Alexa is what Siri should have been. Alexa is smart, fast, personable -- and has a huge library of apps, called Skills. Amazon has been smart, allowing other vendors to license and embed the Alexa technology in their products.
As a result, the voice-based personal assistant, which is also the core of a home-based IoT hub, is now a practical aspect of everyday life.
So, that happened. You may have heard Microsoft talking about the HoloLens. You may have seen Apple's keynote, where they talked about the potential for AR (augmented reality). But for millions of people, AR is already here... in the form of a ridiculous computer game/experience: Pokemon Go.
This odd little game, where you chase after animated monsters you view in meatspace through your phone's screen, has been downloaded more than 750 million times as of last year and has generated more than $1.2 billion in revenue. More to the point, however, is that it has exposed a vast range of the technology-using planet to the concept of augmented reality.
Nope, we're not going to give the iPhone X the nod for 2017. Sure, it changed up the iPhone formula a bit, but the jury is still out whether it's a winner or a flop.
Instead, we're going to award 2017 to an unlikely player, Nintendo. Like Microsoft, Nintendo has shown us that it's possible for previous leaders who've lost their mojo to find their way back to the top. The Nintendo Switch is a surprising combination of home console and portable machine, with Nintendo's exceptional game design and the right price.
When Apple announced the Apple Watch Series 4, it not only created a compelling reason to buy into the Apple Watch ecosystem, it created a compelling reason to buy a watch, period. What sets the Series 4 watch apart from its predecessors -- and what makes it the technology that changed us for 2018 -- are the Series 4 health features.
The Apple Watch has long included heart rate monitoring. The Series 4 introduced single-lead EKG, and the fact that Apple got the go-ahead from the FDA to introduce this feature is a first for a consumer device. Another stand-out feature is fall detection, which can be turned on manually and is enabled automatically for older users. For an aging, yet technically astute population, the Apple Watch Series 4 took wearables a big step forward, not only for fans of the quantified self, but for those concerned about their health and wellbeing.
We haven't showcased a lot of technology that's still got enormous potential, but hasn't yet rocked the entire world. Stay tuned for vastly improved drone technology, along with a fight over whether or not drones are intruding on our privacy. Look for VR and AR to take hold, as price, performance, and the ever-present nausea are conquered by developers. AI and intelligent assistants, along with commercial and personal IoT, will be growing at a tremendous pace. Enterprise computing, the cloud, and the distributed office will be a trend that keeps on giving.
But there's also a dark side. Privacy will continue to be assaulted, both by criminals and our own governments. Hackers and identity thieves will be rampant. Social networks will sacrifice our safety for their own reasons, possibly changing the outcome of world governments. And the proliferation of real fake news, scandals around every corner, and politician on politician battles will keep us all cranky and stressed out.
As you have seen, over the last 50 years, technology has empowered us, but it has also come with a price. As we look towards our next 50 years, we need to keep in mind both the benefits of rapidly improving and advancing technology as well as the increasingly troubling behaviors of those producing them, legislating their use, and using them.
Stay tuned to CNET and ZDNet. We'll be covering the world of technological change, every day, and in every way. It'll be a heck of a ride, but we'll be there, right along with you.