So it's April Fools' Day and we here at ZDNet are categorically prohibited from making stuff up to fuel the inane echo chamber that is the Internet on April 1.
We've also been cautioned to be careful about what we cover today and into next week, because there's an entire subculture (often made up of temporarily sanity-free denizens of the companies we cover) that seems to derive joy from putting out completely bullcrap stories and hoping some gullible reporter or blogger will pick it up as sooth said.
This, therefore, Ladies and Gentlemen, is not an April Fools' story. It is not a made-up thing designed to give its author a surge of dark side-inspired joy whilst simultaneously annoying and/or alienating the readers who've seen this whole thing before, and know that The Onion does it better.
No, Dear Reader. This is truth. It's just kind of foolish truth. And yes, my editors approved of this story's vaguely April Fools'-related theme before I wrote it. I checked, just to make sure.
Let's step back for a moment and consider our modern technology, in the form of our smartphones and animated zombie Pringles cans. How much innovation, investment, effort, and blood, sweat, and tears did it take for mankind to get from the sundial to the clock on the iPhone 6s Plus?
The cumulative effort is almost impossible to calculate, nay, to even imagine. But we can start to get an idea of its scope by zooming into tiny facets of where we are today.
On our own 60 Minutes, Charlie Rose reports that it takes Apple 800 engineers just to design the next-generation iPhone camera.
Here's another metric. Back before the turn of the decade, I looked at what went into that Stone Age classic, the iPhone 3GS. Yes, I'm breaking out the iPhone 3GS because I know exactly where to dig that data up quickly.
- Toshiba provided the 16GB flash memory module
- Samsung provided the application processor and the SDRAM module
- Infineon provided the telephony chips, the GPS receiver, and the power integrated circuit
- Broadcom provided the wireless networking chips
- Numonyx provided the memory multichip pack
- Murata provided the finite element array
- Dialog also provided another power circuit
- Cirrus Logic provided the audio codec
We're talking about companies from Japan, Germany, Korea, Switzerland, and even a small, prehistoric, undiscovered bastion of manufacturing workers that still, somehow, survives in America. That's not counting the millions of Chinese workers prevented from committing suicide by nets surrounding their living quarters.
And that was just for the iPhone 3GS. We're about six generations beyond that now. The parts and nation counts have continued to increase.
Adding it all up, we have all these engineers (cumulatively, probably hundreds of thousands more than the mere 800 working on the iPhone camera), millions of workers, countries all across the world, and the end result of innovation since mankind invented fire, and therefore paved the road to Toaster Strudel.
We haven't even talked about some of the amazing technology inside these devices. No, I'm not talking about 4K video or accelerometers, or even WiFi, 4G, or Bluetooth. No, I'm talking about intelligent assistants. Siri and Alexa and Cortana and the let's-just-acknowledge-the-thing-is-a-robot-and-not-give-it-a-name OK Google.
You can ask Siri to tell you the weather. You can ask Alexa not only to add numbers, but to tell you the days between dates and when the next primary election will be. You can even ask OK Google to guide you to the nearest Wienerschnitzel (for me, it's about a 2-day ride).
So we're not just talking about the mind-bottling (thank you, Will Ferrell) amount of work it took to create all that hardware, we're also talking about mapping the world, launching weather and global positioning satellites, and understanding the spoken word.
Do you realize that even the lowly clock took millennia to evolve? We couldn't even all agree on having the same time across different towns until it became necessary to prevent railroad trains from crashing into each other.
I'm talking about all of this incredible human innovation because at the end, as the ultimate end result of it all, we have -- drum roll, please -- me.
You see, I am what you might call a "night person." I am a computer scientist by formal education, and a geek by -- well, I don't really know, it certainly didn't come from either side of my family. They're all muggles, normal, with maybe one screen in their living room. Not counting laptops, tablets, or phones, I have ten.
My people, and by my people, I'm clearly not talking about my genetic stock, but rather my tribal affiliation -- programmers -- burst into flame in the sunlight. We cower away from windows (or Windows) during the day, and we code at night, fueled by a potent mix of caffeine, chocolate, pizza, and, of course, toaster strudels.
Unfortunately, I'm also an educator. I have a graduate degree in education, specializing in learning and technology. This means that I sometimes have to interact with the real world during human hours. No, I'm not quite sure how that happened either. I think I once thought I could educate by simply answering questions on online forums at 4am, but, you know, there are these pesky things called "work hours".
Once you start dabbling in the evil that is work hours, you discover something quite disturbing: waking up at a specific time. What happens to programmers as they grow old (like over 30) is they wind up having to do things besides just cutting code. They have to be in meetings. They have to manage people. They sometimes even have to talk to people. And some of this happens in the morning.
No, I'm not talking about morning like you or I might know, that 4am or 5am time that's so quiet and productive, when you transition from Dirty Dozen Brass Band to Cusco. No, I'm talking about that horrid time of the day, muggle morning, around 8, 9, or 10am. Before noon. That kind of morning.
To wake up on time in muggle world, in muggle morning, requires technology. Lots and lots and lots... and lots of technology. Plus, of course, coffee.
And here's where we get to the wonder and foolishness of it all. If I have to wake up on time for a morning meeting, I don't just set an alarm clock. I set Siri. And Cortana. And Ok Google. And Alexa.
I ask each of them to wake me up, usually about five minutes apart. I then often set secondary or tertiary alarms in each of them, so if I tell Alexa to stop, she obeys, but then hits me with another fresh blast of hell 15 minutes later.
If I absolutely, positively have to be at a critical meeting, I start the alarms about two hours early and it can take 8 to 10 of them going off to get me up, plus an hour or so to shower, inject caffeine into my veins, suit up, show up, and remember my name.
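For the truly committed night person, that staggered-alarm scheme practically begs to be scripted. Here's a minimal Python sketch of the schedule described above (the 10am meeting time is a hypothetical, as is rotating through the four assistants in that order): first alarms two hours early, five minutes apart, with each assistant delivering a fresh blast of hell 15 minutes after its first attempt.

```python
from datetime import datetime, timedelta

# Hypothetical meeting: a muggle-morning 10:00 AM start.
meeting = datetime(2016, 4, 1, 10, 0)

# Start the alarms about two hours early, five minutes apart,
# rotating through the four assistants. Each assistant also gets
# a follow-up alarm 15 minutes after its first one.
assistants = ["Siri", "Cortana", "OK Google", "Alexa"]
wake_window_start = meeting - timedelta(hours=2)

schedule = []
for i, name in enumerate(assistants):
    first = wake_window_start + timedelta(minutes=5 * i)
    schedule.append((first, name))
    schedule.append((first + timedelta(minutes=15), name + " (round 2)"))

for when, who in sorted(schedule):
    print(when.strftime("%I:%M %p"), who)
```

That works out to eight alarms spread across the wake-up window, which lands squarely in the "8 to 10 of them going off" range it takes to get me vertical.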
It's not an April Fools' story, because it's true. But it's silly enough to qualify for April Fools' while reflective of both the best we have in technology along with the most pedestrian of uses.
It's a fact for a lot of us. When we have meetings that we must be sure to get up for, we set Siri on our iPhones, OK Google on our Android devices, Cortana on our WinPhones, and Alexa on our Echos. Yeah, all of mankind's lofty advancements ganged together in one room, all to replace a $10 alarm clock. Which many of us still own. But never set.
Welcome to April, kids. Enjoy your day. Let us know how you're using your latest and greatest technology to accomplish incredibly mundane or foolish things in the TalkBacks below.