In this age of smartphone zombification, it's hard to believe that there was once a time when most people had never seen or touched a computer. When I grew up in the 1970s, we didn't have a computer. We didn't have any digital devices. We had a rotary phone and a black-and-white, 13-inch Zenith TV (our color 15-inch Sony TV would come in the next decade).
Filmstrips (like the infamous "Duck and cover") were still shown in class on actual film. At home, my Dad had a mechanical calculator, complete with the crank, and an IBM Selectric typewriter.
I learned to type in the required-for-graduation typing class, on a fully mechanical typewriter. Fun fact: I failed typing. If I hadn't skipped my senior year and gone to college instead, I would probably have had to take typing again in order to graduate. I know. Irony.
The first computer I ever encountered was a Digital Equipment PDP-8/e minicomputer. Its main CPU was about the size of a microwave oven. It had a 12-bit word size, and the maximum memory (magnetic core, made of tiny physical ferrite donuts) was 32K 12-bit words. The absolute minimum time it took to execute an instruction was 1.2 microseconds. It was primitive.
The computer room was at the far end of the second floor in the main hall of Fair Lawn High School, in Fair Lawn, N.J. I remember seeing the computer (which the students named "The Wiz" after the Broadway play) and thinking it wasn't all that impressive.
Sitting next to the CPU was an ASR-33 teletype machine. This was kind of like an electric typewriter, except it was connected to the PDP-8. It also had a paper tape reader attached, like a sidecar on a motorcycle.
You stored and loaded programs on paper tape: the teletype's punch perforated holes in the tape to record a program, and its reader read those holes back to load one. To actually get the reader to do that, you had to power up the machine, toggle in the boot sequence on the front panel, and then flip a switch to tell the machine to execute the loader program.
It was all very primitive.
I'll tell you the moment that changed my life. I must have been 13 or 14 years old. Sometime earlier, I had transitioned from wanting to be a cowboy to wanting to be a scientist. I was influenced by the space program and the moon landing, and thought I wanted to go to "science school," whatever that was. But computers just didn't seem all that interesting.
In any case, there I was, sitting in front of the teletype, and my teacher told me to hit the return key. All of a sudden, the teletype came to life.
Bang! R. Bang! E. Bang! A. Bang! D. Bang! Y.
The machine had typed out READY, a greeting many BASIC interpreters of the day presented to users.
I had never encountered interactivity in inanimate objects before. Something clicked in my edge-of-pubescent brain. This was interesting. I was hooked.
Sometime later (my memories of my teen years have been, blessedly, mostly excised by the passage of time), I learned the basics of programming. The real basics.
Our teacher brought the limitations of a computer's interpretation of code to life. He would stand at the back of the room and ask a student to guide him to the front. If the student didn't instruct the teacher to walk around a chair or a desk, our teacher would gamely pratfall over the object, to the great amusement (and eventual understanding) of his class.
A year or so later, I went off to college. We had a PDP-10, which used DECwriter dot-matrix terminals. We also ran punch-card jobs. I honestly can't recall if those ran through the DEC 10, or we had another beast that ran batch jobs.
In any case, there were a limited number of terminals for an entire engineering college. While I quickly honed my ability to stay up through the night to gain access to the terminals in the wee hours, I wanted my own computer.
By this time, the summer of 1979, Apple had introduced the Apple II. There was no way I could afford such a thing. I wanted to build my own. If you were going to build your own in 1979, it was going to be an S-100 bus machine. I decided to build an Altair 8800.
Some of you might recall that the Altair 8800 is considered to be the first mainstream hobby computer available to the masses. It was the computer that inspired two kids, Bill Gates and Paul Allen, to drop everything in order to write a BASIC interpreter for it.
The Altair 8800 had made its debut in December 1974, so by 1979, it was almost vintage. I was a very, very broke freshman on summer break. When I found out the Trenton Computer Festival was going to be held, I borrowed my Dad's car and drove down to Trenton (about two hours from my childhood home).
I had scrounged a few bucks and borrowed a few more from my parents. I didn't have nearly enough to buy any kind of computer, but it was a start. Somehow -- I'd never done any sort of sales before, other than selling candy for the Boy Scouts -- I manifested my inner horse trader.
I overheard someone saying he needed a (something, I can't remember now). I then barreled through the show to see if I could find one for sale, went back to the fellow I'd overheard, and made some sort of deal.
That day is a blur to me, but by the end of it, in the hot sun, I had scrounged or horse-traded enough basic components to build a very basic Altair. I had a front panel, a bus, a CPU, and a very few 2102 chips on a card, with a staggering 1K of RAM storage.
None of these parts worked together. I mounted the motherboard onto a piece of plywood and over the next month, very carefully soldered hundreds of wires from the front panel to the mobo. Obviously, back then, we didn't have the internet, so I had only a few magazines as a guide.
My parents lived close to New York City, and my Dad commuted to Manhattan every day. I went in with him one day and took buses and subways all around the city until I eventually got ahold of a cassette interface. With that, I could store and load a very basic operating system.
To load that OS, you first had to toggle, bit-by-bit, instructions into the front panel. You needed code that would write data to the tape, then you needed to tell it to write a boot loader to the tape, and once you had that code in the machine, you could load other code.
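For readers who have never seen a front panel: every word of that bootstrap code had to be deposited into memory one switch setting at a time. Here's a rough sketch, in Python, of what that deposit loop amounted to on an Altair-style panel. The byte values are placeholders for illustration, not real 8080 opcodes, and the class is a simplified model, not an emulator.

```python
# Illustrative model of hand-loading code through an Altair-style front
# panel: EXAMINE sets the address, DEPOSIT stores the data switches at
# that address, DEPOSIT NEXT advances the address and stores again.

class FrontPanel:
    def __init__(self, memory_size=1024):  # 1K of RAM, like a 2102 card
        self.memory = [0] * memory_size
        self.address = 0

    def examine(self, address):
        """EXAMINE: point the address counter at a memory location."""
        self.address = address
        return self.memory[address]

    def deposit(self, word):
        """DEPOSIT: store the data-switch value at the current address."""
        self.memory[self.address] = word & 0xFF  # 8-bit data bus

    def deposit_next(self, word):
        """DEPOSIT NEXT: advance the address counter, then store."""
        self.address += 1
        self.deposit(word)

# Toggling in a (fictional) three-byte boot loader starting at address 0:
panel = FrontPanel()
panel.examine(0)
panel.deposit(0x3E)        # first byte: flip 8 data switches, press DEPOSIT
panel.deposit_next(0x01)   # second byte: flip switches, press DEPOSIT NEXT
panel.deposit_next(0xD3)   # third byte: and so on, word after word

print(panel.memory[:3])    # [62, 1, 211]
```

Now imagine doing that for dozens of bytes, with no way to verify your work except reading the lights back one address at a time, and you have a sense of why losing the machine's power state was so painful.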
This was in the days before we had UPS backup batteries. I spent days toggling in that code. By Friday night, I was almost ready to try my first save. I told my parents about it, and told them how important it was to keep power to the computer, uninterrupted, until I finished the whole process. They seemed to understand.
To this day, I don't know why my Dad chose to turn off the circuit-breakers to part of the house that Saturday morning. I lost all that incredibly tedious work and had to start it all over again. By that summer, I got the computer to the point where it would load an actual OS off of the cassette.
I brought the machine back to school. I eventually got an 8-inch floppy drive for the thing. That allowed me to load CP/M, the 8-bit OS widely considered to be the precursor to MS-DOS.
That computer, the one I hand-wired and hand-built, was the machine I did my thesis project on. I found CP/M's environment limiting, so I got ahold of a reverse-engineered copy of the source code (this was in the days before the DMCA). I replaced the command-line interpreter with my own code, driven by a very early scripting language that had early AI and self-modifying components. It was a very sweet hack.
So, there you go. The rest is history. I won the Sigma Xi Research Award in Engineering for the language I built on top of CP/M on that scrounged machine. And, from then on, it was computer after computer, OS after OS, language after language, until now... when we're at a point where even my bed and light bulbs regularly demand updates. Without those early computers in my life, and, in particular, the excellent and motivating influence of my teachers, I probably wouldn't be here, doing what I'm doing.
Sometimes, I purposely go back in time to watch period TV shows like Father Brown, Miss Fisher's Murder Mysteries, and Downton Abbey. For a little while, I can immerse myself in a world where there are no home electronics, at least until I once again obsessively check my phone, which is often plugged into the wired USB charger built into my La-Z-Boy recliner.
You can follow my day-to-day project updates on social media. Be sure to follow me on Twitter at @DavidGewirtz, on Facebook at Facebook.com/DavidGewirtz, and on YouTube at YouTube.com/DavidGewirtzTV.