
50 years after Basic, most users still can't or won't program anything

When Dartmouth College launched the Basic language 50 years ago, it enabled ordinary users to write code. Millions did. But we've gone backwards since then, and most users now seem unable or unwilling to create so much as a simple macro.
Written by Jack Schofield, Contributor

People who got their first taste of IT during the microcomputer boom in the 1970s and 1980s almost certainly started by writing programs in Basic — or, at least, they debugged programs typed in from popular magazines. If you needed to do a complex calculation, and weren't lucky enough to own a copy of Software Arts' pioneering VisiCalc spreadsheet, then you could do it in a few lines of code. You couldn't download an app — most people didn't even have modems — and the web hadn't been invented, so you couldn't use Google or Wolfram Alpha.

When the IBM PC arrived in 1981, the culture of writing simple Basic programs extended to writing simple DOS batch files, to macros in Lotus 1-2-3, and eventually to VBA (Visual Basic for Applications) in Microsoft Excel. I expect some of these macros are still in use, though you probably wish they weren't.

This type of ad hoc programming was the design goal of Dartmouth Basic (Beginner's All-purpose Symbolic Instruction Code), and its greatest success. Professors John Kemeny and Thomas Kurtz created Basic (and a time-sharing system) so that students could use the university computer without having to be trained programmers. And it worked. In what was basically a rural, liberal arts-based Ivy League university, most students used the computer, and most of their programs worked.

The first Basic programs were officially run 50 years ago at 4am on 1 May 1964, when Kemeny and a student programmer typed RUN and started their Basic programs at the same time, to demonstrate both Basic and time-sharing. Dartmouth is celebrating Basic's 50th birthday today (Wednesday, April 30) with several events, including a panel discussion on the future of computing that will be webcast live on Dartmouth’s YouTube channel.

Dartmouth students did their Basic programming on Teletype Model 33 terminals, as used later by Bill Gates and Paul Allen. Photo credit: Wikipedia

Attendees will also be able to try an old Model 33 Teletype terminal connected to a Basic emulator designed by Kurtz. (Kemeny died in 1992.) In those days, terminals were more like giant typewriters, and printed your programs on paper rolls. This was a huge advance on 80-column punch cards or even punched paper tape. Later, of course, the Teletype's paper was replaced by a 40- or 80-column character-based screen.

Basic transformed home computing because you could type in what you wanted and get an instant result. If your program didn't work, you could simply retype the offending line. If you used the same line number, the new version replaced the old one. This flexibility made Basic a poor language for real programming, as Edsger Dijkstra (*) and others complained, but that really wasn't what Basic was for.
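
To give a flavour of how immediate this was, here is a minimal sketch of a session in a microcomputer-style Basic (prompts and messages varied from machine to machine). Typing a line that begins with an existing line number replaces the old line, and RUN shows the result straight away:

    10 PRINT "HELLO, WROLD"
    RUN
    HELLO, WROLD
    10 PRINT "HELLO, WORLD"
    RUN
    HELLO, WORLD
    LIST
    10 PRINT "HELLO, WORLD"

No compiler, no separate editor, no project files: that edit-and-run loop was the whole development environment.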

Basic enjoyed its greatest popularity between 1975 and 1990, thanks to the microcomputer revolution that started with the MITS Altair and Microsoft Basic. This was followed by the Apple II, Commodore PET, Tandy TRS-80 and then the IBM PC. Basic really became ubiquitous with the arrival of cheap home computers such as the Sinclair Spectrum and Commodore 64, and, in the UK, the Acorn BBC Micro. It started to fade away after a usable version of Microsoft Windows appeared in 1990, though the Apple Macintosh had already dropped Basic in 1984.

The new world of graphical user interfaces encouraged point-and-click computing. GUIs didn't start users at a command line where they could type things and see the results.

Programmers who started with Basic generally moved on to languages such as Pascal or C, or Perl or PHP, and then perhaps Java. (Or to Microsoft Visual Basic, which isn't really Basic.) That's fine. It's the ordinary users who have lost out. Without the experience of coding in Basic, they seem unwilling to write small bits of code to solve simple problems.

In my experience, this even applies when no coding is required. You can, for example, perform a lot of simple word-processing tasks by recording Word macros and saving them for re-use. I've tried showing people how easy this is, but it doesn't seem to "take". Apparently they'd rather spend hours editing texts manually to remove unwanted spaces, line endings and so on. The same thing goes for actions that can easily be automated using the Automator program built into Mac OS X, or the free AutoHotkey for Windows. On the internet, how many ordinary users exploit IFTTT?
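
The Word case needs remarkably little: you can record the find-and-replace steps once and the macro recorder writes the VBA for you. The short macro below is a hand-written sketch of that kind of clean-up, not recorder output; the name and the two replacement steps are my own illustrative choices. It replaces manual line breaks with spaces and collapses runs of spaces in the active document:

    Sub CleanUpText()
        ' Illustrative clean-up macro: the name and the two steps below
        ' are assumptions for the sake of the example.
        With ActiveDocument.Content.Find
            .ClearFormatting
            .Replacement.ClearFormatting
            ' ^l is Word's find-and-replace code for a manual line break
            .Execute FindText:="^l", ReplaceWith:=" ", Replace:=wdReplaceAll
            ' Wildcard search: two or more spaces become a single space
            .Execute FindText:=" {2,}", ReplaceWith:=" ", _
                     MatchWildcards:=True, Replace:=wdReplaceAll
        End With
    End Sub

Bind something like that to a keyboard shortcut and the hours of manual tidying disappear, which is rather the point.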

Maybe your experience has been different — I hope so — but you can let me know below.

Coding has recently become fashionable again, with initiatives such as Codecademy, Code Year and the Year of Code. As Jeff Atwood has pointed out at Coding Horror, this is not necessarily a good thing (Please Don't Learn to Code). I think a basic understanding of how a computer works, and the ability to knock up a quick website, are useful, but not everybody is going to write great business software or make a fortune from world-beating apps.

I'd settle for a rather more modest goal. Today's users have massive amounts of computer power at their disposal, thanks to sales of billions of desktop and laptop PCs, tablets and smartphones. They're all programmable. Users should be able to do just enough programming to make them work the way they want. Is that too much to ask?

* "It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration." — Edsger Dijkstra, 1975

 

 
