50 years after Basic, most users still can't or won't program anything

Summary: When Dartmouth College launched the Basic language 50 years ago, it enabled ordinary users to write code. Millions did. But we've gone backwards since then, and most users now seem unable or unwilling to create so much as a simple macro.

People who got their first taste of IT during the microcomputer boom in the 1970s and 1980s almost certainly started by writing programs in Basic — or, at least, they debugged programs typed in from popular magazines. If you needed to do a complex calculation, and weren't lucky enough to own a copy of Software Arts' pioneering VisiCalc spreadsheet, then you could do it in a few lines of code. You couldn't download an app — most people didn't even have modems — and the web hadn't been invented, so you couldn't use Google or Wolfram Alpha.

When the IBM PC arrived in 1981, the culture of writing simple Basic programs extended to writing simple DOS batch files, to macros in Lotus 1-2-3, and eventually to VBA (Visual Basic for Applications) in Microsoft Excel. I expect some of these macros are still in use, though you probably wish they weren't.

This type of ad hoc programming was the design goal of Dartmouth Basic (Beginner's All-purpose Symbolic Instruction Code), and its greatest success. Professors John Kemeny and Thomas Kurtz created Basic (and a time-sharing system) so that students could use the university computer without having to be trained programmers. And it worked. In what was basically a rural, liberal arts-based Ivy League university, most students used the computer, and most of their programs worked.

The first Basic programs were officially run 50 years ago at 4am on 1 May 1964, when Kemeny and a student programmer typed RUN and started their Basic programs at the same time, to demonstrate both Basic and time-sharing. Dartmouth is celebrating Basic's 50th birthday today (Wednesday, April 30) with several events, including a panel discussion on the future of computing that will be webcast live on Dartmouth’s YouTube channel.

Dartmouth students did their Basic programming on Teletype Model 33 terminals, as used later by Bill Gates and Paul Allen. Photo credit: Wikipedia

Attendees will also be able to try an old Model 33 Teletype terminal connected to a Basic emulator designed by Kurtz. (Kemeny died in 1992.) In those days, terminals were more like giant typewriters, and printed your programs on paper rolls. This was a huge advance on 80-column punch cards or even punched paper tape. Later, of course, the Teletype's paper was replaced by a 40- or 80-column character-based screen.

Basic transformed home computing because you could type in what you wanted and get an instant result. If your program didn't work, you could simply retype the offending line. If you used the same line number, the new version replaced the old one. This flexibility made Basic a poor language for real programming, as Edsger Dijkstra (*) and others complained, but that really wasn't what Basic was for.
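That edit-and-rerun loop looked something like this. The session below is a hypothetical illustration in the style of early interactive Basic, not a transcript from Dartmouth:

```basic
10 LET A = 2
20 PRINT A + 3
30 END
RUN
 5
20 PRINT A + 2
RUN
 4
```

Retyping line 20 silently replaces the earlier version, so the second RUN picks up the corrected statement; in most classic interpreters, typing a bare line number on its own deleted that line.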

Basic enjoyed its greatest popularity between 1975 and 1990, thanks to the microcomputer revolution that started with the MITS Altair and Microsoft Basic. This was followed by the Apple II, Commodore PET, Tandy TRS-80 and then the IBM PC. Basic really became ubiquitous with the arrival of cheap home computers such as the Sinclair Spectrum and Commodore 64, and, in the UK, the Acorn BBC Micro. It started to fade away after a usable version of Microsoft Windows appeared in 1990, though the Apple Macintosh had already dropped Basic in 1984.

The new world of graphical user interfaces encouraged point-and-click computing. GUIs didn't start users at a command line where they could type things and see the results.

Programmers who started with Basic generally moved on to languages such as Pascal or C, or Perl or PHP, and then perhaps Java. (Or to Microsoft Visual Basic, which isn't really Basic.) That's fine. It's the ordinary users who have lost out. Without having had the experience of coding in Basic, they don't seem to be willing to write small bits of code to solve simple problems.

In my experience, this even applies when no coding is required. You can, for example, perform a lot of simple word processing tasks by recording Word macros and saving them for re-use. I've tried showing people how easy this is, but it doesn't seem to "take". Apparently they'd rather spend hours editing texts manually to remove unwanted spaces, line endings and so on. The same goes for actions that can easily be automated using the Automator program built into Mac OS X, or the free AutoHotkey for Windows. On the internet, how many ordinary users exploit IFTTT?
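To the article's point, that kind of cleanup takes only a few lines of code. Here is a minimal sketch in Python (the function name and sample text are invented for illustration; the article itself describes no specific script):

```python
import re

def clean_text(text):
    # Join lines that were hard-wrapped mid-paragraph (single newlines),
    # but keep genuine paragraph breaks (blank lines).
    text = re.sub(r"(?<!\n)\n(?!\n)", " ", text)
    # Collapse runs of spaces and tabs into a single space.
    text = re.sub(r"[ \t]+", " ", text)
    return text.strip()

print(clean_text("Hello,\nworld.   This  line\nwas wrapped."))
# -> Hello, world. This line was wrapped.
```

A dozen lines like these replace the hours of manual deletion described above — exactly the scale of "just enough programming" the article is arguing for.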

Maybe your experience has been different — I hope so — but you can let me know below.

Coding has recently become fashionable again, with initiatives such as Codecademy, Code Year and the Year of Code. As Jeff Atwood has pointed out at Coding Horror, this is not necessarily a good thing (Please Don't Learn to Code). I think a basic understanding of how computers work, and the ability to knock up a quick website, are useful, but not everybody is going to write great business software or make a fortune from world-beating apps.

I'd settle for a rather more modest goal. Today's users have massive amounts of computer power at their disposal, thanks to sales of billions of desktop and laptop PCs, tablets and smartphones. They're all programmable. Users should be able to do just enough programming to make them work the way they want. Is that too much to ask?

* "It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration." — Edsger Dijkstra, 1975

Topics: Software Development, Education

About Jack Schofield

Jack Schofield spent the 1970s editing photography magazines before becoming editor of an early UK computer magazine, Practical Computing. In 1983, he started writing a weekly computer column for the Guardian, and joined the staff to launch the newspaper's weekly computer supplement in 1985. This section launched the Guardian's first website and, in 2001, its first real blog. When the printed section was dropped after 25 years and a couple of reincarnations, he felt it was time for a change....


Talkback

110 comments
  • and, with "apps" to do each thing on a computer

    there will be no one doing much programming, as the very idea of "apps" for a single purpose speaks volumes about laziness, and lack of will.
    chrome_slinky@...
    • And do not forget the fact ....

      ... that once upon a time, when TVs were colourless and men smoked in their offices, every car driver or owner knew how the motor vehicle engine worked and was able to fix any engine problem.

      As vehicles became more advanced, ordinary Homo sapiens started focusing on what they could do better and left engine problems to the car mechanics.

      Likewise, computers moved on. Although there is still a chance of using procedural programming and creating some interesting spaghetti code, most of the code we use these days is so advanced that if we made every Homo sapiens write his or her own Excel spreadsheet, calculator and satnav software from scratch, we would end up living in a very underdeveloped world.
      Wonder.man
      • Never done any programming I see.

        If you had, you'd know that all the things to speed programming came from humans, including compilers that optimize code. The computers have only recently been able to do anything like programming basic functionality.
        chrome_slinky@...
        • Programming is a trade.

          It's not for everyone. If you're interested, good for you. If you're a professional and making money out of it, even better. But if you're not, you're not lazy. You're just into other things.

          Besides, AFAIC, not everyone should be a programmer. Even some programmers I know shouldn't be programmers.
          IAmMarty
          • But doesn't hurt for people to dabble

            And if some discover they like programming and are good at it, then so much the better.

            Amateurs are only a threat to professionals that don't deserve to be professionals.
            John L. Ries
      • Technological progress

        I can remember my dad, who passed away 40 years ago today, taking the back off the TV and removing a bunch of vacuum tubes from their sockets. We'd go to a hardware store where he could plug the tubes into a machine one at a time to find out which one(s) had burned out. Buy replacements, go home, and plug them back in, and voila! Your TV works again! That old Zenith black-and-white TV lasted longer than any other electronic device I remember ever having, which is why we didn't have a color TV until some time in the '70s.
        DennisMcCK
    • Don't underestimate the virtue of

      laziness. A lot of the modern conveniences you enjoy today are the result of someone wanting to be lazy.
      baggins_z
      • Hence, the Apple Mac and iPad users...

        ;)
        adornoe@...
      • laziness???

        ...nothing about wanting to do MORE in LESS time?
        fm.usa
  • The "ordinary users" of 50 to 30 years ago

    aren't like the ordinary users of now. Those were all highly technical users who might have found reference counting a bit tough, but would not find procedural scripting difficult.

    Not a lot like today where ordinary users don't even know they're on a computer ("but it's a tablet!")
    Mac_PC_FenceSitter
  • We actually want ordinary people to program...

    ...as it's the easiest way to teach people how computers work. But they won't be working in C/C++, nor should they (they're tricky even for professionals to program in). Free Pascal would probably work better, as would Perl or Python, but what's really needed is a cheap to free, good quality modern BASIC interpreter (we don't need no stinkin' GOSUBs).

    One of the things that drove the decline in programming among ordinary PC users was MS' decision to release Visual BASIC as a commercial product instead of as a standard part of Windows. I understand the economics behind it (and I'm not claiming MS was wrong to do it), but a BASIC interpreter had been standard equipment on personal computers for nearly 20 years (in large part because MS was licensing its BASIC to PC manufacturers) and QBASIC was a DOS toy (not terribly appealing to users accustomed to GUIs).
    John L. Ries
    • Why?

      Once they grasp the concept of pointers, and the need to do a bit more policing of their own code, there is really little else that differentiates C from Pascal.
      chrome_slinky@...
      • Much easier to hang yourself with C

        Indeed, that appeared to be a selling point of C, once upon a time.
        John L. Ries
    • @ John L. Ries

      "....but what's really needed is a cheap to free, good quality modern BASIC interpreter (we don't need no stinkin' GOSUBs)."

      There are many free BASIC interpreters out there. My favorites are Michael Haardt's traditional Bas (runs on Linux) and Decimal BASIC (Windows, Linux). Decimal BASIC is ISO standard and line numbers are optional so you can write complete programs with no GOTO/GOSUB in sight. Even error handling is done with no branching. There's a companion compiler for it that compiles to Lazarus Pascal (hidden from user) and from there to pure machine code.
      toml_12953
  • When you lack mathematics...

    Anything involving a "function" is too complex.

    And macros are a "function" applied to documents.
    jessepollard
    • @ jessepollard

      I have to disagree. Turtle graphics have shown that even very young elementary school kids can grasp the concept of writing a function and using it to write other functions. They have only arithmetic skills, no higher math, but have written some amazing programs. Functions can be presented in a non-mathematical way as a black box: you put something in this side, and it churns out something else on that side.
      toml_12953
      • Anyone can scribble on a wall too...

        But that doesn't make them a commercial artist.

        Without understanding, you get very little in the way of results.

        Without the discipline that mathematics brings, you get a lot of bugs... that can't be fixed...
        jessepollard
        • Programming is one of the ways to learn that discipline

          It doesn't mean that beginners are going to produce anything worth paying for any more than beginning musicians do, but formal training isn't always required, and completion thereof is not always an indicator of one's abilities (which is why auditions are much better indicators of qualification than resumes).
          John L. Ries