GUIs: The computing revolution that turned us into cranky idiots

Summary: Change is never easy for users to accept, even if you are introducing new technologies that are supposed to make their lives easier or the computing experience simpler.

June 2013 is almost at an end, and we've seen the passing of two major software development conferences, Apple's WWDC and Microsoft's Build, which between them demonstrated and issued preview releases of three major client/device operating systems: Mac OS X "Mavericks", iOS 7, and Microsoft Windows 8.1.

In all three cases, significant user interface changes have been introduced, and there's been no lack of whining and complaining by certain groups of end users that those changes are for the worse.

In Microsoft's case, the Preview release of Windows 8.1 has generally been heralded as an improvement, since the bulk of the radical changes to the operating system arrived in Windows 8, but there will almost certainly be some people who regard any sort of change as negative.

Since the dawn of personal computing, the cycle of resistance to change has been never-ending. To understand the nature of this resistance, we have to go back to the very beginning.

Xerox Star interface, c.1982.
(Image: Xerox)

The first Graphical User Interface (GUI) to be used in experimental form appeared on Xerox's Alto workstation in 1972, a machine so far ahead of its time that its innovations would not be experienced by end users in the general population until 12 years later, in the form of Apple's original Macintosh computer.

Xerox further developed what it learned from the Alto into the Star workstation, which was commercially released in 1982, a full two years before the first Macintosh shipped (and a year before the ill-fated Lisa).

The Star system was so advanced that it could even network with other workstations over Ethernet, something that would be practically unheard of until a good five years later, on Novell's NetWare 2.x and IBM/Microsoft's LAN Manager platforms. Star was so expensive and so poorly marketed, however, that only a very small number of these systems were actually sold.

If you consider that a typical IBM PC cost about $2,000 and a Star was about $16,000 in 1982 dollars, it's easy to understand why this system failed so miserably in the marketplace.

Despite being a commercial failure, what Xerox developed for the Alto and the Star at its Palo Alto Research Center in the early 1970s has stuck with us for 40 years — the idea of a visual "Desktop" paradigm that uses windows, icons, menus, and pointers, also commonly referred to as "WIMP" by user interface designers.

Until the advent of GUIs, PCs, minicomputers, and mainframe terminals were operated through command line interfaces (CLIs), which required memorizing often complex syntax to get even the simplest tasks done.

In that time period, PC applications themselves used menu-driven interfaces, but were entirely text based, and the use of things like mice and pointers was practically an alien concept to most people.

While CLIs are still heavily used today, and remain a very powerful tool in modern operating systems such as Windows, UNIX, and Linux, they are now primarily the domain of systems professionals, who use them to automate operations through scripting. They are no longer the primary interface end users rely on for routine tasks like moving and deleting files or starting programs.
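
To make that point concrete, here is a minimal sketch of the kind of repetitive housekeeping a systems professional would script rather than click through: a short Python program (my illustration, not anything from the article; the directory names are hypothetical) that sweeps month-old log files into an archive folder.

    # A minimal sketch: archive every *.log file older than 30 days --
    # the kind of routine chore a GUI would require doing by hand,
    # one file at a time.
    import shutil
    import time
    from pathlib import Path

    SOURCE = Path("logs")            # hypothetical source directory
    ARCHIVE = Path("logs/archive")   # hypothetical destination
    CUTOFF = time.time() - 30 * 24 * 3600  # 30 days ago, in seconds

    ARCHIVE.mkdir(parents=True, exist_ok=True)
    for log_file in SOURCE.glob("*.log"):
        if log_file.stat().st_mtime < CUTOFF:
            shutil.move(str(log_file), ARCHIVE / log_file.name)
            print(f"archived {log_file.name}")

Run it once by hand or on a schedule, and the computer, rather than the human, absorbs the repetition.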

Once the Mac made its debut in 1984, GUIs rolled out rapidly across other computing platforms, fueling an explosion in personal computing. Microsoft followed with Windows 1.0 in 1985, which was originally an add-on product for MS-DOS.

It should be noted that the introduction of Windows 1.0 was itself considered a radical change on PCs, given that many users were quite accustomed to the command prompt and text-based applications.

Although the Macintosh's release in 1984 was a landmark for the commercialization of the GUI in personal computers, it wasn't until 1990, with the introduction of Windows 3.0, that the concept of the GUI truly began to take hold with the general population of PC end users.

After Windows 3.0, the use of GUIs exploded in computing. Text-based applications gave way to graphical ones. The need for end users to understand the command line and other arcane aspects of the PC steadily diminished, and with it the basic competency required to operate a personal computer.

The net result? More and more people became comfortable with using computers. And thus, the PC industry exploded.

There were, of course, other attempts to create better GUIs. IBM tried to market OS/2 with its Presentation Manager GUI, which in its second version offered pre-emptive multitasking as well as an object-oriented desktop shell (the Workplace Shell).

Many industry pundits and technologists (myself included) believed that OS/2 2.0 was a superior system to Windows 3.x.

However, IBM did a terrible job of marketing the product, and due to the overwhelming penetration of Windows into the PC marketplace, end users didn't bite on it and developers were slow to write applications for it.

Additionally, the heavy PC resource requirements to run IBM's "superior" OS optimally (16MB of RAM, which cost around $600 in those days) left OS/2 as something of a niche system for highly specialized vertical-market applications. IBM did eventually optimize the software to run well in 4MB of RAM, the standard for Windows PCs at the time, but by then it was too late.

Steve Jobs, after leaving Apple in 1985, went on to create the NeXT computer, released in 1988, which, like OS/2, featured a sophisticated graphical, object-oriented operating system.

The systems were extremely expensive ($6,500) and targeted toward a vertical market (education). The company ceased producing hardware in 1993 to focus on operating system software for x86 and UNIX workstations, which it also failed to market successfully.

Had it not been for Apple's declining health in the mid-1990s, when the company faced imminent financial ruin, stagnating products, and multiple failed attempts at creating a next-generation operating system of its own, forcing it to take desperate measures and "Think Different", NeXT would probably also have been relegated to the dustbin of history.

Without Apple's acquisition of NeXT, Steve Jobs would have been remembered fondly as the guy who co-founded and ran a fledgling computer company in the 1980s, flopped spectacularly with NeXT, and broke into the CGI film market with Pixar.

As we all know, NeXT and its intellectual property, in the form of the OpenStep operating system, the Objective-C development tools, and the WebObjects programming framework, were purchased by Apple in late 1996 and became the basis for Mac OS X as well as iOS.

Again, NeXT's failure to win over large groups of end users, and Apple's own unwillingness to take drastic measures to improve its products until it was almost too late, are prime examples of resistance to change.

Of course, the overall cost of moving to a new platform or environment can be a huge limiting factor, as in the cases of the Alto, the Star, OS/2, and NeXT. But in recent times, cost has rarely factored into the equation, as platforms have become highly competitive in price.

And as we have seen even with free and open-source end-user desktop platforms like Ubuntu Linux, and open-source GUIs like GNOME and KDE, absence of cost does not equate to mass adoption.

The larger any user population becomes, and the more accustomed it grows to doing things a certain way, the harder it is to move it into the next evolutionary phase. This resistance occurs every single time a UI has to be revamped to address fundamental changes in user behavior or overall improvements in the underlying technology platform.

We saw this, for example, with the introduction of the Start Menu in Windows 95, which remained utterly pervasive in Windows until 2012. Mind you, Windows 95 was an extremely successful operating system, but the migration from Program Manager to the new GUI was not without its share of people kicking and screaming along the way.

GUIs have not remained stagnant, of course. With the introduction of mobile computing, smartphones, and tablets, interfaces have had to change to accommodate the underlying platforms, the applications, and new use cases.

There are now hundreds of millions of computing devices in the wild, all using GUIs.

Because the form factors are so different from what we have been traditionally using on the desktop and in laptops, the GUI needs to evolve, particularly as we are seeing the inevitable movement toward platform singularity or the "one device".

We are also moving toward the use of devices that need to be optimized for touch and for lower power consumption, and applications that need to maximize the use of screen real estate and use resources in the cloud.

And, almost certainly, along with the use of these inexpensive, lower-power devices, we are also facing the inevitable deployment of complex applications that run remotely in the cloud, including the most demanding ones traditionally limited to extremely powerful and very expensive workstation hardware.

Because of this, simplification is now the latest trend in UX design across all of the major platforms, and it, too, is attracting its share of resistance and ire.

We live in an interesting time, one in which the previous era in computing intersects with the next, and the most difficult adaptation of end-user computing habits must take place.

The GUI, almost magical in its design, has made computing accessible to all and completely pervasive in our society, but it brings with it a curse: resistance to change, and a complacency that comes from taking ease of use for granted.

After 40 years of personal computing, the end user has become a simpleton, no longer requiring specialized knowledge to operate the system. In effect, he's been transformed into a cranky idiot who becomes angry when even the smallest change that requires learning new habits is introduced.

Resistance to this change is inevitable. But we all have to go through it eventually, this cranky idiot included.

Has the explosion in GUIs over the last 40 years made change and end-user acceptance more and more difficult through each successive generation? Talk back and let me know.

Topics: Software, Apps, Mobile OS, Smartphones, Tablets, PCs

About

Jason Perlow, Sr. Technology Editor at ZDNet, is a technologist with over two decades of experience integrating large heterogeneous multi-vendor computing environments in Fortune 500 companies. Jason is currently a Partner Technology Strategist with Microsoft Corp. His expressed views do not necessarily represent those of his employer.

Talkback

  • 40 years?

    "After 40 years of personal computing, the end-user has become a simpleton, no longer requiring specialized knowledge to operate the system. In effect, he's been transformed into a cranky idiot that becomes angry when even the smallest change requiring learning new habits is introduced"

    Meh. Actually, I think that pretty much describes the average business user 30 years ago. It didn't take 40 years to get to this point.
    Phil Brown
    • Bah!

      I remember when all we had was the CLI in CP/M and we LIKED IT! It sure beat doing everything on punch cards, which is how most of us interfaced with computers at the time. Things went to hell in a handbasket when they started putting those picture thingies on the screen.

      NOW GET OFF MY LAWN!
      jvitous
    • Agreed.

      The computer is a toaster. No need to change the toaster UI for any reason; it will just upset people, right?
      Cayble
      • What?

        You can keep your toaster. What people REALLY want is a Toaster OVEN!
        Dr_Zinj
  • GUIs Are A Means To An End, Not An End In Themselves

    GUIs are fine for performing predefined tasks that the system designer thought of; they tend to be less flexible at performing new tasks. For example, composing a new sequence out of existing steps: a GUI forces you to go through all the steps manually every single time, whereas a CLI provides scripting facilities to allow you to invoke the new sequence with a single command.

    The basic question is, who should be doing the repetitive tasks, the human or the computer? The CLI allows you to pass off all your repetitive tasks to the computer.
    ldo17
  • Change?

    Here's my 2 cents: I think the designs changed, not the GUI.
    - You present graphical information on the screen, which makes it easy for the user to see what something does.
    - You can click on things on a screen; it doesn't matter if it's a mouse or a finger.
    - You can input text.

    The biggest change lately is the introduction of the swipe, but it is difficult for a GUI to graphically present to the user all the swipes available on a screen. So there will be some magic needed by the user.

    The changes Microsoft made are more of a variation on the theme of design. The choice of presenting constantly updated information on many different subjects on the screen is more a design choice, I think.

    The recent change in iOS is also a design choice.

    I think users should be considered more grown up than these companies currently assume, and should be given the possibility to choose a design without buying a new computing device or getting into a whole new ecosystem. And for the newbies, that's what the term "default" was introduced for.
    joozzt
  • Jason...

    Didn't Microsoft write OS/2 and IBM bought it? For some reason I remember that.
    CriticalSection
    • No...

      It was a "joint venture" between Microsoft and IBM - at least OS/2 1.x was. OS/2 2.x and later was IBM, though there was some Microsoft code in it. The story is more complicated than that - you can probably find the details on Wikipedia etc...
      MichelR666
  • There is good change and there is not-so-good change.

    When some changes to a GUI (or anything, for that matter) are introduced, you can look at them and say, "Ah, that makes sense. That's an improvement that makes what I want to do easier, or better, or quicker, etc. etc.". In other words, the change is a step forward in the evolution of the thing being changed.
    Then there are changes that make the user look at them and say, "Huh? Why was that necessary? It's not really an improvement." What prompts this type of change? Probably the unending desire of bored consumers for something new and different. Or maybe the desire of the supplier to provide a product that differentiates it from the rest of the pack. Changes made with this motivation CAN be good, but they might also cause consumers to scratch their heads and wonder why they were made at all.
    From the comments I've seen, it appears that Windows 8 is a combination of the two types of changes. People seem to like the fact that you can look at a live tile and get a lot of information at a glance. But the interaction with the interface, with its corner swipes and hidden menus, seems to be less intuitive and more confusing for both non-techies and techies. Yes, you can learn it with a little practice, but it's irritating, and doesn't seem like it's making your life easier or better--even after you've learned it.
    Some users say that other users, the "cranky idiots" who gripe about the changes, are old stick-in-the-muds who don't like to change and don't understand that change leads the way to a bright new future. They should remember that not all changes are the shining step forward they're cracked up to be.
    Userama
    • Mostly Right

      The change isn't consumer driven, it's marketing driven: they can't convince you to buy a new one, even if you don't need it, unless it has new bright and shiny bits. It doesn't do anything new, but at least it's shiny. I have been encouraged to update interfaces with new releases of software even though there was no need; all the magic was on the back end, but we needed to make users feel like they got more out of the upgrade. Increased development time and expense for no real reason other than a fresh warm and fuzzy feeling.
      JustAITGuy
  • Human factors the missing piece

    For all the talk about GUIs, the one thing I did not see in the article was human factors testing. Xerox certainly did it, and found that text labels were better than icons, but somehow we forgot that. Apple did it, and found that reducing the transitions from mouse to keyboard and back improved accuracy and ease of use, but it looks like we forgot that.

    Personally, I can't believe anyone did any human factors testing on Windows 8 or Windows 8.1 on a desktop. It may look cool, but it fails in both of the examples from the early days I used, and in many more obscure studies since then.
    oldsysprog
    • Correct, Microsoft did the "let's bank more" factor, and only that

      that's all folks
      Deadly Ernest
    • I can remember using Windows for the first time

      There were all these inscrutable icons with no text explanation (and this was before the days of tooltips).

      I had almost no idea what 80% of these icons did. I had a manual, but the only way you can look for an icon in a manual is by thumbing through every page.

      I had been used to a menu driven mini-computer operating system. It was several orders of magnitude more user friendly than windows.
      jorwell
    • Human Factors, hear, hear!

      You hit the nail on the head. Apple did a lot of research into human factors. I spoke with their chief scientist at a symposium at the IBM Almaden Research Center when I worked in the User Ergonomics group there in the early 90's. Apple was doing a lot of human factors research and testing before they put anything in their GUI. Microsoft, IBM, and others did virtually none.
      InspectorGadget
  • Don't agree

    First, computer users have not become simpletons. Simpletons who use computers were simpletons 40 years ago, too.

    Second, I don't see any interface changes in OS X Mavericks; dunno why it was even mentioned.

    Most importantly, it was never about what the UI is, but whether the UI was well designed.
    As a consequence of this, whoever advised Microsoft to adopt the same UI across different platforms and form factors in fact sent them towards the cliff.

    Anyway, everyone is entitled to have an opinion.
    danbi
  • GUI evolution

    The GUI has evolved over 40 years. During that time it has become much easier to use. The change to the Windows 8 interface is a major one, and except for tablets and phones it offers no advantages; to my mind, at least, it is less functional. I have been using Windows 8 for some time on my touchscreen laptop. I have drifted into using the desktop almost entirely and rarely venture into the modern interface. Unless I am using the laptop as a tablet, I see no reason to use the modern interface at all.
    hayneiii@...
  • GUI

    One other thing that GUIs did was to allow the morbidly ignorant masses to tinker with developing software. GUI development environments allow those with no clue about building code to hide all those "nonsense" details about how code is actually built. Never mind when something doesn't work right.
    HackerJ
  • There's change, and there's change

    The change from DOS to Windows was a good and welcome change, but you'll notice that the same DOS interface has been, and still is, with Windows in the form of CMD.EXE and the command-line tools we still have.

    Windows 8.1 is still change for the sake of change. They completely miss the point of the Start menu, and continue forcing desktop users into a touch-screen interface. They should have stuck with the Start menu and desktop mode like they stuck with cmd.exe. Microsoft had a chance to make everyone happy, and once again blew it.
    akaltman@...
    • Is MS really forcing anyone to do anything?

      Last I checked, the desktop was still on Windows 8. Choice is still there. Choice is still good, and no one is forcing you to do anything you don't want to.
      MistaWet
  • Windows 8+ / GUIs

    I used DOS extensively when I migrated my BBS from a Commodore 64 to a PC-XT in 1992 or thereabouts. I then moved to Win 95 in early 1999 when I first got on the internet, upgraded to Win 98 not long after (2001, I believe) so that one of my optical test instruments would work, then to Windows XP shortly thereafter -- and I'm still using XP to this day (though I still use Windows 98 on another computer to operate my beam cross-sectional analyser).

    I still do some things via the DOS command line (sure, the syntax can be a bit tedious at times, but I find it easier than using a GUI), but much of the time I'm forced to use the GUI for test equipment such as spectrometers and laser power meters because they cannot be operated via the command line.
    The LED Museum