Isn't there an easier way to do this? If asking the question demonstrates our intelligence, it's finding the answer that proves our usability expertise. More intuitive use, a more enjoyable experience and improved productivity — in other words, getting more stuff done with less pain — are goals that predate the computer age by quite a long time...
250,000 BC Tool use
Banging, chipping, shaving, hewing, shaping, stirring, piercing, reaching... tool use has to be our first great usability breakthrough. The ability to recognise that the devices and materials lying around can be adapted to help us achieve our goals was evidence of our understanding of causality ("If I do this, that happens"), a major advance in human cognitive capabilities. Building and using tools pushed us straight to the top of the food chain, where we've been ever since, carrying out usability evaluations of our tools.
900 BC The alphabet
Smoke signals good, email better. Leap forward a few years from spearheads, and we'd cracked language and even begun to develop complex systems of symbols to represent meaning. Written communication was emerging — as long as everyone in the tribe understood the same thing by the same symbols.
These proprietary systems became increasingly limited as we sought to trade and communicate with other tribes; and they were cumbersome to learn and expensive to maintain (sound familiar?). The alphabet, courtesy of the Phoenicians, was the first open system. Based on graphemes (letters) rather than symbolic representations of whole words and concepts, it made writing simple, consistent, accessible and flexible.
1949 The Ergonomics Society
Despite centuries of fretting about the adverse impact of quill pens and spinning machines on their human operators, nascent ergonomists received no formal recognition for their expertise until the Industrial Revolution and that great scientific catalyst, modern warfare.
Production lines, head-up displays, buttons and dials, and a host of military advances had a huge impact on the effectiveness of the working and armed forces, contributing to the formal emergence of the discipline of ergonomics and The Ergonomics Society. Initially concerned with general equipment design and the effects of fatigue on performance, ergonomics remains the scientific underpinning of all modern human factors research, and the umbrella beneath which an increasingly diverse group of professionals shelter.
1950 The Turing Test
Can computers think? The Turing Test (named after its creator, Alan Turing) pits a human interrogator against two hidden interlocutors: one human, one machine. Chatting to each via a screen, the questioner must work out which is which. If they can't, the machine has demonstrated enough 'intelligence' to pass the Turing Test.
In any normal conversation, any machine will quickly fail, demonstrating just how complex 'intelligence' is, and how difficult it is to replicate in a machine. The Turing Test shows us the parameters we're dealing with in HCI and helps us recognise that the gap between 'smart' machines and human brains is still very, very large indeed.
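The structure of the test can be sketched in a few lines of code. This is a toy illustration only — the 'human' and 'machine' here are trivial stand-ins invented for the example, and the judge's heuristic is deliberately naive — but it shows the protocol: an interrogator puts the same questions to two hidden parties, by text alone, and must decide which is which.

```python
# A toy sketch of the Turing Test protocol (illustrative stand-ins only).

def human(question: str) -> str:
    # Stand-in for the hidden human: varied, context-aware replies.
    replies = {
        "How are you?": "Bit tired, honestly. You?",
        "What is 2+2?": "Four, unless you're being tricky.",
    }
    return replies.get(question, "Hmm, let me think about that.")

def machine(question: str) -> str:
    # Stand-in for the machine: a rigid canned response.
    return "THAT IS AN INTERESTING QUESTION."

def judge(question: str, answer: str) -> str:
    # A naive interrogator: rigid, all-caps answers look mechanical.
    return "machine" if answer.isupper() else "human"

# The interrogator questions both hidden parties via text alone.
for name, respondent in [("A", human), ("B", machine)]:
    verdict = judge("How are you?", respondent("How are you?"))
    print(f"Party {name} judged as: {verdict}")
```

A real test, of course, hinges on the machine producing answers no simple heuristic (or person) can tell from a human's — which is exactly what no machine of the era could manage.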
1956 'The Magical Number Seven, Plus or Minus Two'
This classic paper describes research into human short-term memory. Written by psychologist George Miller, it concluded that we are unable to process more than seven (give or take two) items of information at any given time (for example, we can't remember more than around seven unrelated items from a list read out to us once without 'chunking' — forming meaningful associations between the items).
The paper has been challenged and refined since then, but its influence on user-centred design has been huge. The number of items in drop-down menus, the number of options offered during configuration, the number of pages we can usefully keep open — all recognise that the limitation lies in our short-term memory, not in the technology. Information overload, anyone?
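Chunking itself is simple to demonstrate. In this hypothetical example (the phone number and group size are invented for illustration), ten separate digits sit above the 7±2 span, but grouping them into chunks brings the number of units to be remembered comfortably within it:

```python
# Illustrative sketch of Miller-style 'chunking': grouping items
# reduces the number of units short-term memory must hold.

def chunk(digits: str, size: int) -> list:
    """Split a digit string into fixed-size chunks."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

number = "4155552671"       # 10 separate digits: above the 7±2 span
chunks = chunk(number, 3)   # ['415', '555', '267', '1']: 4 units

print(len(number), "items unchunked ->", len(chunks), "chunks")
```

This is why phone numbers are conventionally printed in groups rather than as an unbroken run of digits.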
1970 The WIMP interface and desktop metaphor
The transition from working with hundreds of lines of dense code to working with a graphical interface comprising Windows, Icons, Menus and Pointers (WIMP) was revolutionary.
No longer was the computer a scary grey box processing bits and bytes: following the efforts of researchers at Xerox PARC, it became your very own digital desktop. Files lived in what looked like little filing cabinets; selections were made by pointing and clicking; completed documents could be dragged across the desktop and tucked away into little yellow folders. Loved and hated, the desktop metaphor has been polished, refined and enhanced, but it's never really gone away: a usability classic.
1990 The web browser and hypertext
The internet was quietly emerging from spotty teenagerhood into a rather shy young web by the time the first browser was placed gently on top of it. Hey presto, the sum of all human wisdom (and a lot more besides) coalesced into an easily accessible desktop resource.
The addition of a graphical browser, with pages linked not serially but associatively by hypertext, turned the web from an obscure tool for nerds and military bigwigs into the world's library, meeting point, games room, university, village hall, den of vice — you name it. It stripped away the mysterious technical connotations and invited everyone to join in and transform the world as we know it — and boy, did they ever.
1993 ISO Standard 9241
The International Organization for Standardization (ISO) isn't guaranteed to make your heart skip a beat, but its work on defining, measuring and regulating the complex and subjective, so that we all know what we're talking about, has been highly influential across many industries.
ISO has dealt with usability in depth, providing detailed guidance on usability methods; the application of findings to particular technologies; a format for usability reports; what a web page should look like; design criteria for physical devices; and much more. Its definition of usability commonly appears on the first page of human-computer interaction lectures and presentations, and awareness of its standards can make the difference between getting the job, or not. ISO 9241, we salute you.
1979 Mobile phones and wireless connectivity
Connectivity on the move has added a whole new layer of freedom to our working, playing and communicating lives. On the surface, it's fun and convenient, but at a deeper level it's transforming the very fabric of society and making us think more deeply about the nature, importance and management of all our emotional, familial and professional ties.
Mobility means profound change, and nowhere is this expressed more iconically than in the iPhone, a mobile phone drawing on all of Apple's design and usability expertise to make mobility a positive choice, a lifestyle statement and an enriching experience. Lately, similar attention to mobile usability has helped the netbook class of devices gain a foothold, further extending the integration of technology into our daily lives, wherever we may be.
1990s Emotion research
Even today, the most powerful computer is basically a really, really fast adding machine. We're not. Human emotion, once regarded as a rather inconvenient distraction from the important business of thinking, reasoning and being logical, is receiving growing recognition for its central role in our capacity for thought, motivation, creativity, persistence, activism, achievement, pleasure — and buying power.
User experience research now encompasses the entirety of our human natures, studying, analysing and responding to our ability to feel happiness, irritation, desire, revulsion, curiosity, loneliness and every shade of feeling in between. It's not easy work, but it's work that will make the difference between technology we endure and technology we engage with.