Apple iOS 7: Removing the obstacle of user interface

Summary: Some have accused Apple of copying the user experience of its competitors for the latest version of its mobile OS. Some are complaining that the changes are too drastic, or have not gone far enough. The reality is that the future of computing has one ultimate and common goal, which is to eliminate the user interface altogether.

There's been a lot of talk about the visual and user experience changes in iOS 7, Apple's flagship mobile operating system that powers the iPhone, the iPod touch, and the iPad products.

My ZDNet colleague Steven J. Vaughan-Nichols refers to it as a "mashup" of everyone else's mobile operating system, Android in particular.

Our resident mobile software consultant Matthew Baxter-Reynolds frames this as a question of whether Apple should have "moved the cheese," and of how much it is willing to risk alienating its end users with changes in order to advance its products.

And our editor-in-chief, Larry Dignan, would like us all to stop complaining about user interface changes and just move on.

(Image: brain-computer interface and iOS 7. Credit: CBS Interactive/ZDNet)

What is the common thread in all of this? Clearly, the nature of changing user interfaces and dealing with the issue of human-device interaction is not just an Apple problem. This is a problem that every single device manufacturer and software developer has to deal with now that we are moving into an age of ubiquitous computing.

And it's not so much a "post-PC" problem as a "Where do we go from here?" problem.

As an industry, we don't have a detailed plan of the steps required, or of the changes needed, to get us to the next phase of human-to-device interaction. But we have a very good idea of what the ultimate destination, or rather the desired destination, is.

If there were any doubt about what that destination actually looks like, look no further than the works of popular science fiction.

For example, in Star Trek, human beings interact with computers and devices largely without conventional user interfaces at all. Our favorite characters use voice recognition and work with artificial intelligences to interact with information systems.

The "Communicator" has no UI; you simply speak to it as an extension of the Enterprise main computer system. Taken even further, there are beings depicted on the show and in the movies (such as the "Borg") that interface directly with computers by linking them with their brains.

This is not just a theme within Star Trek, but a common theme in many other forms of SF literature, such as William Gibson's Neuromancer novels of the mid-1980s (and the short stories that preceded them, such as "Johnny Mnemonic" and "Burning Chrome"), and it was further popularized in movies like The Matrix.

If you want to go back even further than the 1980s, there are endless examples in the works of the Grand Masters such as Asimov, Clarke, Heinlein, and Silverberg, just to name a few. And we shouldn't forget Philip K. Dick, either.

The ultimate goal of directly interfacing with computers, or of creating a man-machine intelligence, has been termed the "Singularity," a concept developed and popularized by notable computer scientists, mathematicians, and theoretical physicists such as Ray Kurzweil and Michio Kaku.

Such a symbiotic existence for humanity is not expected to happen for hundreds or perhaps thousands of years.

In the interim, using conventional technology, the more immediate goal is raw consumption of information in the form of services and data streams, through applications such as Twitter, Vine, Instagram, and Facebook. The notion of a "lifestream" as a replacement for the current web paradigm has also been introduced by Yale University computer science professor David Gelernter.

I fundamentally disagree that consumption of lifestreams will replace the web, at least in the intermediate future.

I have also noted my reservations about what a stream-oriented society will do to humanity as a whole, particularly as it relates to our youngest generation. But that's not what this article is about.

To approach the Singularity, or any type of direct human-to-computer interface, we have to remove a number of obstacles to the most efficient presentation of information.

Ultimately, the purpose of computers and devices is to present information and to facilitate its communication to and from the end user.

Human beings are slow; computers are fast.

The primary obstacle bottlenecking the interaction between the human being and the computer is the user interface. The Singularity is ultimately the complete elimination of the user interface.

In order to remove those bottlenecks, user interfaces have to become much more simplified. So it's no surprise that all of these mobile and even our desktop operating systems are moving in this direction, as they are all attempting to solve the same problems.

Smartphones and tablets have minimal real estate, and ultimately, you want to present information to the user as quickly and as efficiently as possible, and to have as few obstacles as possible for getting to that information.

When I first saw iOS 7, my first thoughts were not "Hey, this looks like Android." Actually, my initial thoughts were "Hey, this looks kind of Google-ey."

Now, you might not see a distinction between Google and Android's user experience, but to me, they are different.

Google has always had minimalistic user experience designs for its software, whether in its search page or in products like Gmail, even before it purchased Android in 2005. So you could say that Android has inherited Google's tendency toward highly efficient, minimalist UX.

Now, to say that Apple has copied this is a bit much. The cognoscenti of Silicon Valley are all very much part of the same social group.

And if you want to extend that even further out, to points north of San Francisco, I think it is safe to say that you can include those folks as well. If anything, they decided to throw the baby out with the bathwater and "move the cheese" significantly first, in the name of advancing their products.

But hey, don't listen to me, I'm biased, I'm one of those people.

Jony Ive and Marissa Mayer are known to be close friends, and, as we all know, before she became CEO of Yahoo, Marissa Mayer's previous job was head of User Experience at Google. So to say that these folks don't all sit down for coffee (or dinner) on occasion to discuss wide-ranging subjects such as typography and UX minimalism is naive at best.

To imply that they all "copy" is much like saying that well-known artists during any particular stylistic period have copied each other. It's asinine. They belong to the same school of UX design, and thus that ideology is reflected in the end products of each of these companies.

That school employs clean typography, makes the most of white space, and reduces UI element complexity in order to eliminate barriers to consuming information, while optimizing devices for new forms of human-device interaction such as touch, speech, and gesture/facial/motion recognition. This is the new norm, no matter which of the powers that be we are talking about.
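
To make that concrete, here is a minimal, hypothetical UIKit sketch (modern Swift, not anything Apple shipped in iOS 7) of those same principles: thin system typography, generous white space, and a control stripped of visual chrome:

    import UIKit

    // Illustrative only: the "flat" school of UI design in a few lines.
    final class FlatViewController: UIViewController {
        private let headline = UILabel()
        private let refresh = UIButton(type: .system)

        override func viewDidLoad() {
            super.viewDidLoad()
            view.backgroundColor = .white

            // Thin typography and no decoration: the text itself is the interface.
            headline.text = "Inbox"
            headline.font = UIFont.systemFont(ofSize: 34, weight: .thin)

            // A borderless, text-only button -- the kind of chrome-free control
            // that some of the commenters below take issue with.
            refresh.setTitle("Refresh", for: .normal)

            // Generous spacing (white space) instead of boxes and separator lines.
            let stack = UIStackView(arrangedSubviews: [headline, refresh])
            stack.axis = .vertical
            stack.spacing = 32
            stack.translatesAutoresizingMaskIntoConstraints = false
            view.addSubview(stack)
            NSLayoutConstraint.activate([
                stack.centerXAnchor.constraint(equalTo: view.centerXAnchor),
                stack.centerYAnchor.constraint(equalTo: view.centerYAnchor)
            ])
        }
    }

Nothing on the screen exists purely as decoration; everything that is drawn either conveys information or accepts input.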

Is the ultimate goal of computing to eliminate the user interface? Talk back and let me know.

Topics: Mobility, Apple, Emerging Tech, iOS, Smartphones, Tablets

About

Jason Perlow, Sr. Technology Editor at ZDNet, is a technologist with over two decades of experience integrating large heterogeneous multi-vendor computing environments in Fortune 500 companies. Jason is currently a Partner Technology Strategist with Microsoft Corp. His expressed views do not necessarily represent those of his employer.

Talkback

116 comments
  • Bottom line - they did include ideas from all over the place

    That is something you CAN'T deny, whichever spin you try to put on it. After an initial stubborn reluctance to alter their interface, at least they realised that it had become stale and needed a refresh. And at least now they have realised that customers want some form of customisation.

    All this is good, and drawing inspiration, even if it is direct inspiration from another, better product, is nothing to be ashamed of, as long as it integrates well with your own system. My, and many others', only beef is the pure hypocrisy of iFruity in getting on their high horse about all the copying (Oh right ... the rounded rectangles and the swipe to unlock? How did those get patented in the first place?)

    As for your assertion that there is no interface in the future, I disagree. There will always be a need for an interface between humans and machines, because the languages are different. The interface will always be there between the user and the machine, whether you can see it or not.
    poppypig
    • Copying the copiers?

      This "copying" is a very circular relationship, as iPhones existed before Android, which was definitely derivative. It's easily arguable that the iPhone was derivative of Palm Treos. It's a stair step relationship. Each platform takes what exists, then adapts and improves it. That's exactly the way it should work. We all benefit from this because overall everything improves over time.

      That said, some of these iOS 7 changes seem more like steps backward for iOS. There are also inconsistencies where some of the interfaces are white on black instead of black on white. It's pretty jarring when you hit one of those screens and they are much less legible. I would also debate the use of such thin fonts because they tend to "halo" on a white background, causing eyestrain.

      In my mind, these changes are mostly change for the sake of change. If it doesn't increase my ability to get things done quicker or more easily, it's not worth the resources expended creating it. I guess they needed to give their artists some busy work, so they had them flatten all the icons. I wonder, what advantage did I gain by my icons being flatter looking?
      BillDem
      • obscurity and failure to recognize failure

        White on black is far better for light-emitting displays than the asinine inverse color scheme that was popularized during the "desktop publishing" craze of the late '80s/early '90s. Back then the thought was that we should make the screen an analogy for a piece of paper. That fails because paper doesn't EMIT light. Reading little black characters off the surface of a light bulb all day is stupid, but that's exactly what Apple forces its users to do in its OSes. At least Microsoft lets you set up a proper color scheme (although doing so reveals major UI defects in its products).

        And while the pathetic "skeuomorphic" designs were a huge step backward for usability and professionalism, this "flat" BS is often an excuse for hidden controls and defective UI. If you don't indicate what's a control, your GUI fails. Is the user expected to sweep the entire screen, looking for hidden goodies under every label or graphical tidbit? SHOW the controls and DISTINGUISH them from static items. Period.
        dgurney
        • I completely agree with dgurney

          The so-called Silicon Valley cognoscenti all have one thing in common - they have put form before function.

          I HATE hidden hyperlinks and controls. I'm not using a device for the joy of discovery - I want to see where the controls are with ONE glance at a screen, and not waste my time poking around at every rectangular shape. That could perhaps be a reason why many loyal Windows 7 users are reluctant to switch to Windows 8.

          I'm not advocating "back to skeuomorphism" - I believe digital design has a language of its own that has become intuitive - for example, a button that looks like a button - and we need to retain some of those conventions to ensure that long-time users are not left out of new versions of older products.
          jaykayess
        • Indeed

          Been using pine green backgrounds and a light green-blue font for a while now; much easier on the eyes for long reading sessions.
          Max™
    • There will always be a user interface

      Agreed. Speaking to a computer will still be a user interface, and it's one that doesn't work very well at this time, so I'm not holding my breath. Even a thought-controlled computer is going to require a user interface — and it's not something I'm looking forward to.

      I also share a hope with you that Apple will cease claiming ownership of basic ideas that are elemental in the area of design. There are a finite number of ways to combine elements to accomplish the purposes of a user interface, and a degree of commonality is beneficial to the end user.

      I own a Honda Fit; on a trip recently, I rented a Chevy Cruze. The steering wheel, brakes, accelerator, and a host of other components of the automotive user interface were generally familiar, although there were obvious differences between GM and Honda design.
      S_Deemer
      • You summarize

        exactly what was going through my mind as I was reading this. You will still interface with the computer. In sci-fi you talked to it, or you had a droid put a round thing into a hole in the wall and spin it around, and then it beeped at you, and then a gold droid told you what those beeps meant. BUT it was still an interface.
        apoteke
      • Retro is better

        I prefer a physical keyboard over flat keys because the keys feel like buttons you can actually press. Flat is almost like typing on a table, and that doesn't work very well either. You can't possibly have a computer do the work for you unless it is connected to your brain by a nerve sensor/chip. Right now, a computer that reacts to human commands is slow. We'd have to work on what I like to call "acceleration", both in the evolution of the human and of technology as they work together, where the mind of the human can also transcend the parameters of technology, and this is often the case. Computers may be fast, but a human's will can override the computer. This is sorta the type of technology I'm looking forward to, but it probably won't come for another two to five decades.

        Going back to the interface, I feel our tech is at its peak because people are too focused on smartphones and should be focusing on other technology. PCs, on the other hand, have unlimited potential, even if they require more space in different ways.
        Tobias Lu Alastor Phoenix
    • Actually, anyone who has even a modicum of knowledge about Ive

      and his zen philosophy of Apple hardware design could see exactly what an OS designed by Ive would look like. That Apple's UI would become more zen-like when Ive took over the design team was a no-brainer.
      baggins_z
    • You hit the nail on the head with this.

      It's not so much the fact that Apple copied anyone, or was inspired by someone else's ideas (this is expected), but, like you mentioned, it's the whole hypocrisy aspect: that it's okay for Apple to do this and no one else can.

      Then you get all the apologies and excuses for this and that, which, whatever, it's fine. I don't mind if they borrow ideas from someone else, but if another company does something that may seem inspired by Apple, STFU, period.
      SteveWojo
  • Star Trek style voice interface

    It does strike me that the voice commands to computers in Star Trek feel a lot like database queries, which really aren't programming. I kind of suspect that the real programmers are off in a back room using keyboards and monitors (the written word is so much more precise than is the spoken word).

    The Neuromancer scenario may indeed be where we're going (perhaps Google Glass is a step in that direction), but I'm not altogether sure it would be a good thing (a lot of us really don't spend enough time in the real world).
    John L. Ries
    • Don't watch Star Trek much, do you?

      I'm a Trekkie from way back, and yes, they do use voice for most things, but they also have manual interfaces and regularly fly their ships via "shock horror" touch-screen consoles. Building new hardware is done with a combination of voice and visual inputs. In Star Trek: The Voyage Home they go back into the past, Scotty has to use a dodgy old PC, and Scotty states, "Keyboard. How quaint."
      To cut a long story short, no, they don't use keyboards!
      martin_js
      • that's a classic moment

        Yes, I remember that scene. And Scotty even tried speaking to the mouse!

        Btw, the old PC was an original black-and-white Macintosh. Very cutting-edge for that era!
        ThinkFairer8
        • What Program?

          What I always wondered about things like Scotty using the computer to design a new product was: what program did they use? Even some of the best design programs available today wouldn't be able to come up with such elaborate designs. Yet these people just sit down, start typing, and the product comes out perfectly designed.
          Webminotaur
      • Voice was used a minority of the time.

        In fact, the Star Trek computer voice interface was used far less often than the touch displays. Everyone is seen constantly sitting or standing at touch displays busily performing a presumably wide variety of tasks. The voice interface is only used occasionally, and for pretty specific tasks. Imagine if everyone were always talking to the computer non-stop. The bridge would sound like a busy restaurant. Nobody would get anything done because of the distraction.
        BillDem
    • SQL not really programming? Huh?

      Let me guess, @JohnLRies, liberal arts major. If SQL weren't a language then anyone should be able to create massive databases and pull out all sorts of interesting nuggets. LOL!

      Back to the story: and that's the shame in the iOS 7 announcement; we devs were hoping to get some of the keys (APIs) into Siri - nada. An iPhone user should be able to stay 'open
      beau parisi
      • I was a CS major, actually

        But I appear to have a procedural bias, which caused me to make a rash statement. Point being that it would be very difficult to do substantial programming with a voice interface alone. Yes, some amazing things came out of the Holodeck using its voice interface, but the specifications were very high level.
        John L. Ries
      • SQL

        SQL is a query language, not a true "programming language" as it is generally understood. What do you think the "Q" stands for? Programming languages are procedural in nature. A programming language may include some query functions, or even directly access SQL functionality, but query languages are far more task specific and more limited than a "real" programming language. For example, SQL was written in a "real" programming language.
        BillDem
    • No keyboards?

      Maybe not, but then why was Scotty so fast at typing when he saw that it (and the mouse) were the only input methods available? He certainly didn't "hunt and peck" like a novice or someone with only school knowledge would do.
      Max Peck
      • Because it was a MOVIE...

        with actors from the 20th century who use keyboards and mice, not a documentary from the 23rd century.
        rbjazz