The end of applications?

Summary: Sometimes someone says something at a conference that really knocks me for a loop. Such was the case at the High Performance Computer Architecture Conference last year.

Sometimes someone says something at a conference that really knocks me for a loop. Such was the case at the High Performance Computer Architecture Conference last year. In typical panel fashion, a group of us were each given a few minutes to state our position on the future of computer architecture.

The panelists were chosen to represent a broad spectrum of architectural views, from the traditional (x86) to the more radical (Cell), along with a software viewpoint. The hardware panelists more or less stuck to their respective party lines, but the software speaker said something that I won’t soon forget: “Since all of the interesting applications have been written, why is it that you guys are still inventing new architectures? What IT managers want now is just lower cost hardware and easier-to-manage systems. That’s what you should be working on!”

Now I like a provocative panelist as much as anyone, but I just couldn’t swallow the line about the end of applications. I’m squarely in the camp that believes that the truly compelling computer applications have yet to be built.

At first I put the applications comment under the same heading as other famously wrong-headed thoughts about computing, such as “only six electronic digital computers would be required to satisfy the computing needs of the entire United States” (Howard Aiken) and “there is no reason anyone would want a computer in their home” (Ken Olsen). The more I thought about it, however, the more I began to realize that it was easier than I first thought to reach the conclusion that the era of new applications was over. There are at least three factors at work here.

First, virtually all the mundane clerical tasks of the 19th and 20th centuries are now done with computers. Today’s productivity suites, for example, are regularly criticized as bloatware reflecting the fact that developers continue to add features, while not adding to the fundamental utility of the toolset. Databases are enormously more useful than the filing cabinets and card catalogs they replaced, but new releases have less to do with new capabilities and more to do with scalability, manageability, and security.

Second, the human interface has not evolved much beyond what Chuck Thacker’s Alto personal computer and Alan Kay’s Smalltalk windows and browsers demonstrated some thirty years ago. While the fidelity of the graphics interface is much better, most of what we see today is just eye candy.

Third, computer hardware evolves at a rate that is largely governed by Moore’s Law. Ten or fifteen years ago, general purpose performance was improving almost at the same rate as the transistor budgets were increasing. In other words, processor performance doubled every 18 to 24 months just as the number of transistors in a square millimeter of die area doubled in that same time period. For a number of years, this behavior was known as Joy’s Law after Bill Joy of Sun Microsystems, one of the first people to observe the trend. Unfortunately, two-fold performance gains are no longer occurring every two years despite the fact that Moore’s Law continues to hold to that two-year cadence.
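A back-of-the-envelope calculation makes that compounding concrete. The short Python sketch below is purely illustrative: the doubling periods are assumptions chosen to show the shape of the curves, not figures taken from this column.

```python
# Illustrative only: cumulative improvement when a quantity doubles every
# `doubling_period_months` months, sustained for `years` years.
def cumulative_gain(years, doubling_period_months):
    return 2 ** (years * 12 / doubling_period_months)

# Transistor budgets on a two-year Moore's Law cadence, over a decade:
print(f"{cumulative_gain(10, 24):.0f}x")  # ~32x

# The old Joy's Law era, with performance doubling every 18 months:
print(f"{cumulative_gain(10, 18):.0f}x")  # ~102x

# An assumed slower cadence (doubling only every 4 years) shows how quickly
# delivered performance falls behind the transistor curve:
print(f"{cumulative_gain(10, 48):.1f}x")  # ~5.7x
```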

With much less than a 2x improvement in processor performance every two years, it becomes harder and harder for developers to build, let alone imagine, applications with dramatically new capabilities. Add to this the fact that other aspects of hardware performance are barely improving at all (e.g., disk latency), and you have plenty of reasons to believe that the applications party is over.

With software not showing much in the way of functional improvements and hardware gains slowing, it is not surprising that some people are willing to declare the end of applications. It also explains why the guidance to architects is to focus on reducing cost and improving security. Why would anyone think otherwise?

I suspect by now a good number of readers are more than anxious to point out that scripting languages, RSS feeds, mash-ups, wikis and so forth are, in fact, the new applications, but I would beg to differ. While most of the current Web technologies provide improvements in the way applications are built and information is shared, they do not represent fundamentally new uses or changes in the nature of the man-machine interface. If we are going to break through to the next level of computing applications, we have to attack the problem at a deeper level and apply dramatically greater amounts of computing power than we have to date.

Just how I see us getting there is the topic for next time. Your thoughts and suggestions are, of course, most welcome. We’ve been on this plateau for too long a time already. I’m less concerned about how we get off of it than I am about how soon we do it.
 


Talkback

  • Well, if it is the end of applications...

    it is abominations.
    jsaltz
  • You'd think we'd learn by now...

    You'd think we'd have learned by now that we're not good enough at predicting the future to be able to make sweeping "limiting" statements like this. Just like Bill Gates said that no one would ever need more than 640K of RAM, and the ones you've mentioned.

    Seems like anyone who makes a statement like that is just trying to stir up controversy and discussion (not a bad thing!), or is truly delusional. Or maybe we've finally found someone who can accurately predict the future! :-)

    Great post, Justin. Would love to see them more often!
    jabancroft
  • Especially as web-based, no-desktop apps have been in the works for years.

    Why, all that costly support from techs, who do more than anyone to make end-users productive (end-users who otherwise wouldn't know what to do with themselves), is just a waste of money. Get rid of the techs and most of the users still wouldn't know what to do, even with a computer so dumbed down that a toddler could use it. (Judging by XP and Vista's interfaces, a toddler must've designed the GUI's appearance too...)

    Which is a pity; a tech does far more on the people-level than he'd do on a technical level.

    But, we're not people. We're organic machines; annoying costs that take away from that bottom line and nothing more.

    Oh well.
    HypnoToad
    • A clarification

      Without techs, the end-users would have a far more difficult time getting to a productive level when compared to having techs around.

      We are useful. But there's far more to life than the upfront sticker price. Only idiots go that route.
      HypnoToad
  • That's very funny....

    Sounds like the comments I was forced to listen to at a Java One conference about 4 years ago. I was in an elevator and some CTO wannabe/wacko tried to tell me... "All the problems have been solved. It's now just a matter of..."

    Fortunately I was able to ignore the rest of his conversation by thinking of something more pleasant...like an audit from the IRS...or perhaps it was oral surgery without pain killers.
    geek4hire_z
  • Singing the Body Electric

    We are at a threshold. The killer app is dead. The mildly sarcastic app is dying. We are being force-fed bloatware and slapped for piracy. We are resentful when technology is intrusive and resentful when it fails to meet our expectations. So how is it that the future has never been brighter?

    Computers have in the past been sold to analog consumers. The guileless were dazzled by shelves stocked with a myriad of software. They were sold a newfangled modern convenience. They were also sold abundance itself. The days of goldrushware are over. Computers are now sold to consumers who use computers. We are jaded and exhausted by choice. We look for refinement, we look to participate. We choose authorship over consumption.

    We had presumed to be the architects of the computer. We designed hardware and software and marketed it as a clever invention. Fast forward to the present... computers now design us. We are acting under their influence; our expectations have been created by them. Our lives are patterned around them. In a very short period of time, we have evolved. As computer tech permeates more aspects of our lives, we are encouraged to participate. Mashups, network effects, 1467 MySpace friends. We live increasingly in our minds. Enter the new hybridized killer application: it's called "human being", and it's still in beta.
    Harry Bardal
  • Thinking inside the box

    Advancing age has some advantages as you watch another moron make another statement that is doomed to be false.

    I remember computer people frowning at word processor apps - they'll never catch on - and scammers telling us thin clients would replace the desktop (Hi Murph ;-)), but the best one is the death of programming; that's been replayed at least 3 times over the years.

    The problem in trying to make predictions about the future is that you can't get there from here. All your predictions are based on today's tech and its particular directions, and while it's fun to speculate, the future will be stranger than you imagine.

    I don't see any thought-controlled 3D holographic apps yet, no personal AI, no recombinant viruses storing data in your brain, no synthetic wetware - for god's sake, doesn't that moron read science fiction?

    ;-)
    TonyMcS
  • I'm sure Aiken and Olsen thought hard about their statements, too

    Aiken and Olsen's statements look stupid now, but I bet they did just as much thinking about their statements as you have about yours. And they probably could write a document just as long as your article that would support their statements, and sound just as logical.

    If you think it's the end of applications, then go to a hospital that has just switched from paper records to electronic records. Tell them that the application they're using is as good as it's going to get, that there isn't any more area for innovation. They will either be scared or laugh. Scared because they're thinking "I'm left with this POS forever?" or laughing because they are smart enough to know how much more improvement there is left to go.

    Applications like these are just getting started. (Maybe you have a point when it comes to consumer applications--there's only so much more that can be done to Office.)
    PB_z
    • Elaboration vs invention

      An application can be made more useful and efficient. That's what I mean by elaboration. But a spreadsheet will remain a spreadsheet no matter how much more responsive to user needs it becomes.

      An invention is a new function as essential as a spreadsheet, something with demonstrable effect that the computer does not yet do.

      Let me speculate. New functionality will follow a new version of hardware. And that hardware will not have less functionality than a current desktop, but significantly more.

      The example of biological computing that someone else already gave, referring to science fiction, is not entirely far-fetched.

      The hardware must find a new way to work of which software can take advantage. Software can do only what hardware permits it to do.


      Imagine what new software functionality will be available when you can see the Windows flag wave even before you open your eyes in the morning.

      Sorry. ;-) Couldn't resist.
      Anton Philidor
      • Not in my household Anton!

        "[B]Imagine what new software functionality will be available when you can see the Windows flag wave even before you open your eyes in the morning.[/B]"

        I see penguins when I open my eyes! I have them on my desk around my PCs and servers. ]:)
        Linux User 147560
    • Um... that's not what he said

      He didn't agree that it's the end of applications. He just said it'll take more computing power than we've applied so far to break through to a new level of application. If you apply significantly more computing power to the existing hospital functions you mention, many of which though not all are database-related, you'll get what someone called elaboration, not new applications. But he does see new applications coming: "I'm less concerned about how we get off of [the current plateau] than I am about how soon we do it."
      TJGeezer
    • Nothing new under the sun.

      Your example of hospitals actually scores a point for the other side, PB_z. In 1992 I was writing and selling electronic medical records software. That included imaging and all forms of statistics, clinical records and S.O.A.P. records. You couldn't sell it then for love or money... physicians loved their paper files. And I moved on to other venues rightly convinced that they had no vision.

      Now, 14 YEARS later, just because these people are finally adopting solutions that were invented long before doesn't mean that the solutions are new. Adoption isn't invention. What a marriage! They had no vision of the future; you have no vision of the past. There may be new classes of apps on the horizon, but this most assuredly isn't one of them.
      dave.leigh9
  • The End of Applications

    So who/what are we building applications for?
    How fast can 'you' process and respond to the data you receive? Our software evolves slowly because WE are currently the show-stopper. But hold on to your hat, because in the blink of an eye (from a historical perspective), you will not be 'you' at all, and the statement that the 'end of applications' is imminent will have been the mantra of an Australipithicine.
    rcmotts
    • you are correct sir!

      One slight exception,

      Australopithecine...use the 'o' version!

      I'm not one to talk, for sometimes I feel the urge to say "this is it!" or "that won't be beat!"...

      Some advice:
      Take a weekend off and go chat with the elderly. You laugh, but ask them what they thought we'd run out of...or when the end of this or that would occur.

      My Great Granpa Mills (R.I.P.): "Juni, I remember when one of those planes flew folks over my property."
      jdzil
  • Applications

    Our capability to design improved applications could come from any of the following sources:

    1. A breakthrough in the field of mathematics that became an applicable addition to computation theory.

    2. A breakthrough in physics, biology, or chemistry (roughly in that order).

    3. Refinements to existing computer science, programming techniques, and networking capabilities (again roughly in that order).

    The hardest problems, and the most beneficial solutions, lie in the first category; the most likely and exploitable opportunities lie in the third category.

    Economic theories strongly point to an extended period of consolidation (and genuine value extraction) from IT. This will not be an end to IT applications - it will actually be the era in which they start to provide high net benefits. It seems likely that applications will do so by being around in higher numbers, with each filling an increasingly specific role (or job). There is a great deal for the end user to be positive about here, although those who are less than keen to share their lives with technology will have little to cheer about, to be sure.

    Getting from the present plateau (to the next!) will be dependent on the emergence of a whole new type of technology. This is already happening, through the research being completed in cat. 2 areas. Unless we have major philosophical objections, we will absolutely continue to develop new types of application. Progress will continue to occur in a relatively orderly, cyclical fashion; our only "worm-hole" is an advance in category 1, although our best bet may well be to use one set of advances in pursuit of the other.
    Peter Cowling
    • 0. Analysis Methodologies

      It seems to me that the chief barrier to innovation in software applications is that all of the "easy" problems have already been "solved."

      By "easy" I mean applications that either are (1) algorithmically simplistic enough for business domain experts to solve, or (2) that have simplistic enough problem domains for computer scientists to solve.

      All the remaining problems have domains that are too complex to be solved in a rigorous manner by domain experts (because of insufficient mathematics/computer science knowledge) and too subtle to be solved by computer scientists (because of insufficient domain knowledge).

      Furthermore, the problem is worsened by "modern technology." Not only does the computer scientist have extreme difficulty communicating with the domain expert, but now our web of components, frameworks, and other pieces of infrastructure is so complicated that it is a problem domain unto itself. As a result, even if the computer scientist manages to understand the problem domain well enough to solve it, he's lucky if he can find a software developer who even understands elementary calculus, much less more advanced computational mathematics.

      So, given that it's too hard to solve any new problems, and that the existing solutions to old problems are really both terrible and expensive, businesses focus on reducing cost and mild functional improvements.

      Furthermore, even when someone does manage to solve an interesting problem, it ends up being a work of genius that no one understands and is therefore extremely difficult to monetize.
      Erik Engbrecht
      • also perhaps remove the human

        I wonder whether, if you ran genetic algorithms to solve engineering problems, they could come up with solutions outside of the box (the box being the collective consciousness of humans).

        For example, define your inputs on one side and your outputs on the other, then run genetic algorithms to get minimum gates and maximum speed. It would be tough or impossible to understand the solution, but I think we should make more use of this kind of technology. You could do the same with problem-solving software. Sure, you couldn't predict where it would fail, but you can't really do that with human-written software anyway. The difference would be that for GA-produced software, you'd only ever write the specification, no code.
        stevey_d
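        A minimal, generic sketch of the loop described above - selection, crossover, and mutation against a fitness function - written in Python. The bit-string "specification" and the bit-matching fitness are toy assumptions standing in for the gate-count and speed objectives; this is not a circuit optimizer.

```python
import random

# Toy genetic algorithm: evolve a bit string toward a fixed target
# "specification". Everything here is illustrative; the fitness function
# is a stand-in for a real objective such as gate count or speed.
random.seed(0)
TARGET = [random.randint(0, 1) for _ in range(32)]  # the "specification"

def fitness(candidate):
    # How many bits of the specification the candidate satisfies.
    return sum(c == t for c, t in zip(candidate, TARGET))

def crossover(a, b):
    # Single-point crossover between two parents.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(candidate, rate=0.02):
    # Flip each bit with a small probability.
    return [bit ^ 1 if random.random() < rate else bit for bit in candidate]

population = [[random.randint(0, 1) for _ in range(32)] for _ in range(50)]

for generation in range(200):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break
    parents = population[:10]  # simple truncation selection
    population = parents + [
        mutate(crossover(random.choice(parents), random.choice(parents)))
        for _ in range(40)
    ]

print(f"best fitness {fitness(population[0])}/{len(TARGET)} "
      f"after {generation + 1} generations")
```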
  • Shared thoughts

    The biggest drawback to the future of both applications and hardware I see is legacy support. We are basically building the same PC we were, say, 10 years ago. Oh, to be sure, the pieces are faster, but it's still the same machine. The obvious result of this is that the applications that work well on them have stagnated.

    If we let our imaginations run wild, limited only to existing and available hardware, and said wipe the slate clean and design a PC, you can be absolutely certain of two things: it would be faster by several magnitudes, and it would be simpler to use.

    Yes, it made sense to have different subsystems when the PC was first conceived, but I say that is no longer the case today. There is no reason a motherboard populated by Cell CPUs (with space for more when needed) can't handle everything today. No video cards, no sound cards, no drive controllers, no port controllers, no separate convoluted memory controllers, in fact none of the silliness we have come to know and hate. Of course there is no reason to make incremental jumps in bit operation. Just move directly to 256-bit CPUs and have done with it.

    While AMD and Intel muck about trying to add video to the CPU, a paradigm shift in thinking may just make the effort moot.
    No_Ax_to_Grind
    • Buyers and sellers

      Let's say that you ordered a company to "wipe the slate clean and design a pc".

      The first thing a sensible company would do is find out what users wanted. They're the buyers.

      Cheap. Efficient. Durable. Preserves investment in software.

      This next one is more questionable, I suppose. Runs certain existing applications better. Hardware adapted to the way specific software works.

      My conclusion is that hardware companies follow directions set by engineers. They may already be making products better than customers want.

      I still think that a cheap chip that presses no boundaries but is reliable would be a money-maker for years to come.

      Has engineering not gone far enough, or too far?
      Anton Philidor
      • Yes and no...

        One could point out that there was no great buyer demand for the first PCs. It was only when buyers were shown the possibilities that PCs became popular.

        I do believe that we have reached the point where we should again start with a clean slate and turn the engineers loose.
        No_Ax_to_Grind