For Apple, an inflection point

Summary: Thoughts on the news from yesterday's Worldwide Developers Conference.


How do you follow a legendary act? By making sure the audience doesn't remember what came before you.

No, I'm not talking about Apple co-founder Steve Jobs, whose death in 2011 pushed chief operating officer Tim Cook into the limelight. I am referring to the unprecedented run of success that the company enjoyed between 2001 and 2011, a rough approximation of the span of time between when Apple introduced the iPod portable music player that would make it a household name and the death of its spiritual leader Jobs.

In that period, the company entered and dominated (to the point of synonymity) the portable media player market, then radically altered the business model for its content; set a visual and functional benchmark for what would come to be known as the "smartphone," then commanded the segment across continents; created a mainstream market for the slate-style tablet computer, then dominated it; introduced a laptop computer with market-leading thinness, then popularized it; and finally, knit them all together with a single, cloud-based account system that made it difficult indeed to use an Apple product without being a click or tap away from one of its various content, software or services stores. Through it all, the company built and burnished its tony reputation.

Jobs' passing coincided with a moment when the company's strategy was in full bloom. The swell (and product pipeline) would carry the company for more than a year after Jobs' death.

But what happens when the momentum begins to recede? What then? 

Mark the end of a chapter, and start a new one.

During the keynote of its annual Worldwide Developers Conference yesterday, Apple took what I believe to be its first real, albeit careful, steps into its next phase: it introduced a new, non-feline naming scheme for its desktop operating system, OS X, and it unveiled a redesign of its mobile operating system, iOS. (It also announced other things; read the recap here.)

The milestones matter.

For OS X, the new California-themed naming scheme for the next versions of Apple's 13-year-old desktop operating system ("Mavericks," named after the famed Northern California surfing spot, will be the first) will likely see the desktop computer through its final years of broad consumer popularity and into an era of narrower, more work-oriented use.

For iOS, the new version represents the company's first attempt at reinvention rather than refinement. I use that term advisedly, since much under the hood remains the same; still, it is a deliberate departure, and "the biggest change to iOS since the introduction of the iPhone," Cook said several times during the presentation, doing his best to draw a line between what came before and what comes next.


The aesthetic and functional changes to iOS 7 are a mixed bag, at best. ("Simply confusing," Josh Topolsky writes with some disgust at The Verge.) Change is always a hard pill to swallow, but in this case, the new operating system's idiosyncrasies seem to reflect chief executive Cook's October 2012 management decision to dismiss the contentious Scott Forstall, formerly senior vice president of iOS, and assign his duties to other senior executives.


"Without its Decider-in-Chief," Bloomberg Businessweek wrote in 2011, "the company must fundamentally change the way it works." That sentence was referring to Jobs, but you could say the same about Forstall. In both cases, Apple is finding its way through internal changes that ultimately impact how its products look and operate—chiefly, iOS. Consumers are left to feel for the seams.

"It's like getting an entirely new phone," SVP Craig Federighi said during his WWDC presentation.


So how do you deal with all that baggage—the polarizing people, the landmark products, the unprecedented business success? By throwing it out the window and reframing the conversation. Turn the seam into a deliberate design decision, so to speak.

Cook may never be able to fully step out of Jobs' shadow, but he has certainly tried over the last year and a half to establish a new style of leadership that discourages comparison. Apple's iOS may never be able to make a clean break from its Forstall-led past, but its designers and engineers certainly tried in its introduction yesterday to signal a new direction that jettisons his favored skeuomorphic touches.

And with the company's stock price down 38 percent over three quarters, Apple's senior executive team will do its best in the next two quarters to show investors that they should not measure the company's progress against the price peak it enjoyed in September 2012, but rather against the valley it experienced in April 2013.

It's all relative. All that matters is perception.

Topic: Apple

Andrew Nusca

About Andrew Nusca

Andrew Nusca is a former writer-editor for ZDNet and contributor to CNET. During his tenure, he was the editor of SmartPlanet, ZDNet's sister site about innovation.



  • Impressed.

    I watched the entire keynote yesterday, and was impressed with the presenters, as well as what they presented. A few less "incredible"s from Tim Cook would have been nice, but overall, I liked the tone of his presentation. And Craig Federighi did a bang-up job with the software presentations. I can see him becoming the next Apple CEO. Phil Schiller's "my ass" comment about Apple's ability to innovate may not have been PC, but it was "incredibly" appropriate!
    • Federighi is a gem

      Easily the most talented presenter of that crew.
    • Good..but..

      The presentations were all well done, but I would have to say I was underwhelmed. Guess I'm used to Apple producing more, or making an even bigger splash than what is expected. Schiller's "my ass" comment was laughable. Apple hasn't innovated in years and continues that trend. The Mac Pro isn't "innovation"; it's simply an enhanced version of what they already have, re-styled and containing the latest hardware advances (memory, CPU, etc.). I wouldn't exactly call this "innovation". I enjoyed the Mac Air briefing and OS X Mavericks, but iOS 7 left me empty, just not much there IMHO.
      • But

        I guess you could make the argument that Apple has never innovated. At least not in the same league as Microsoft, Google, and Samsung.
      • Re: The Mac Pro isn't "innovation"

        Funny you say that.

        It has nothing to do with "what they have" in the previous Mac Pro designs. It is a completely new computer design, and about the only thing it has in common with the old one is that both use Intel CPUs that happen to be called "Xeon" (and we know those don't have much in common either).

        The key difference between the old and the new design is the adoption of the "peripherals out of the case" concept, which was pioneered with the iMac and, before that, the Mac Mini. The MacBook Air and the recent MacBook Pro with Retina display follow the same approach as well.

        Apple more or less introduced the world to a LEGO style of computing, and if this is not innovation, what is? Innovation and invention are different things, you know.
        Now that LEGO style is available on high-end desktops as well.
  • the inflexion point makes apple tilt

    and slide into the trash bin.
    LlNUX Geek
  • Guess I have a different thought when I see the term "mavericks"...

    I usually think of unbranded cattle free-roaming the open grasslands... or a lone dissenter blazing their own trail. That gives it a different meaning than a "popular U.S. surfing spot," and perhaps a more fitting one, since Tim Cook is trying to dig out from under Steve Jobs' shadow.
    • I'm sure they asked themselves

      'What would a maverick do in this situation?' And then, y'know, they did that.
    • Not fond of that name for OS X

      I guess I will need a desktop background of Tina Fey as Sarah Palin winking when it comes out.
      • Oh, just get James Garner!

        From the old television show Maverick! And quit being political about everything!
  • All franchises have to end someday!

    The longest-running movie franchise was not made by Spielberg, though he started the franchise movie industry (at least from the production side) with Jaws in 1975. The biggest box-office movie franchise of all time is the Harry Potter series from Warner Brothers studios ($9 billion in all-time revenues).

    Apple is like Harry Potter. Like Spielberg's franchises, it has to come to an end at some point in time, and the decline of Apple has already started.

    The innovation differential is lessening with every new release from Apple for OS X, iOS and iPhone/iPad hardware. And so Apple will decline too.

    I predict Apple stock at $200-$250 by end of 2014 (Dec 2014). And it will reach $100-$150 by end of 2015.

    By 2014, Apple will be as big as Google or Microsoft in market cap. And it will be as big as Sony by 2017 (market cap of $50 billion or so).
    • Please explain your predictions.

      AAPL earned over $44/share last year. It has over $40/share in cash. Even if its earnings don't grow at all, by the end of 2015 it will have over $50/share in cash. Combined with a continued $44/share in earnings, you are claiming that Apple, as a company, is worth about $2 - $52/share.

      I hope you don't invest your own money, and I certainly hope you don't advise others, as you obviously have no idea what you're talking about.
      • Re-read

        Apple are currently showing earnings of $41.9 per share, which is 4.5% lower than your $44/share. If their earnings continue to drop 4-5% per year, then these predictions are more than likely.

        Apple bears aren't assuming earnings will stay consistent; they are assuming that earnings will drop. There's no reason earnings cannot drop just as quickly as they rose - witness Nokia, BlackBerry, etc.

        Apple are currently trading on a P/E ratio of about 10, which is low compared to both the Dow and NASDAQ averages. This indicates that forward earnings are expected to be lower than those of peers.

        AAPL indeed has strong present earnings, but if it were such a good buy then more people would be buying.
        Kirk Broadhurst
  • I was impressed with the keynote presentation.

    For the first time, I didn't miss having Steve Jobs on stage. It truly was the beginning of a new era at Apple.
  • Yet Another Way To "Multitask" In IOS

    How many ways does this make now, 4 or 5?

    Why don't they stop beating about the bush and just add standard POSIX-style threads and processes, as supported by the Linux kernel and built into Android? Their developers will thank them for it.
    • The reason might be to maximize battery charge duration on a smartphone

      There are all sorts of methods to multitask software. But not all those methods make battery charge duration on a mobile device a priority.
    • Ummm

      They're not Android, they shouldn't just do what Android does if they think they can do it better.
      Michael Alan Goff
    • Re: just add standard POSIX-style threads

      Could it ever occur to you that, by virtue of being a certified UNIX OS, iOS in fact does employ POSIX-style multitasking (which is not only threads)?

      Apple deliberately tuned multitasking the way it was until now so that applications could perform well on the puny CPUs used in mobile devices. As performance and battery efficiency improved, Apple let more and more of that multitasking become accessible to user processes.

      In short, Apple don't have to add anything: they had it from the start.
      • Re: iOS in fact does employ POSIX-style multitasking

        Sort of. Unfortunately, it's built on the Mach microkernel, which means that threads are very expensive to create compared to, say, Linux. So on battery power, iOS would be even more handicapped. This is why Apple won't give developers access to full multitasking like Android does; instead they keep coming up with new ersatz flavours of almost-but-not-quite multitasking.