What Mactel means for Apple

Summary: The right to watch Microsoft Word run more than twice as fast on the other guy's Windows/XP machine isn't going to sell a lot of MacBooks or iMacs.

When Apple first announced its switch to "Intel Inside", most people seemed to rationalise the change on the grounds that users fundamentally don't care what's inside as long as the product meets their needs, that IBM wasn't meeting Apple's supply requirements on either volume or performance, and that Intel would.

In addition, most analysts glossed over the reversion to 32-bit CPUs brought on by the switch to make one or more of the following supportive arguments:


  1. that the switch would lead to faster, cheaper systems;
  2. that the switch would make Apple more "mainstream" and thus lead to wider adoption;
  3. that the switch would lower costs for software developers and thus increase the range of software for the Mac; and/or,
  4. that the loss of brand differentiation consequent to making the Mac a more mainstream product on the hardware side could be offset by re-focusing marketing on software and "user experience."

That was then. Today Apple is shipping an Intel-based iMac and is taking orders on an Intel-based laptop. Unfortunately, what is most obvious about these two new Macs is that they contradict expectations built up around the operation of Moore's law: they more or less match, instead of significantly exceeding, the products they replace, and they cost exactly the same as, instead of less than, their predecessors.

Look at Apple's bill of materials on these products and the reason is obvious: the low-end Core Duo CPU from Intel costs Apple more than twice what it paid IBM for a high-end G5: about $265 for the 1.83GHz Core Duo versus an estimated $104 and $78 respectively for the 2GHz G5 and the 1.67GHz MPC7447A (G4).
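
For the arithmetic-minded, here's that bill-of-materials gap as a sketch you can run. The CPU prices are the estimates above; the annual volume figure is purely hypothetical:

/* A back-of-envelope sketch of the CPU cost gap described above.
   Prices are the article's estimates; the volume is hypothetical. */
#include <stdio.h>

int main(void) {
    const double core_duo = 265.0;  /* 1.83GHz Core Duo                */
    const double g5       = 104.0;  /* 2GHz G5 (estimated)             */
    const double g4       =  78.0;  /* 1.67GHz MPC7447A/G4 (estimated) */

    printf("Extra cost per unit vs G5: $%.0f\n", core_duo - g5);  /* $161 */
    printf("Extra cost per unit vs G4: $%.0f\n", core_duo - g4);  /* $187 */

    /* At a hypothetical one million units a year, that's the hole
       to be filled from price, configuration, or margin: */
    const double units = 1e6;
    printf("Annual gap vs G5: $%.0f million\n", (core_duo - g5) * units / 1e6);
    return 0;
}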

That cost increase on the Intel CPU shows up in the new products in the absence of improved functionality (particularly with respect to the screen, media drives, ports, and power management) and in the absence of price cuts. Indeed, the current iMac and the forthcoming MacBook are the first new Macintosh series ever released without a price/performance advantage, when purchased as complete systems, over their Wintel competitors.

Look at the resulting product weakness from an Apple shareholder perspective and this might not seem to matter, because Apple's profits and growth are being driven by the entertainment products, notably the iPods, the music store, and the forthcoming video service, not the computer business. Unfortunately that's a very shallow perspective, because sales in those areas are ultimately driven by the halo effect cast by the Apple brand, and that's driven by the widespread perception of the Macintosh as a premium product.

It's generally true that Macintosh users genuinely don't care what's under the hood; they just want the product to meet their needs. Those needs, however, aren't limited to running their software and extend, instead, into the realm of self-affirmation.

Fit perceptions of the Mac to Maslow's hierarchy of needs and what you see is that Mac user needs only start with the basic requirement that the machine process their work, and run their applications, at a reasonable rate. Beyond that, it is product differentiation as a premium brand based on better design, better performance, and more comprehensive functionality that drives customer self-affirmation and thus contributes to the community membership, fulfillment, and self actualisation elements at the top of the hierarchy.

Fundamentally it's the Apple brand, not intrinsic value, that sells the stuff that makes the money.

With that in mind it's possible to look at what "Intel Inside" really means for Apple.

First, the reversion to 32-bit CPUs isn't much of an issue for the laptops, the iMac, and the entertainment products. It does, however, knock out the key business lines, the PowerMac and the Xserve, and therefore cripple Apple's drive to maintain its market share in the high-end publishing, photography, and video processing businesses. Fundamentally, what's going on with those lines is that each time Intel announces further delays in getting lower-power, 64-bit CPUs or integrated multi-core processors out the door in volume, Apple's options for this business line narrow and its credibility among the key customer groups driving widespread downstream adoption decreases.
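
To see why 32-bit addressing bites these particular markets, consider the arithmetic: a 32-bit pointer tops out at 4GB, and high-end video work blows through that routinely. A minimal sketch (the video format, frame rate, and duration below are illustrative assumptions, not anyone's benchmark):

/* The 32-bit address ceiling vs a high-end video working set.
   Format and duration below are illustrative assumptions. */
#include <stdio.h>

int main(void) {
    const double GiB = 1024.0 * 1024.0 * 1024.0;

    unsigned long long ceiling = 1ULL << 32;        /* 2^32 bytes */
    printf("32-bit address ceiling: %.0f GiB\n", ceiling / GiB);

    /* One minute of uncompressed 1080p at 24fps, 4 bytes per pixel: */
    unsigned long long frame  = 1920ULL * 1080 * 4;
    unsigned long long minute = frame * 24 * 60;
    printf("One minute of uncompressed 1080p: %.1f GiB\n", minute / GiB);
    /* Prints ~11.1 GiB - nearly three times what a 32-bit process
       can address, so in-memory work at this scale is ruled out. */
    return 0;
}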

Secondly, the promised speed increases simply aren't there. Many reviews have now been done of the Core Duo based products, and the results are virtually unanimous: on applications built specifically for the x86 architecture, the dual-core 2GHz Intel machine is in the range of 10 to 30% faster than the G5 it replaces, while producing significantly less than 50% of the G5's throughput on key user applications, like Photoshop, that have yet to be ported to the x86 world.

Look at an average real-world usage mix and a new iMac is considerably slower than an old one. Users will, of course, understand intellectually why that is, but the right to watch Microsoft Word run more than twice as fast on the other guy's Windows/XP machine isn't going to sell a lot of MacBooks or iMacs.
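
To put numbers on that, here's a back-of-envelope weighted mix. The speed ratios follow the review figures above; the 50/50 native/un-ported split is an assumption about a typical user, not a measurement:

/* Weighted-average speed of the new iMac relative to the old one.
   Ratios follow the review figures quoted above; the workload
   split is an illustrative assumption. */
#include <stdio.h>

int main(void) {
    const double native       = 1.20;  /* mid-range of the 10-30% native gain */
    const double rosetta      = 0.50;  /* upper bound for un-ported apps      */
    const double native_share = 0.50;  /* hypothetical workload split         */

    double overall = native_share * native
                   + (1.0 - native_share) * rosetta;

    printf("Overall speed vs the old iMac: %.0f%%\n", overall * 100.0);
    /* Prints 85% - i.e. this mix runs roughly 15% slower overall. */
    return 0;
}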

Third: the volume isn't there. The proposed MacViiv apparently had to be postponed because the CPU isn't yet available (and may be cancelled outright for performance and cost reasons); delays on the MacBook are traceable to Intel supply problems; and the next-generation "Merom" and "Conroe" chips Apple needs have just been delayed again.

Fourth: the price decreases aren't there. On the contrary, Intel CPUs cost considerably more, and that gap has to be filled from reductions in product quality, in initial configuration, in long-term hardware support, and in Apple's plant-gate margins.

Fifth: the attempt to shift the effort to build brand value from the total hardware/software package to just the software experience has already been derailed.

The problem is simple: because Apple can't drop the open source OS underlying MacOS X, it can't stop people from running the combined product for x86 on non-Apple x86 hardware. For different reasons it won't be able to stop people running Windows/XP on Apple's x86 hardware either. Taken together, these realities turn Apple's hardware into a commodity and force Apple into direct competition with Microsoft exactly where Microsoft is strongest and Apple weakest: on the user eXPerience in its home-grown entertainment applications.

Sixth: there is a potential problem with the trustworthiness of MacOS X on x86 that has yet to appear but could be devastating to Apple. The problem is this: it's a lot harder to exploit a code vulnerability, like a buffer overflow or heap offset, on PPC than on x86. Thus vulnerability announcements for Apple have rarely been accompanied by exploits, and Mac users have generally considered the whole PC security mess a Wintel problem, and thus a key dimension on which their choice is smarter.

Unfortunately that may change with the move to x86: there are lots of people in the security arena who are highly motivated to attack Apple, and the x86 makes the second half of the process, exploiting a vulnerability once found, much easier than it used to be.
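
For readers who want the mechanics, the vulnerability class in question looks like the sketch below. On x86 the CALL instruction puts the return address on the same stack that holds local buffers, so the overrun path to hijacking control flow is well-trodden; on PPC the return address starts life in the link register and is only spilled to the caller's frame, which is part of why working exploits have been rarer there. Deliberately broken code, for illustration only:

/* A classic stack buffer overflow - the vulnerability class
   discussed above. Deliberately unsafe; never write this. */
#include <string.h>

void vulnerable(const char *attacker_input) {
    char buf[16];
    /* No bounds check: input longer than 16 bytes runs past buf and,
       on x86, into the saved frame pointer and return address sitting
       on the same stack. On PPC the return address lives in the link
       register, so the same overrun is harder (not impossible) to
       convert into control of the machine. */
    strcpy(buf, attacker_input);
}

int main(void) {
    vulnerable("short and safe");  /* fits in buf; the danger is longer input */
    return 0;
}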

Bottom line? "Intel Inside" cheapens the brand, weakens the halo effect supporting Apple's highly profitable entertainment products, raises Apple's costs, results in reduced overall performance, and limits Apple's ability to differentiate its products.

So what can be done? Stay tuned, tomorrow's blog is about turning this disaster into opportunity.


Talkback

  • Be careful what you wish for

    Many macfanboys cried for Apple to make the switch to x86. They THOUGHT those same things that you outlined - but mostly they wanted a less expensive Mac. This MIGHT have happened had Apple gone to AMD or VIA for its chips, but Nooooo they HAD to go with the "leader" (read: most expensive). Unlike AMD's pledge to make all of its chips 64bit, InHell created this Duo in 32 bit - isn't that going "against the market"?

    I wonder about Pixar and Disney. Ray tracing activities with huge graphics files REQUIRE 64bit CPUs - and that is where Pixar had an advantage over competitors. I wonder how thrilled those Pixar programmers will be with their new Macs . . .
    Roger Ramjet
    • Roger, question

      Are they going to be thrilled, are they going to commit suicide or are they going to buy/use their own PC or Mac?
      Arnout Groen
    • Intel won because of the Pentium-M...

      It's a great performing chip with low power/heat.

      Yes, AMD would be nice, and there's nothing preventing Apple from using AMD, but AMD has to catch up to the Pentium-M first.

      Desktops are another story. IMHO former-G5 format machines should have AMD Athlon-64 X2's in them, period.
      BitTwiddler
      • Another thing...

        Apple needed a COMPLETE partner in this transition.

        AMD does not make chipsets (which I think is a bad idea as their chipsets have always been good).

        Intel has the whole package, so Intel wins.

        Apple didn't have to re-invent the wheel. Intel had already done it.

        And thank God Apple didn't go with VIA chipsets. I hate those bloody things.
        BitTwiddler
        • Yeah, I have to agree about Via...

          "And thank God Apple didn't go with VIA chipsets. I hate those bloody things."

          I have always thought of Apple/Mac as a rich people's higher-end PC. VIA chips are for a kid's homework junker. >:-I
          Nix_0S_Fan
    • Roger, your post is borderline ....FUD

      First of all, ray tracing has never "required 64-bit CPUs". For the most part, 64-bit systems are just now becoming mainstream, and that does include all those professional graphic artists.

      I know this with ABSOLUTE certainty: My parents' neighbor was an award-winning animator that worked with several studios (Saz). His system, although souped up, was a regular 32 bit system albeit with some pretty expensive graphics cards. The final "print" is done on dedicated ray tracing systems. Now, I don't know for sure about this, but I suspect that those are probably a bit more exotic than plain PC chips in a box. They probably are closer to Crays than Wintels. So again, the bit depth has nothing to do with it. The only thing that matters is the raw computing power - how many MIPS and MFLOPS the darn thing can spit out. Bit depth is not necessarily going to help you there, and in fact, it may slow you down depending on how it is utilized. For example, if your assessment were true, then all the top-of-the-line graphics cards would be 64 bit. But they're not. They're all 32 bit. Why? Because there's only so many colors that we need to generate, and 32 bits of colors is already waayyy more than the human eye can discern anyway. Did you know that a plain old 24 bit card can create 256 shades of grey tones, but that the human eye can only differentiate between about 50-60? So why on earth would anyone want a 64 bit card (maybe cause they're suckers with too much money in their wallet?)?

      With CPUs, the extra 32 bits of a 64 bit system buy you more bandwidth, but ONLY for 32 bit applications. If you now convert all those 1's and 0's from 32 bit to 64 bits, then only half the instructions are passed into the system, and it takes twice as long to execute the code (approximately). Not to mention the fact that eventually, the 64 bit streams must be crunched down into 32 bits along all kinds of other channels (say, USB, or IDE, or Firewire, or your graphics card and system memory). So... to make a long story short, it appears that you have either (1) bought into the 64bit hype or (2) are selling the 64bit hype or (3) bought it, chewed on it, and are now selling it.

      The reason Apple went with Intel was the reason they switched in the first place: Power consumption. Intel has that problem solved a heck of a lot better than AMD has. Just look at laptop battery lives. Compare similarly rated systems with Intel chips vs. AMD chips. The Intels run at LEAST twice as long. And that's not because the battery is any bigger. They simply consume a fraction of the power that other systems consume, and yes, that includes chips from IBM and from AMD.
      rock06r
      • So is yours

        "Power consumption. Intel has that problem solved a heck of a lot better than AMD has. Just look at laptop battery lives. Compare similarly rated systems with Intel chips vs. AMD chips. The Intels run at LEAST twice as long. And that's not because the battery is any bigger. They simply consume a fraction of the power that other systems consume, and yes, that includes chips from IBM and from AMD."

        Bzzt, wrong. Head over to Anandtech and see the AMD vs Intel laptop tests. You'll see that with the latest generation of AMD mobile CPUs it's more like 10 minutes longer for Intel.

        AMD has really turned up (or down) the heat in this race in the past year or so.
        ITGuy04
      • Bit width applies to RAM access, not graphics

        32 bits vs 36 vs 64 is an issue about memory access - word width for the address, i.e. 32 bits gives at most 2^32 - 1 (or 2^32, counting address zero). Roger's point therefore is that you need 64-bit RAM addressing to do ray tracing fast - because you do it in main memory.

        This has nothing to do with graphics...although Compaq and a few others did, some years ago, try to advertise their PCs as 128-bit machines because their graphics controllers allowed parallel access on four 32-bit paths. Theirs was a marketing lie, yours a mistake.

        The fastest commercial ray tracing software outside the supercomputer world, by the way, uses a Sun 890 "workstation" ($200k!).
        murph_z
        • 64 bit

          Certain animation studios USED to use (64 bit) SGI machines to do the design and raytracing. Then they used 32-bit Linux boxes to do the raytracing and 64bit Macs to do the design. NOW there ARE no 64bit Macs - and using 32 bits for design means that your graphics files are limited to 2GB-4GB. This is not an issue for simple animation, but when you want to add more and more data (for special effects), then those files GROW.
          Roger Ramjet
    • Who are these macfanboys of which you speak?

      ---Many macfanboys cried for Apple to make the switch to
      x86---

      Where'd you get that idea? I can't think of a single one. In fact, I
      remember most ZDNet Mac Users totally discounting the idea
      when the first rumors came out. I do remember lots and lots of
      Windows fanboys begging Apple to release an x86 version of
      OSX, and I do remember seeing a lot of frustration with IBM over
      their lack of progress with chips, but I don't remember ever
      seeing much crying for a switch to Intel.
      tic swayback
  • Don't understand the doom & gloom of this article

    This article seems to paint a fairly bleak picture for Apple if it
    were all true. But the missing point is that the main reason for
    the switch was not cost but Intel's roadmap over IBM's. If you
    listen to last year's Mac expo you will hear "performance per
    watt" bandied about a lot, as this was a key factor in making the
    transition. You may also remember that one of the most
    complex apps, Mathematica, was shown running native on an
    Intel Mac. From what I remember this needed 64-bit mode.
    Don't know how it's done but I remember reading it.

    The 32/64 debate is an interesting one, as no one has really
    shown any real world benefits. That is why MS have been so slow
    to release a 64-bit version of Windows to the general public. It
    just doesn't provide that much benefit.

    One last point: if anything, the transition to Intel processors has
    raised the profile of Apple in the public's consciousness. People
    don't really care what is in their machines, but they associate
    Intel with quality. Apple is about quality over price. The article's
    focus is based too much on the price of things rather than the
    quality of things.

    Of course Windows fan boys will defend their platform and Mac
    fan boys will do the same, but ask the general public a couple of
    years back what they think of a Mac and they will say things like
    "expensive, quality, incompatible, robust". Ask this year and you
    might find that the perception is changing, due to the Intel
    transition, Office for Mac and the Mac mini, to "not too expensive,
    quality, compatible, robust".
    Ask an average Windows user what they think of Windows/Dell
    and they will probably say "cheap, viruses, spyware, monopoly,
    plug and play and play and play".
    johnadurcan
    • Re: Don't understand the doom & gloom of this article

      It's simple: these pieces attract readers, and secondly, help the
      anti-Apple companies spread FUD about Apple.
      YinToYourYang-22527499
  • Rotten Apple Cores

    Everyone is allowed to screw up upon occasion; it keeps things real.

    They were in a bad place with Motorola, not buying enough product to really warrant the continued business, and Motorola would have taken huge heat for summarily dropping them, so Apple did the right thing by quitting before they were fired.

    (Plus, Mr. Jobs was probably too busy buying Disney to concentrate on the implications of going to lesser hardware.)

    In a year or three, the x64 chips will be out in force, and things will perk up again. Apple will be able to get through this rough patch just as did MicroSloth when they released ME because they "had to have something out there".

    It's simply a case of inertia. Apple can't just take a sabbatical and not put products out until the x64 chips are ready. Yes, they're putting out a sub-standard product now, but the halo is still there, and we will have faith that they'll do the right thing as soon as they practically can.

    Best guess is they won't lose too much market share to this, as the folks that used them in the high-end areas already have equipment up and running, and with PC-based boxes not able to compete either, there will really be no reason for that market to upgrade. We're going to see plenty of sales to converts & folks getting their first 'puter. Once they're rolling with the x64 chips, everyone will forget this little episode of theirs, and their perceived slot as the holder of the think-different, high-end machine will return.
    _jman
    • After all that all I can say is...

      Huh?
      BitTwiddler
  • Wow!

    What a load of, er, nonsense. I can't believe the hooey coming
    out of the trade press.

    Apple knows what they're doing, and people really don't care
    about what's inside. They were concerned with the fact that IBM
    couldn't build a G5 chip for laptops!

    What part of that don't you understand? IBM could give them G5
    chips out the wazoo and a $10 bill to Apple for each processor
    they offload on them, and they still would have slow, slow, slow
    laptops!

    OS X wasn't fully 64 bit anyway. Merom chips later this year ARE
    64 bit. Look how Apple upgraded the processors of the MacBook
    Pro before they even started shipping! You think Apple won't be
    ready to add better faster chips as they come out of Intel?

    Oh, and "Intel Inside" is a marketing program that Apple chose
    not to participate in. It's not a chip that goes in a computer.
    Copy editors napping?
    ewelch
    • Intel doesn't even participate in "Intel Inside" any more

      The program has been dropped.
      BitTwiddler
      • Are you sure?

        re "Intel doesn't even participate in "Intel Inside" any more"

        I saw an ad for computers (I think it was Dell) on TV last night and it closed with the Intel "chime" (slightly revised) along with the new Intel branding logo.
        MacCanuck
    • 1 PPC cycle =~ 1.75 Xeon cycles, 3.4 P4 cycles

      So 1 G5 at 2.1GHz = 3.6GHz Xeon or a dual core 3.2GHz P4. (See various previous blogs)

      oh, and the 970FX runs 13W at 1.65GHz...

      So why did Apple drop IBM? That's in tomorrow's blog.
      murph_z
      • Netburst

        The Xeon and P4 you're talking about are NetBurst cores, which Intel is discontinuing and which Apple has not said it will use. More than likely Apple will stick to the newer cores.
        doh123
  • It's just a start Paul, Jeez...

    Lighten up.
    BitTwiddler