Why "good enough" simply isn't with laptops

Why "good enough" simply isn't with laptops

This weekend The New York Times published its annual catalog of the Year in Ideas. One of them, Good Enough is the New Great, is a concept derived from a story in the August issue of Wired (The Good Enough Revolution), which noted that some of the most successful gadgets and applications of late are a triumph of mediocre technology over the latest and greatest.

The Flip's success stunned the industry, but it shouldn't have. It's just the latest triumph of what might be called Good Enough tech. Cheap, fast, simple tools are suddenly everywhere. We get our breaking news from blogs, we make spotty long-distance calls on Skype, we watch video on small computer screens rather than TVs, and more and more of us are carrying around dinky, low-power netbook computers that are just good enough to meet our surfing and emailing needs. The low end has never been riding higher.

Lately I've been thinking a lot about this concept of good-enough computing. The success of netbooks in 2009 seems like an obvious example of good over great, but I'm not convinced that's what the netbook phenomenon is really about. I think it has a lot more to do with demand for highly mobile computing at an affordable price. No sooner had netbooks hit the big time than chipmakers began trying to address performance shortcomings such as the inability to play HD video. AMD's ultra-thin platform (formerly known as Congo), Nvidia's Ion chipset and Intel's upcoming Pine Trail platform are all designed to boost the performance of netbooks.

In general, I think there's still room for significant improvement in laptop performance and battery life. It's true that the typical $600 mainstream laptop on the shelf at Best Buy can handle most tasks. And there's more choice than ever in terms of size and weight, price, and performance. But if you think about it, we're still far from having it all in one laptop. Netbooks and thin-and-lights based on ultra-low-voltage chips are highly portable and have excellent battery life, at the expense of performance. Budget and mainstream laptops are priced right and have decent performance, but they are too bulky and their battery life is poor. If you really want the best performance, you can choose a notebook with an Intel Core i7 quad-core (Clarksfield) processor, but these are generally available only in expensive 17-inch desktop replacements that are marginally portable and designed largely for gamers. (Yes, Dell's 15-inch Alienware M15x also comes with Core i7, but it weighs in at more than 10 pounds.)

HP deserves credit for trying to build a laptop that has it all, but the Envy 15 illustrates just how hard this is to do with current technology. The Envy 15 is reasonably portable, measuring one inch thick and weighing 5.4 pounds, and it has a Core i7 quad-core processor and a 1920x1080 15.6-inch display. But all that comes at a price. The Envy 15 starts at $1,800 with a 1.60GHz Core i7-720QM, 6GB of memory, ATI Mobility Radeon HD 4830 graphics and a 500GB hard drive. It is also one of the few 15.6-inch laptops you'll find that doesn't include an internal optical drive, a concession to size and weight. And while performance was extremely good on CNET's tests, battery life was not. Numerous other reviews noted that the Envy 15 runs so hot that it is actually "uncomfortable to use."

The near-term roadmap doesn't offer much hope for closing this gap. In early 2010, Intel will release its first 32nm Westmere processors, including Arrandale for laptops and Clarkdale for desktops, but these will be designed for mainstream laptops, where the bulk of the sales are. In addition, these processors will include Intel's integrated graphics on the same chip for the first time. This will simplify system design, and should help lower prices, but unless Intel has made huge strides in the performance of its integrated graphics, these Arrandale-based laptops won't satisfy power-users. Of course, the 32nm chips can also be paired with discrete graphics, and later in the year we should get faster Westmere chips that come a bit closer to Clarksfield. AMD will release its first laptop platform (Danube) with a quad-core processor in the first half of next year, but like Clarksfield, AMD's 45nm chip is likely to be too big and hot for anything but desktop replacements. Meanwhile AMD's next ultra-thin platform, Geneva, and the Fusion chip with an integrated GPU that arrives in 2011, are targeted at the mainstream, and not the performance segment. The bottom line: Don't look for a thin 13.3-inch laptop that offers Core i7-level performance and solid battery life anytime soon.

Last week, I attended a semiconductor conference where chipmakers discussed their latest technology. There was a lot of talk at the show about a coming slowdown in the pace of innovation. Each new generation of process technology is tougher and more costly than the last, the argument goes, so does it really make sense to stay on this treadmill? But Intel execs--who noted (repeatedly) that the company has already shipped more than 200 million processors using high-k and metal gate technology while the rest of the industry is still figuring it out, and is already manufacturing 32nm chips in two factories--said they have no plans to let up. That's good news because today's laptop technology isn't anywhere near good enough.

Topics: Hardware, CXO, Laptops, Mobility, Processors, IT Employment

Talkback

  • Good enough vs one size fits all (or everything)

    There is no such thing as "perfect". If you are going around town shopping, an econobox is fine. If you have to move, you need a truck.

    There will always be screen and keyboard trade-offs with small form factors. There will always be a power vs weight/battery life trade-off. It does not matter how powerful and efficient you make the chips. If you install less powerful/smaller ones, you can reduce battery size and weight OR increase battery life.

    These are some of the fundamental "laws" of physics. Pretending otherwise and waiting for perfection is folly.

    Hence, good enough is just that. Anything more is overkill for the job.
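
    As a rough, back-of-the-envelope illustration of that capacity vs weight vs runtime trade-off (all figures below are hypothetical examples, not measurements of any real machine):

        # Rough battery-life arithmetic: runtime (hours) = capacity (Wh) / average draw (W).
        # Every number here is a made-up example, not a measurement.

        def runtime_hours(battery_wh, avg_draw_w):
            """Estimated runtime for a given battery capacity and average power draw."""
            return battery_wh / avg_draw_w

        # Netbook-class machine: small, light battery paired with a low-power chip.
        print(runtime_hours(48, 8))    # ~6 hours

        # Quad-core desktop replacement: bigger, heavier battery, far hungrier platform.
        print(runtime_hours(85, 35))   # ~2.4 hours

    Cutting the power draw buys you runtime or lets you shrink the battery; either way, you can't dodge the arithmetic.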
    Economister
    • Good enough

      Just as there will never be the 'perfect' car for speed, hauling, comfort, fuel economy, etc., there will never be the 'perfect' computer.
      RCM_z
  • Depends.

    I don't think it's so much that "good enough" is taking over. I think it's that "good enough" is finally becoming reasonably priced.

    Technology has been able to effectively ignore inflation - the price of many electronics falls much faster than inflation.

    So instead of having the normal effect of causing prices to increase - inflation in the tech market effectively increases the purchasing power of the consumer.

    Many people used to balk at getting a $300 PDA. That was out of reach for much of the middle class in the early '90s. But today, a $300 netbook is considered a perfectly reasonable purchase for a good portion of the middle class.
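
    As a rough sketch of that purchasing-power point (the 3% inflation rate and the years are my own assumptions, not sourced figures):

        # Compound a 1993 sticker price forward to 2009 dollars at an assumed ~3% annual
        # inflation rate. Purely illustrative arithmetic, not official CPI data.

        def inflate(price, years, annual_rate=0.03):
            """Value of a historical price in today's dollars at a constant inflation rate."""
            return price * (1 + annual_rate) ** years

        print(round(inflate(300, 2009 - 1993)))  # a $300 PDA in 1993 ~= $480 today

    So the same nominal $300 buys a machine that is both cheaper in real terms and vastly more capable.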

    Even with this recession, most people still have a good amount of purchasing power for electronics.

    The "good enough" IMHO comes largely from an increasing number of people where the technology is secondary, and not central to their lives.

    I do NOT think the demand for higher-end machines is going down any more than what can be blamed on the recession.

    I think that the number of power users is the same, it's just that the number of non-power users is increasing.

    I think there will always be a demand for more high-end stuff from the power users.
    CobraA1
    • Yup

      Me being one of them.
      The one and only, Cylon Centurion
    • Good enough or commoditization?

      You make some really good points.

      What the author is really talking about is that laptop computers have become commodities, just as desktop computers did in the 1990s.

      Commodity pricing makes entry-level systems accessible to the middle class.

      This drives prices down across the board - and CAN cause innovation to suffer because profit margins are too small. Still there will always be people willing to pay a premium for something more. (Be it an Apple Macintosh or a Dell with a Core i7 processor.)
      M Wagner
  • Always compromise with laptops

    Laptops by their nature are NEVER going to satisfy everybody.

    While all want portability, everyone differs in what they want to LOSE to get it.

    It is weight vs battery life/power consumption vs computational power vs keyboard quality vs display size vs display resolution VS PRICE!
    There are just too many variables to be anywhere near one size fits all.

    With desktops/towers, it is mainly computational power vs expandability vs PRICE.

    Netbooks revolutionised the compact form factor by making it affordable, but mainly by accepting low-resolution displays, spongy keyboards and no optical drive, though the latter is not such an issue given the prevalence of wireless home networks.

    When hunting for a laptop a few years ago, I was looking for one light enough that I wouldn't have to think twice about taking it with me just in case (since most contracts provided me with a desktop). The desktop replacements were just too much like hard work to lug around.

    I went for a 1.2kg Fujitsu P7010 at ~AU$3000 with a bright 1280x768 screen and a DVD burner, but only 1GB RAM and a 1.2GHz, ULV, single-core CPU. Pretty much a modern netbook with an optical drive and a better screen. It worked well for years (until it finally died after an attempt to strengthen the plastic around the hinges - a typical laptop failure point). I could still edit large documents and spreadsheets and use Access without much difficulty, so I would expect most netbooks to do the same.

    I replaced it with a Sony P Series with an 8-inch, 1600x768 bright screen and 600g weight, because I wanted something even more compact and lighter, to make it a no-brainer to always take it to work. It will even run Win 7 with Aero Glass and open and navigate the 5,200 and 4,600 page Word documents I have written. I wanted the high resolution so I could use it (with glasses, of course) if needed without having to pan the display for web sites or documents.
    Patanjali
    • agreed

      My eyes can't take the 10.1-inch screen.
      ThinkFairer8
  • Netbooks are not that underpowered.

    My Aspire One beats my DESKTOP in specs! My desktop, which I use most often, has 512MB of RAM, a 1.5GHz P4, and an 80GB HDD. My netbook has twice the RAM, a slightly (but not hugely) faster CPU, a 160GB HDD, and you know what, both run Linux (not stripped down, full CentOS Linux 5.4 - yes, I am the type of crazy bastid who puts an enterprise distro on a netbook) at a VERY satisfying speed. More power is great, and these days often necessary, but it does what I want, which is web, music, graphics, and software design. The day it is no longer able to do what I want is the day it is obsolete, and the day I upgrade my desktop's RAM (and maybe get a new CPU). By then my netbook will be a vintage machine I will likely keep for coding.
    Subsentient
    • They say Atom performance doesn't reflect clock speed

      I like having a 1.6 GHz processor in my netbook, but the figure is misleading. I'm told it's about the same as a desktop processor of half the speed in GHz. The low-power processor takes more ticks to do the same work. Now if you're using a proper benchmark tool, I accept that. Anyway, are you using speech recognition with all that power? I bet you aren't. But why not? Why isn't everybody?
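
      A minimal sketch of why raw GHz is a poor yardstick across architectures (a toy timing loop, not a proper benchmark; the clock figure is just an example):

          # Toy work-per-clock comparison: time a fixed CPU-bound loop, then divide the
          # throughput by the clock speed. The clock speed is a placeholder, and a real
          # benchmark suite would be far more representative than this.
          import time

          def ops_per_second(n=5000000):
              start = time.perf_counter()
              total = 0
              for i in range(n):
                  total += i * i
              return n / (time.perf_counter() - start)

          clock_ghz = 1.6  # e.g. an Atom netbook's nominal clock
          print(ops_per_second() / (clock_ghz * 1e9))  # rough "work per cycle" figure

      Run the same loop on a netbook and on a desktop and the per-cycle numbers, not the GHz, tell you how the two actually compare.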
      Robert Carnegie 2009
      • Why Bother?

        Unless you are writing a novel, it's far easier for most people to just plod along typing 20-50 wpm for 1-5 sentence emails vs dictating, re-reading and editing/correcting any mistakes before sending.

        Doctors, lawyers and the like can use one of the many specialized dictation services for a higher level of accuracy (especially with a density of technical/Latin/Greek terms).

        While it might be a useful benchmark, I doubt netbook makers are trying to capture the dictation/speech-to-text market - it's too narrow.
        Gritztastic
        • Speech because human hands aren't designed to operate keyboards all day.

          I found that out the hard way. Presently I'm using a stylus, and nifty software called Fitaly.

          Also because a keyboard limits how you use your device, and also its size and shape - if you want a keyboard large enough to use comfortably. Alternatively, you can choose a device whose keyboard is NOT large enough to use comfortably, if a keyboard is not your main interface.
          Robert Carnegie 2009
    • Old desktop . . .

      That looks like an old desktop . . .

      I'm running a Core 2 Quad system at 2.4 GHz, which is probably a bit closer to the average spec for most computers.

      . . . although most notebooks are still the Duo. But better to be a bit overpowered than underpowered, I guess.

      The netbook's architecture is more reflective of older Pentiums, so comparing clock speeds isn't really all that good of a measure.

      . . . although current Atoms have hyperthreading, which makes up for some of that.

      "both run linux"

      That's not really saying much. Linux isn't really known for pushing anything to its limits. It tends to be a bit more conservative when it comes to adding new features that take up more memory or CPU.

      "but it does what I want, which is web, music, graphics, and software design."

      You must not do very much in the way of heavy graphics. Graphics is one of those things where having several gigs of memory really helps. A setup like that might be fine for occasional touch ups on family photos, but I wouldn't try professional graphics design on it.

      "and software design"

      Well, that's nice, but I personally prefer that my software and hardware setup reflect the software and hardware of my user base.

      So I take it graphics and coding are more like hobbies than really serious work. I personally wouldn't like that setup for graphics and coding.
      CobraA1
      • Old desktop that works

        I use my netbook for DJ'ing. Plenty of power for that. I have another that I use for software development. I love it for that, because I know that the reference is at the absolute bottom end of what my client will be using. So if it runs well on the netbook, it will never be an issue on the desktop. If the form layout is un-cluttered and usable on the netbook, it will look great and be usable on the desktop.

        Would I like to code on my i7 that I also have? Sure, but that would not be remotely representative of the end result.
        trent1
    • Correction to your post

      First, your desktop and laptop aren't underpowered - for YOUR needs. Hence the point of the above posts - no one's needs are the same, and in my applications, your hardware would be so woefully underpowered I'd want to scream and pull my hair out. But it's 'good enough' for you, so kudos!

      Second, by the time you consider your desktop obsolete, the price of getting that legacy RAM and CPU is going to be astronomical. Does it even run DDR RAM? Because I tried to get new DDR RAM for my old build, and it turned out that DDR2/3 was actually cheaper at this point, simply because there are so few remaining DDR chips for sale. And never mind getting a CPU for a legacy chipset. My Socket 939 2.2GHz dual-core AMD is selling for more now on eBay than brand new, much faster chips. There is a point where tech becomes so old it's vintage, and it does appreciate in price - within reason. The result of this effect is that it is, in fact, cheaper for me to get a brand new mobo/RAM/CPU than it is to upgrade my old CPU/RAM. I imagine, considering the age of your hardware, that you are in the same boat as I am.

      "The views expressed here are mine and do not reflect the official opinion of my employer or the organization through which the Internet was accessed."
      gnesterenko
    • Aspire sounds underpowered to me.

      Your Aspire runs Linux, which also means it doesn't run Windows or Mac applications. Most users are not programmers, and want sophisticated GUIs that hide the computer's complexity. They don't want to run apps from the command line.
      gypkap@...
      • Linux and CLI (Command Line Interface)

        A GNU/Linux distro CAN be entirely CLI-based, OR it can offer a more powerful GUI than either Windows or Mac.

        I have been running Ubuntu since version 6.06 (June 2006), and the only times I've had to go into the CLI were when my wireless card wasn't given proper drivers by the manufacturer, and when I set up NFS shares on my network. A network that actually works, as opposed to Windows, which doesn't. I once had three computers with Windows side by side when setting up shares between them. I followed the Microsoft manual to a T on each in turn. When I was done, two of the computers could share ONE folder with each other; the third couldn't even SEE the network. When I wanted to share folders in Ubuntu, I right-clicked the folder, clicked Share..., chose a share name and connected to it from 4 computers. Linux computers, mind you. Windows, even though I used SMB/CIFS, Windows' own file-/folder-sharing protocol, could still not see it. Since Windows can't recognise its own protocol, and NFS works better for GNU/Linux, I've chosen to use the CLI to install it.

        For me, working with Windows computers is some of the worst work I do, no matter the version. Especially Vista. I kinda like 7, actually.

        Point is, Linux can be as user-friendly/GUI-fied as you like (Ubuntu and Linux Mint come to mind), CLI-driven (Gentoo, Slackware), anything in between, or even built completely from scratch: Linux From Scratch (LFS), meaning you design your system EXACTLY how you want it from the bottom up.

        Think of Linux as Legos. You can start with just the basic blocks (LFS), you can start with some preassembled blocks (CLI-driven) or custom blocks (GUI). With Windows you get GI Joe for your Lego-set, and with Mac you get Barbie. (Now, no flames please. GI Joe and Barbie are also customisable, but not to the same degree that Legos is. We can agree on that, right? You can't change Barbie into Godzilla for example. Barbie will be Barbie no matter what. Legos can be anything you can imagine.)

        And lastly, Linux can run Windows applications. WINE is one program that lets you do that; there are others too. I haven't heard of a program that lets you run Mac applications on Linux, but that is quite possibly because I've never had a Mac and therefore have no programs I wish to run in Linux, and as a result haven't looked for one. Not to say that the world of Free Open Source Software hasn't got native applications of its own that are just as good as, and sometimes better than, their Windows/Mac counterparts. Of course, some areas are less developed than others, so there are some areas where Windows/Mac are better than Linux.

        Windows IS best at games. Linux is best for multimedia, as shown by the prevalence of Linux computers in music and film studios, and the software in many HTPCs. (Approximately 90% of Hollywood runs Linux.) As I've never had a Mac, I don't know what it's best at, but it MUST be something to defend the high prices. Oh, wait, I got it... Everything for Mac/from Apple "just works" with other Apple products!
        OttifantSir
  • Not a new concept

    This has been applied to software for the last 20 years. Beta testing gets squeezed out because price demands have to be met. Satisficing is the result: it does not have to be perfect, it just has to be good enough. As we move closer to perfection, the cost grows exponentially.
    happyharry_z
  • Moore's Law, and God's Kindle

    I think Moore's Law originally described advances in integrated circuitry but subsequently drove it. If your product wasn't improving at the Moore's Law rate then you had to spend more on research and development until it was. And we have people now doing things with atoms and electrons that used to be the stuff of science fiction. Which is what science fiction is for, to anticipate things. Except for Daystrom computers in Star Trek that use base three arithmetic, those are just dumb. (Then again, look what happens in Star Trek when computers get smart. Sometimes they only sexually harass the captain in the workplace, sometimes they're The Borg - I digress.)

    As for laptops being inadequate, what the heck do you want? A touchscreen Kindle with speech recognition and HD video? One for me too, please. Meanwhile, your laptop probably has speech rec already - unless it's an XP netbook, in which case you can legitimately obtain Speech SDK 5.1 redistributable inside CoolInfo 1.1, which after free registration declares that you don't have dictation function, but as far as I can see you do, from the SDK itself. (Which you aren't allowed to install on its own except for actual development, just as you aren't allowed to install a Palm or Windows CE development emulator merely to run apps.)
    Robert Carnegie 2009
    • Sci-Fi = "anticipating"? Not quite...

      "Which is what science fiction is for, to anticipate things."

      I believe sci-fi is for envisioning new products and processes and (for the great unlaid masses) entertaining. Sci-fi asks the age-old question, "Wouldn't it be cool if...?" Perhaps that's "anticipating" but "anticipating" sounds more like awaiting something that's known to be coming out any day soon.
      chas_2
      • Because it bugs me when people use "science fiction" for "impossible".

        You know when people say "Colonising the moon... that's science fiction." Well, bad example there. Maybe "Extracting oil in Antarctica... that's science fiction." It may even be science fiction, but that does not mean "ain't gonna happen". Come to think of it, it's the negative use that bugs me more. "Ten years ago we thought the Internet was science fiction." Carefully designed fake quote, because hey - ten years ago Al Gore was about to run for President and most major news services had web site editions. (And Al Gore has what to do with the internet? Look it up.) Okay. SIXTY years ago the Internet WAS science fiction. http://en.wikipedia.org/wiki/A_Logic_Named_Joe Voila. And yes. Here we are, online. It was science fiction AND IT HAPPENED. ...uh, not exactly like in the story. Mostly. But many of our Internet anxieties are reproduced in that particular, whimsical, foresighted tale.
        Robert Carnegie 2009