Vista "bloated"? Not exactly...

Summary: One popular tech website says that Windows Vista gobbles up 800MB of RAM just to get started. They're wrong. They're also missing the larger point.

TOPICS: Hardware

Sometimes I think the world would be a better place if people had to pass a test before they could hang out their shingle as a "tech journalist."

Today's case in point comes from The Inquirer, which reported yesterday that "even while idling, Vista eats as much as 800Mb of system memory." The evidence? A screenshot showing the Windows Vista Task Manager with 820MB of memory in use.

Well, without knowing which processes were actually running on the machine in question, I couldn't pass judgment on what was going on with that system. I do know that on a test system here in my office, a computer running Windows Vista Business uses less than 460MB of RAM when it first starts up. Another test computer with 768MB of RAM installed (that's less than the alarming 800MB minimum suggested in the aforementioned article, please note) has had no trouble handling any task I throw at it.

Of course, it's way too early to even be asking, much less answering, questions of performance with Windows Vista, which hasn't even reached the Beta 2 milestone. Its release date is more than six months away. Trying to benchmark an early beta is silly, because performance tuning is relatively low on the list of developers' priorities, and there's still debug code in each build, which adds to the memory footprint and won't be in the final release.

But this article misses a much bigger point. If you have a gigabyte of RAM on your computer, how much should be in use at any given time? The correct answer is "as much as possible." RAM is many times faster than disk space. The whole point of the operating system is to manage that memory and to swap program files and data in and out of memory as smoothly as possible. By aggressively using existing RAM to cache and page chunks of data, the OS can make maximum use of those resources. After you've been using a computer for a few hours, it should settle in at a comfortable working set that is lower than the amount of physical RAM installed in the box. If you continually find yourself exceeding that threshold and waiting while the OS swaps out to disk, it's probably time to throw some extra RAM in the box. But there's no evidence that's happening to the chap who sent in that screenshot.
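The caching behavior described above can be illustrated with a toy model: the OS keeps recently touched pages resident in RAM and only evicts (swaps out to disk) when physical memory is actually exhausted. This is a simplified sketch in Python, not how Windows's memory manager is actually implemented; the LRU eviction policy and page bookkeeping are assumptions for illustration only.

```python
from collections import OrderedDict

class ToyPageCache:
    """Toy model of an OS page cache: pages stay resident in RAM
    until physical memory is full, then the least-recently-used
    page is evicted (i.e. swapped out to disk)."""

    def __init__(self, physical_pages):
        self.capacity = physical_pages
        self.resident = OrderedDict()  # page -> data, in LRU order
        self.swapped_out = 0           # count of evictions ("disk writes")

    def touch(self, page):
        if page in self.resident:
            self.resident.move_to_end(page)  # fast path: already in RAM
            return "hit"
        if len(self.resident) >= self.capacity:
            self.resident.popitem(last=False)  # evict LRU page to disk
            self.swapped_out += 1
        self.resident[page] = object()         # fault the page in
        return "miss"

cache = ToyPageCache(physical_pages=3)
for p in ["a", "b", "c", "a", "d"]:   # touching "d" forces one eviction
    cache.touch(p)

print(len(cache.resident))   # 3 -- RAM stays as full as possible
print(cache.swapped_out)     # 1 -- swapping only begins under pressure
```

The point of the sketch: "memory in use" stays pinned near capacity by design, and the number to watch is the eviction (swap) count, not the resident count.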

I don't have to go too far out on a limb to predict, with confidence, that Windows Vista will run very well on any new system with 1GB of RAM installed and will perform acceptably on the current generation of computers with 512MB of RAM. Is that an unacceptable burden? Everything's relative, I suppose. I remember the howls of anguish in 1995 over the fact that Windows NT required 8MB of RAM just to start up, and 64MB to run acceptably. Is there any modern operating system that will perform acceptably on that platform today?

Log in or register to join the discussion
  • and Microsoft is accused of FUD...

    The open source community needs to get a grip on reality as do many "sensational" journalists. When Microsoft says anything against anyone (especially linux) then MS is the evil empire that must be destroyed at all costs.

    But when linux/open source advocates say anything negative against MS, it's par for the course and MS should take it lying down.

    This reminds me of the days when the linux crowd used to accuse MS of never innovating. They would say that MS just bought out other companies for their technologies and MS never made anything of its own. And to think linux is a *nix clone...

    • Excellent point

      John Zern
    • Get real


      Linux Is Not UniX

      • As usual

        As usual someone has to miss the point or be totally wrong.

        Linux IS a Unix clone. Just ask Linus the original author. He has stated it numerous times.
    • just because MS is the King Of FUD ...

      ... doesn't mean others can't try their hand at it.

      How can you think of suggesting Vista isn't bloated when it
      requires over ten times the RAM that server software
      demanded a decade ago? Sure, things have changed. OSX -
      Tiger, where many of the innovations originally planned for Vista
      (nee Longhorn) were introduced - demands 256 MB of RAM
      while OS 9 got by very well on ... was it 64 or 32? Panther -
      SOoo Last Year - only required 128 MB, so Tiger MUST be
      'bloated'. Bloated is as bloated does: if you can run your
      software and still have a bit of RAM to spare then you can often
      ignore bloat ... if NOT, then, THEN, it's a problem.

      Mac types have just had a run-in with journalists (and forum
      pundits) of the 'ignorant' and/or 'sensationalist' stripe. They got
      all sorts of wonky over the First OSX Virus. That wasn't a virus.
      That didn't attack OSX. They also got wonky about someone able
      to escalate their user account to root status in half an hour ...
      touted it as 'Breaking In ... OSX Hacked'. The story, of course,
      was quite different and a less idiotic 'test' told quite a different story.

      MS is guilty of FUD. Get over it. Others are sometimes guilty of
      FUD. What else is new? MS is not big on innovation. Get over it.
      They take ideas from others and recycle them to their own taste
      and needs. This is hardly news. Apple, amazingly innovative
      when compared to MS, takes ideas from others and recycles them
      to their own style and needs - and often improves them
      considerably in the process. They are still other people's ideas.
      The Linux folk are not serious innovators either ... they take
      other people's ideas and improve them, dump the garbage, fix
      them, tweak them and generally make them what they should
      have been in the first place ... generally bloat-free. Linux is
      really about not paying folks like MS bloated fees for bloated
      and buggy software, instead paying modest (or no) fees
      for leaner, cleaner code.

      Just my nickel's worth.
      • Improve on your legal risks

        Don't you get it? Unless you have a big budget, you cannot take others' ideas and improve on them, because you might get hit with a big copyright or patent lawsuit. You need deep pockets for the lawyers, and when did Linux ever have deep pockets?

        Linux has to raw innovate, without the benefit of prior art.
    • King of FUD

      What about "Linux is a cancer", "Linux is communism", I wonder who said that, and are these objective statements, or plain old emotional FUD?
  • Message has been deleted.

    Valis Keogh
    • Can you even read?

      The estimates are 512MB will run Vista just fine, so where you pull this 2gig crap from is anyone's guess.
      Second, if on startup, Vista allocates all available RAM, then hands it out to apps as it is needed, it will work perfectly fine as well. It is called memory management, not bloat.
      Ah well, never let the facts get in the way!

      The mem usage display in Task Manager has been the source of confusion for years and years. The mem usage displayed is not actually what is being used!!! But rather what has been *requested* and given to an application... the virtual working set. If you load up OpenOffice for the FIRST time on a Windows machine with 256MB you might see it take up 6MB in Task Manager, while if you have 1GB of RAM you might see it taking up 40MB. It doesn't mean ANYTHING... it's EMPTY SPACE being "reserved." If Windows has it, it gives it to the app.

      On a 2GB RAM machine, apps "*SEE*" a big 5GB, yes 5GB (or more), of RAM playland all for *themselves*... because Windows tells them so. It's all virtual and windows manages TRUE RAM usage behind the scenes.
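The reserve-vs-commit distinction this comment describes can be sketched as a small model: an application reserves a large virtual range up front, but physical pages are only charged when they are actually touched. This is a toy illustration in Python; the 4 KB page size and the method names are my own inventions, not the Win32 VirtualAlloc interface.

```python
PAGE = 4096  # assumed page size for this sketch

class ToyVirtualMemory:
    """Toy model of reserve-vs-commit: reserving address space is
    free; physical RAM is consumed only when a page is first touched."""

    def __init__(self):
        self.reserved_bytes = 0
        self.committed_pages = set()

    def reserve(self, nbytes):
        # Hand out virtual address space -- costs no physical RAM.
        self.reserved_bytes += nbytes

    def write(self, page_index):
        # First write to a page faults it in and charges real RAM.
        self.committed_pages.add(page_index)

    @property
    def physical_bytes(self):
        return len(self.committed_pages) * PAGE

vm = ToyVirtualMemory()
vm.reserve(5 * 1024**3)      # the app "sees" a 5 GB playground...
for i in range(10):          # ...but actually touches only ten pages
    vm.write(i)

print(vm.reserved_bytes)     # 5368709120 -- the big scary number
print(vm.physical_bytes)     # 40960 -- the RAM actually consumed
```

The gap between the two printed numbers is the gap between what a naive memory readout reports and what the machine is really spending.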
  • I've run Vista Beta on 256 MBs

    It runs OK on 256. You're absolutely right that an OS should use as much RAM as possible so that the responsiveness is good. It's also funny that the article doesn't show you the top memory eaters sorted by size. A few Java applications can easily eat up that much RAM; they can go through memory like water.
    • Run, as in...

      Started it up? Or do you mean that you had a couple of Word documents open with perhaps a spreadsheet open as well, or what?
      • As in Word and spreadsheets

    • 16 Meg for Windows 2000

      We used to test software distributions on Windows 2000 with only 16 meg of memory. Sure it ran. But, who wanted it that way???
  • Save the endangered RAM!

    Ed - thanks for reiterating this point (you should get a medal for your relentless attempts to explain this to people). There is no point to having a resource like RAM if it sits unused. The issue is not how much memory Vista (or any other OS) "consumes" on startup. The issue is what it does with it and how well it shares it. That's a key function of the OS - to act as a referee and broker resources to applications and services as needed.

    Having a lot of RAM and not using it is kind of like driving one of those nasty SUV monstrosities with only one person and a sack of groceries in the vehicle - a total waste of resources.
    • big trucks

      Considering most people have large trucks for a reason. I never knew a Honda Civic could pull 8000 pounds.
  • As I have mentioned before

    and been accused of FUD, anything less than 1Gig RAM and you will just not be in happy land with Vista. I know, I know, all the 'marketing' paperwork said only 256MB ram is required, but in the real world, that means 1Gig. Live in reality. If you want Vista, get 1 Gig. Heck, I put 6.5 Gig into my dual 2 GHz G5. OS X 'requires' only 256MB as well, but without 1Gig, you probably should only run 1 application at a time. This is the same with Vista. Don't let the marketing hype convince you that Vista will run adequately on anything less.
    • Since Vista is still in beta

      you might as well be reading tea leaves to make this prediction.
  • Another collection of

    words forming information-less text.

    One measure of interest is the Resident Set Size (RSS) in the data segments of a process. Ill-programmed memory hogs end up with a lot of this.

    If Vista is going to be bloated or not is a kind of MOOT point since Vista is still a NON product (yes dear Beta testers I know you have access to "vista")

    With the $$/GB going down for DRAMs and disks, the size of the memory footprint is not such a big issue. More important is the speed at which a processor can access the data in memory and I/O. This is where the bottleneck is in current PC technology.

    I am sure that Mr. Ou will forcefully "convince" us that Vista is a lean mean icon machine and everything else sucks big time.

  • You base everything on the wrong factors.

    Trying to base it on the hardware itself is misleading.

    A more appropriate measure would be to compare the cost of a machine, say, five years ago to run the latest version of Windows with one you can buy today to run Vista.

    When you do this, you see the cost of a PC equipped to run Windows (Vista) is roughly 2/3 the cost it was five years ago for Windows 2000. (All things being roughly even.)

    Why is this important? Because the vast majority of Vista installs will come via new machines. (The vast majority of PCs die/are given away with the same OS it had when it originally shipped.)

    But what about those wishing to upgrade? The same applies here too. The cost of the additional RAM is a very minor cost. I mean heck, you can buy a stick of RAM for next to nothing today.

    Yes, you could convert those old Win2K or even XP machines to Linux and be almost on par with Win95, but why would anyone want to?