MS Office 2007 versus Open Office 2.2 shootout

Summary: After yesterday's blog about the relevance of feature bloat, I figured I would follow up with some quantitative analysis of the performance characteristics to measure resource bloat. This isn't the first time I've measured the CPU and memory consumption of Microsoft Office and OpenOffice.org.

TOPICS: Microsoft

After yesterday's blog about the relevance of feature bloat, I figured I would follow up with some quantitative analysis of the performance characteristics to measure resource bloat. This isn't the first time I've measured the CPU and memory consumption of Microsoft Office and OpenOffice.org; I have a whole series on it dating back to 2005. This time, I'm pitting the Microsoft-backed OOXML (Office Open XML) format against the OASIS-backed ODF (OpenDocument) format, using Microsoft Office 2007 and OpenOffice.org 2.2.

Before I start, I'm going to disclose the hardware, OS, and software I'm using to measure these two Office suites.

Hardware:
  • Intel Core 2 Duo at 2.13 GHz
  • 2 GB DDR2-800
  • ATI X800 PCI-Express video card
  • 500 GB SATA-II hard drive housing the sample files
OS and software:
  • Windows Vista
  • Microsoft Sysinternals Process Explorer (resource measurement)
  • Microsoft Office 2007
  • OpenOffice.org 2.2
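Process Explorer reports per-process kernel/user CPU time, peak memory, and I/O operation counts, which is where all the numbers below come from. For readers who want to replicate the idea outside Windows, here is a rough sketch in Python (my own illustration, not the tooling used for this article; it relies on POSIX `getrusage`, so the memory figure is peak resident set rather than Vista's working set):

```python
import resource
import subprocess
import sys

def measure_child(cmd):
    """Run a command and report the kernel/user CPU time it consumed.

    RUSAGE_CHILDREN accumulates over all waited-for children, so taking
    deltas around a single subprocess.run() isolates that one launch.
    ru_maxrss is the peak resident set of the largest child so far
    (reported in KB on Linux, bytes on macOS).
    """
    before = resource.getrusage(resource.RUSAGE_CHILDREN)
    subprocess.run(cmd, check=True)
    after = resource.getrusage(resource.RUSAGE_CHILDREN)
    kernel_ms = (after.ru_stime - before.ru_stime) * 1000.0
    user_ms = (after.ru_utime - before.ru_utime) * 1000.0
    return kernel_ms, user_ms, after.ru_maxrss

if __name__ == "__main__":
    k, u, peak = measure_child([sys.executable, "-c", "x = sum(range(10**6))"])
    print(f"kernel={k:.0f} ms  user={u:.0f} ms  peak={peak}")
```

The same deltas-around-a-launch pattern is what the baseline tables below capture: start the app, let it idle, and record what it cost to get there.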


Baseline measurements for opening each application

Application      CPU time (milliseconds)    Memory     Number of I/O operations
                 Kernel    User    Total    Peak KB     Read    Write    Other
MS Excel            234     328      562      24308     1410       14       22
OO.o Calc           625     593     1218      47788     3641      213      106
MS Word             171     390      562      31776     1361        3     1957
OO.o Writer         343     687     1031      46700     3658       13      120
MS Access           484     531     1015      25836     1299        6        7
OO.o Base           781     906     1687      49984     1708     1762     2832

Office 2007 base memory consumption went up significantly compared to the Office 2003 I measured last year, but it's still significantly less than OpenOffice.org 2.2's. Some of the OpenOffice.org applications, like Base, require Java to run, and Base's memory consumption spikes over 70 megabytes as soon as you start navigating the interface. However, the difference between Microsoft Office and OpenOffice.org base resource consumption has gotten smaller. Next, we test the CPU and memory utilization of Microsoft Excel and OpenOffice.org Calc when opening the same 16-sheet test file.


Opening large spreadsheet

Format           CPU time (milliseconds)    Memory     Number of I/O operations
                 Kernel    User    Total    Peak KB     Read    Write    Other
XLS (MS)            265    2046     2312     115548     3917       23       76
XLSX (MS)           296   12406    12703      65548      687       19     1854
ODS (OO.o)          968   58875    59843     253680     8992       21     5822

Comparison of OpenOffice.org 2.2 versus Office 2007 resource consumption

From these results, we can see that the ODF XML parser (while vastly improved) is still about 5 times slower than Microsoft's OOXML parser. OpenOffice.org also seems to consume nearly 4 times the amount of RAM to hold the same data. While OpenOffice.org continues to have fewer features than Microsoft Office, it also continues to consume far more resources.
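Two gaps stand out in the table: the binary XLS loads far faster than either XML-based format, and the two XML parsers differ widely in speed. The first gap is easy to build intuition for: pulling numbers out of markup costs far more than decoding them from a packed binary layout. A toy illustration in Python (synthetic data, nothing like the real XLS or ODS formats):

```python
import struct
import time
import xml.etree.ElementTree as ET

N = 50_000
values = list(range(N))

# The same data as verbose spreadsheet-style XML and as packed binary.
xml_doc = "<cells>" + "".join(f"<c><v>{v}</v></c>" for v in values) + "</cells>"
blob = struct.pack(f"<{N}i", *values)

t0 = time.perf_counter()
parsed_xml = [int(c.find("v").text) for c in ET.fromstring(xml_doc)]
xml_s = time.perf_counter() - t0

t0 = time.perf_counter()
parsed_bin = list(struct.unpack(f"<{N}i", blob))
bin_s = time.perf_counter() - t0

assert parsed_xml == parsed_bin == values
print(f"XML parse: {xml_s * 1000:.1f} ms, binary decode: {bin_s * 1000:.1f} ms")
```

The second gap, OOXML versus ODF, is purely a matter of parser implementation quality, since both are zipped XML; that is the gap OpenOffice.org can actually close.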

Even though these results still show drastic differences in CPU and memory consumption between MS Office 2007 and OpenOffice.org 2.2, the gap is not as extreme as the results I measured last year. It would appear that OpenOffice.org 2.2 has gotten significantly better than version 2.0, but it still has a lot to work on. The official OpenOffice.org performance-tuning wiki is tracking some of these improvements. I praise their recent efforts and hope they keep it up, because it will only bring more competition to the table. So while I may still consider OpenOffice.org a resource pig, the pig has definitely lost some weight.




  • Agree but...

    for people who can't afford a full office suite, Open Office is great. Though it may not be as great at allocating my computer resources, it does help me allocate a more important resource, my money. =) Nice report and it's good to see the differences.
    • That is the point: with today's hardware, the average user will feel the

      difference only in their wallet. Don't forget that Vista also uses a ton of resources, for not much kick.

      Meanwhile, as MS adds more and more features and tries to improve every little aspect of performance, going head to head with OpenOffice, they miss that the real competition is going to be Google Apps. They are also competing on the paradigm of formatting everything for 8.5x11 paper, just as that paradigm is about to be thrown on the stack heap of history.
  • where are the test files?

    George, put a link to the files.
    D T Schmitz
    • I agree...

      ... we don't want any of the [i]"let's compare loading a binary MS Office sheet with a text OpenOffice sheet - my goodness look at the difference in load times"[/i] kind of stunt George pulled last year.

      Oh - and a workbook containing nine sheets of 16,000 rows with one formula per row isn't any more "typical" this year than it was last year.
      • Typical for business use

        A workbook of that size is very typical in a large business setting - especially in an accounting or finance shop.

        But OO is perfect for home or even SOHO use, where it's unlikely you will ever come across files of that size.
        • This might interest you


          "To 'prove' that OO.o is a pig, Ou offers up a sample file here for users to perform their own tests. So far, so good. But once you get the file down, you may notice something a little bit odd: the file is 3.6 MB in size. That's larger than just about any spreadsheet I've seen, but then go one step further and unzip the file, and you'll discover that the content.xml portion of his sample file explodes to 279.5 MB."

          "My brother, as a backgrounder, is I-banking trained in his usage of Excel [1] and currently employed by a relatively well known hedge fund. The files he sent over were financial models of two public companies, and essentially reflect the financial well-being of the institutions in question in spreadsheet form. One spreadsheet has 7 sheets and the other 12, and the sizes? 188.5 KB and 294 KB, respectively. ..... I asked him how often he dealt with huge Excel files of the size that Ou featured, and his response was that they were very much the exception to the rule - and his is the profession that conventional wisdom at least says are the true power users of the format."
          • I completely concur with his statements...

            As I work in the industry.. :) It is by far a HUGE exception to the rule that someone has large excel documents that exceed a few megs.

            I've worked with financial/accounting groups in Fortune 500 companies. :P

            Again which is another reason I keep pinpointing Ou's flaw with a 200 meg file. It's completely unrealistic.. It's just being used to exaggerate a difference but when push comes to shove and a normal user uses a normal every day file.. They will hardly notice a second difference if that.
          • Then I guess my excel files are not HUGE!

            I have been using excel to keep track of all my accounting: checking, debit, credit cards, household bills, mortgages, car payments, and anything else that I left out. My excel file is almost 100 megs. I think it is around 179 megs.

            When I go on trips I use Open Office to access this file. I do not care if it uses more, or if it is even slower. I use Excel on my desktop and Open Office on my laptop.

            Having a HUGE excel file is nothing new to me. At work I constantly work with excel files that are around 3 to 4 hundred megs.

            I am sure that the majority of the excel files out there are a meg or less, but there are plenty of HUGE Excel files out there.

            Now I know I can use accounting software to do everything I am doing. I have tried and I do not like the accounting software. They do NOT do everything I need it to do.
          • Again..

            Just because you do it does not mean everyone else does. If we looked at the ratio of 100+ meg files vs everyone's standard.. who wants to bet you're in a minority in comparison to everyone else.
          • I am impressed

            [i]Now I know I can use accounting software to do everything I am doing. I have tried and I do not like the accounting software. They do NOT do everything I need it to do.[/i]

            Seriously, I am impressed that you got a spreadsheet to do more than the limitations of any accounting software.

            I would love to see all of the formulas and macros(?) that you had to create. That must have taken a lot of work.

            I am not being sarcastic. That must have been a herculean undertaking.
          • That is the worst way to use excel..

            So I hope you were being sarcastic.. Everyone knows there is a threshold with excel and scripts; excel in general, when you open and close the same document constantly, eventually corrupts.
          • And there you have it

            Another day in George Ou's World.
            Next topic--Linux slow boot times, probably.
            Yeh definally. Yeh. (Marathon Man)
            D T Schmitz
          • er ah (Rain Man)

            D T Schmitz
          • 30MB is routine

            I've worked in several accounting and financial shops and routinely see spreadsheets that exceed 30MB with 20-30 worksheets. But a 300MB spreadsheet is just idiotic. Although I know they exist - I've seen them - that just means the users are stretching the spreadsheet way beyond what it was intended for. We had to start forcing our forecasting folks to archive their data or use a database because they were trying to send these files via email.
          • Even 20-30 megs

            Isn't bad, and you wouldn't see a huge footprint like comparing it to a 200+ meg file like George continually compares it to.

            And about forecasters.. I totally agree. I'm doing that sort of process now with our forecast people here. :)
          • One has to use a file of sufficient size ...

            A lot has been made about the file formats used, and the size of the file used to compare the load times, but bear in mind that one has to use a file of sufficient size in order to be able to detect a difference that can be perceived by the human eye.

            I don't want to dispute George's methods, motives, nor his results - I'll take his numbers at "face value".

            So, when opening a 279 Mb spreadsheet, Excel could do it in a mere 2.3 seconds, while Calc took 60. That is a fairly significant difference.

            However, if we can assume that there is something approaching a linear relationship between file size and the time it takes to open that file, the same two programs opening a file one-tenth the size could be expected to take 0.2 and 6 seconds respectively - plenty of difference for one to see.

            But, if an even more common file size of 2.79 Mbytes were to be used, then those numbers would drop to 0.02 and 0.6 seconds - you might have trouble telling the difference. And, if a truly typical file size were tested, 279 Kb, then the 0.002 and 0.06 second times would be impossible for anyone to notice.

            George's numbers are probably a fair reflection of the comparability of the two suites. So, if you deal with 100+ Mb files, you're definitely going to want Office. But, if you use 100+ Mb spreadsheet files, you might also want to consider whether a database might not be a better vehicle for storing, organizing and retrieving that much data.
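The extrapolation in the comment above is simple proportional scaling. As code, using the comment's own 2.3 s / 60 s figures and its (admittedly rough) assumption that load time scales linearly with file size:

```python
def projected_load(measured_s, measured_mb, target_mb):
    """Linear extrapolation of load time with file size (an assumption
    carried over from the comment, not a measured property of either suite)."""
    return measured_s * target_mb / measured_mb

# Excel: 2.3 s and Calc: 60 s for the ~279 MB (uncompressed) test sheet.
for target_mb in (27.9, 2.79, 0.279):
    excel = projected_load(2.3, 279.0, target_mb)
    calc = projected_load(60.0, 279.0, target_mb)
    print(f"{target_mb:7.3f} MB: Excel ~{excel:.3f} s, Calc ~{calc:.2f} s")
```

Run it and the commenter's point falls out: at typical file sizes, both suites open files faster than anyone can perceive.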
          • Finally, some perspective

            Now, I understand that even though the test was accurately represented, the results are generally meaningless: none of us would ever notice the difference, except in our bank accounts, in using one or the other.
          • OO.o also provides tools to reduce file size

            If I have files of this size, I would use a relational database. In contrast with Access, OO.o's Base doesn't suck. I state this as someone who has made a fair amount of money writing Access databases. Access hasn't evolved in about a decade, while Base has a number of features that make it much better as either a front end to an Enterprise database or as a stand-alone database.

            Many large text documents can be logically broken into chapters. With OO.o Writer, it is easy to have each chapter stored in a separate file and to build a Master Document that links the Subdocuments.

            Finally, I can put OO.o onto a blade and have users access virtual PC's on the blade. Then, using thin clients (using whatever scavenged hardware that will support a modern monitor), I can provide many users with access to OO.o. Since OO.o will be in memory (at least after 8:02 AM), it will load fast. Since our documents are on a SAN, we can read them very fast. While it's true we could do this with Office, the licensing is (relatively) expensive and a PITA.

            So, while it is true that OO.o is slower than MSO for large documents stored on a PC, OO.o offers tools and deployment options that can effectively compensate for this issue.

            Besides, OO.o is evolving much faster. If OO.o were to use something like Cubework's BXML, we could see the tables turn rather quickly. See
          • Dang fast HD, doncha' think?

            Wow! "So, when opening a 279 Mb spreadsheet, Excel could do it in a mere 2.3 seconds, while Calc took 60. That is a fairly significant difference."
            Nearly 280MB in 2.3 seconds? That's over 120MB/sec. And that's just for opening the file. I'd sure like to know what hard disk he's using! Granted, 60 seconds for opening the same file (I'd assume it is?) is no speed freak (4.65MB/sec), but perhaps there's more going on than we're seeing. Maybe Excel is merely "opening" the file (i.e. getting the pointers, reading a few cells) whereas OOo is reading the whole file or perhaps the whole sheet before presenting the page. This might help account for some of the difference in observed time and observed system resource usage.
            After all, since Excel is a closed, proprietary program, we've no clues as to what's actually happening. While it would be a fairly clever trick to ONLY do the calculations on those cells which are "visible" (after all, who'd know, since you can't see the others), one test for this would be to have nearly all cells depend upon data in other cells, maybe on other sheets (there's 16,000 cells x 9 sheets, right?) to see how "fast" each of them is. For example, Sheet1.A2 would contain a number that Sheet2.A2 might need in a formula, which Sheet3.A2 would need for its formula, etc. and Sheet1.A3 would need data that was calculated from Sheet9.A2, etc. but Sheet1.A1 would need data from Sheet9.A15999. Now THAT would be a test! And it wouldn't be too hard to set up, either.
            Anyone want to bet that the 2.3 second "opening" gets blown away?
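The torture test proposed above is straightforward to script. A sketch that only generates the formula text for such a cross-sheet dependency chain (using OpenOffice.org's `Sheet2.A1` cross-sheet reference syntax; Excel writes the same reference as `Sheet2!A1`) - wiring it into an actual spreadsheet file is left to the reader:

```python
# Each sheet's column A depends on the previous sheet's same row, so a
# recalculating load must touch every sheet before showing the first one.
# Scale ROWS to 16000 to match the dimensions discussed in the thread.
ROWS, SHEETS = 100, 9

def column_a(sheet, rows=ROWS):
    """Formula strings for column A of one sheet (sheets are 1-indexed)."""
    if sheet == 1:
        return [str(r) for r in range(1, rows + 1)]           # literal seed values
    return [f"=Sheet{sheet - 1}.A{r}+1" for r in range(1, rows + 1)]

book = {f"Sheet{s}": column_a(s) for s in range(1, SHEETS + 1)}
print(book["Sheet9"][0])   # the far end of a 9-sheet dependency chain
```

A workbook built this way defeats any lazy "render only the visible cells" strategy, since no cell on the last sheet is correct until the whole chain has been evaluated.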
          • 2 seconds is for the 50 MB XLS file

            2 seconds is for the 50 MB XLS file, which works out to about 25 MB/sec, which is correct.