
Vista RTM vs. Vista SP1 - Office 2007 benchmarking

Enough with benchmarking the OS - let's see if Office 2007 is any faster on Vista SP1.
Written by Adrian Kingsley-Hughes, Contributing Writer


The test

The tests were carried out on the AMD Spider platform that I have set up in the lab (Phenom 9700, Radeon 3850 graphics card, 2GB of RAM ...).  I've used this system as the platform for a number of benchmarks I've run over the past week (for a full spec, see this post).

As for the tests, we set up two otherwise identical Vista images - one RTM, one SP1.  We then loaded Microsoft Office 2007 Professional onto each system and applied all the patches (including Office 2007 SP1).  Each system was then defragged and rebooted several times.

Afterwards we downloaded and installed the DMS Clarity Studio software, which includes the OfficeBench benchmark.  This application comes with a comprehensive set of Microsoft Office test scripts.  These scripts, while not perfect (no benchmark solution is), are pretty good and simulate a number of real-world tasks that users might carry out in Microsoft Office.  The metric we're interested in is how long OfficeBench takes to execute the test scripts.
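
OfficeBench's own scripts are closed, but the basic idea - driving the Office applications through their COM automation interfaces and timing a fixed sequence of operations - is simple enough to sketch.  Purely as an illustration (this is not OfficeBench, and the task is made up), here's how you might time a small scripted Word task in Python using the pywin32 package:

    # Illustrative sketch only - NOT OfficeBench. Times a made-up Word task
    # driven through COM automation. Requires the pywin32 package.
    import time
    import win32com.client

    def timed_word_task():
        start = time.perf_counter()
        word = win32com.client.Dispatch("Word.Application")
        word.Visible = False
        doc = word.Documents.Add()
        # Simulate some "real-world" work: type text, then format it.
        word.Selection.TypeText("Benchmarking Office 2007 on Vista. " * 200)
        word.Selection.WholeStory()
        word.Selection.Font.Bold = True
        doc.Close(SaveChanges=False)
        word.Quit()
        return time.perf_counter() - start

    if __name__ == "__main__":
        print("Script took %.2f seconds" % timed_word_task())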

All tests were run five times under no load and five times under load (here OfficeBench runs a small Windows Media Video in the background), and we rebooted the system between each run.  The tests were duplicated on Vista RTM and Vista SP1.

Because the results were so tightly grouped for this test, no results were discarded and the times taken to execute the scripts were averaged.  The lower the time, the faster the scripts completed and the better the result.
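
To give an idea of what that aggregation looks like (the run times below are hypothetical placeholders, not our raw data), the averaging is nothing more exotic than:

    # Average the five timed runs per configuration. The numbers here are
    # hypothetical placeholders, not the actual raw data from these tests.
    from statistics import mean, stdev

    runs = {
        "Vista RTM - no load": [266.1, 265.4, 265.9, 266.3, 265.5],
        "Vista SP1 - no load": [273.9, 273.5, 274.1, 273.6, 273.7],
    }

    for config, times in runs.items():
        # A small standard deviation confirms the runs were tightly grouped.
        print("%s: mean %.2fs, stdev %.2fs" % (config, mean(times), stdev(times)))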


The results

As I mentioned earlier, we carried out two sets of tests on both Vista RTM and Vista SP1 - one set under no load and the other with the system playing a small, looped WMV file (to be honest, this doesn't put the system under much load at all, but it does give the system a background task to work on while the test is in progress).

The results are shown below:

Time to execute test scripts (sec; lower is better):

                 No load    Load
    Vista RTM    265.83     271.06
    Vista SP1    273.76     273.97

Conclusions

Since we received very consistent results across the tests, the impact that Vista Service Pack 1 has on the performance of Microsoft Office 2007 applications is clear to me - almost none at all.  The 8-second variation in the average score between Vista RTM and Vista SP1 under no load (about 3 percent) is not going to be noticeable during normal usage.  This is good news for those who feared that SP1 might slow down Office applications, but bad news for those expecting a performance boost.

Under load the difference is even smaller - just under 3 seconds (about 1 percent).

What's interesting is how close the no-load and load scores are for Vista SP1 - only a fraction of a second (0.21 seconds) separates the two.  This may be an indication of SP1's better responsiveness when under load.  Earlier tests I've run also seem to confirm that Vista SP1 is more responsive under load.

Note: Here are links to previous Vista benchmark posts: 1 - Some systems showing incredible SP1 performance boost | 2 - Vista SP1 vs. XP SP2 - Benchmarked | 3 - Vista SP1 vs. XP SP2 - Part Deux | 4 - Vista 32-bit vs. Vista 64-bit - Benchmarked | 5 - Vista: 32-bit vs. 64-bit & RTM vs. SP1.


Thoughts on benchmarking

I've been benchmarking systems for long enough to know that no matter how many questions I think my results answer, what I'm really doing is creating about three new questions for every one I answer.  That's what happened with my earlier run of Vista benchmarks - I'd run some tests, and you'd come back with different scenarios you'd like to see tested and different platforms to test them on.

Benchmarking is an artificial activity.  The goal is to eliminate as many variables as possible and achieve a consistent metric.  The problem is, by removing those variables you're shifting the process out of reality and into a make-believe land that exists only on the PCs being benchmarked.  I bet you don't take elaborate steps to ensure consistency before carrying out your daily PC tasks.  Hence my enthusiastic use of the phrase "your mileage WILL vary."

Another fundamental problem with benchmarking is that neither the tests nor the results are exactly what people want.  Ultimately, what everyone wants to see is a benchmark of their daily tasks carried out on their PC.  That, I'm afraid, is something I cannot provide.

Thoughts?
