
Microsoft's JSMeter: A new way to analyze and affect JavaScript performance

The Microsoft Internet Explorer (IE) team isn't the only group at the company that is delving into the finer points around JavaScript. A couple of Microsoft researchers also are doing work that could change the way Microsoft (and possibly other companies) look at -- and ultimately affect -- JavaScript runtime performance.
Written by Mary Jo Foley, Senior Contributing Editor

JSMeter isn't a brand-new project, but I just learned of it via a new Microsoft Channel 9 video on the topic. Ben Livshits and Ben Zorn, two of the Microsoft researchers behind the project, made some interesting points during the part of the interview I watched. They explained how they took an implementation of Internet Explorer and modified its source code to measure -- and potentially affect -- the performance of Web applications like Bing Maps, Facebook and Gmail.

The JSMeter team is advocating for new benchmarks that more accurately reflect the true performance of large-scale Web applications like these, claiming that existing benchmarks often have little real-world value. Microsoft's IE team made a similar case last week, during the release of the first developer preview of IE 9. The IE 9 developer preview includes a new JavaScript engine, codenamed Chakra, which is designed to boost performance.

According to one of the JSMeter white papers available on the Microsoft Research site, the team was interested in measuring "two specific areas of JavaScript runtime behavior: 1) functions and code and 2) events and handlers. We find that the benchmarks are not representative of many real web sites and that conclusions reached from measuring the benchmarks may be misleading."
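To give a rough sense of what measuring "functions and code" behavior involves, here is a minimal sketch in JavaScript. This is purely illustrative and is not JSMeter's actual implementation (JSMeter instruments the browser's JavaScript engine itself); the `instrument` helper and the sample function are my own invented names. It wraps a function so that call counts and cumulative execution time can be collected, the kind of raw data a profiler or benchmark study would aggregate:

```javascript
// Illustrative sketch (not JSMeter's real approach): wrap a function to
// record how often it runs and how much wall-clock time it consumes.
function instrument(fn, stats, name) {
  stats[name] = { calls: 0, totalMs: 0 };
  return function (...args) {
    const start = Date.now();
    try {
      return fn.apply(this, args); // preserve `this` and arguments
    } finally {
      stats[name].calls += 1;
      stats[name].totalMs += Date.now() - start;
    }
  };
}

// Usage: wrap a hypothetical hot function, call it, then inspect counters.
const stats = {};
const sumTo = instrument(function (n) {
  let total = 0;
  for (let i = 0; i < n; i++) total += i;
  return total;
}, stats, "sumTo");

sumTo(1000);
sumTo(1000);
console.log(stats.sumTo.calls);   // 2
console.log(stats.sumTo.totalMs); // accumulated milliseconds
```

Collecting the same kind of counters for event handlers (the team's second measurement area) would mean wrapping the callbacks passed to `addEventListener` in a similar way.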

More from the conclusion of that paper:

"Our measurements suggest a number of valuable follow-up efforts. These include working on building a more representative collection of benchmarks, modifying JavaScript engines to more effectively implement some of the real behaviors we observed, and building developer tools that expose the kind of measurement data we report."

Microsoft will be presenting more about JSMeter during the Web Applications '10 conference in Boston in June.
