Avoiding code rot requires continuous optimization of software, which comes at great time and expense and keeps developers busy performing maintenance rather than pursuing new innovations. A project called Helium seeks to automate code optimization.
An initial proof-of-concept focused on Adobe's venerable Photoshop application was completed earlier this year, as the vendor describes in a blog post this week:
To test the idea, Adobe and MIT researchers decided to focus on optimizing Photoshop filters. But rather than analyze the entire source code of the filters themselves, they analyzed the execution trace -- only the instructions that actually run on the CPU when a filter is applied -- to identify which operations are actually used, and which are applied multiple times or inefficiently.
By comparing the execution trace of the filter to the actual pixel changes that take place on screen when the filter is applied, the researchers were able to construct a high-level representation of the original binary code and recreate it in Halide, a modern programming language geared toward image processing. The Halide code is then run through an auto-tuner to optimize it for the latest hardware and recompiled.
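The core idea -- recovering a high-level description of a filter from its observed behavior rather than its source -- can be sketched in miniature. The toy example below is not Helium's actual algorithm (which lifts x86 traces into Halide); it is a hypothetical Python illustration that infers a simple pointwise transform from before/after pixel samples, the kind of "behavior in, high-level form out" step the researchers describe:

```python
# Toy sketch (hypothetical, NOT Helium's real pipeline): given pixel
# values observed before and after an unknown pointwise filter runs,
# recover a high-level form out = a*x + b that can then be re-emitted
# and optimized for new hardware.

def lift_pointwise_filter(inputs, outputs):
    """Infer a linear pointwise transform out = a*x + b from samples."""
    # Pick two samples with distinct input values to solve for a and b.
    (x0, y0), (x1, y1) = (inputs[0], outputs[0]), (inputs[1], outputs[1])
    a = (y1 - y0) / (x1 - x0)
    b = y0 - a * x0
    # Verify the recovered form reproduces every observed pixel change.
    assert all(abs(a * x + b - y) < 1e-9 for x, y in zip(inputs, outputs))
    return a, b

# "Trace": pixels observed before and after an unknown brightness filter.
before = [0, 10, 50, 200]
after  = [5, 25, 105, 405]   # the hidden filter was: 2*x + 5

a, b = lift_pointwise_filter(before, after)
print(a, b)  # -> 2.0 5.0

# The recovered (a, b) form is the "high-level representation"; a real
# system would now hand it to a compiler/auto-tuner rather than a lambda.
fast_filter = lambda px: a * px + b
```

Real filters are far more complex, of course, which is why Helium targets a language like Halide that separates what a filter computes from how it is scheduled on the hardware.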
Helium was able to improve the performance of a number of Photoshop filters by 75 percent, and by up to 500 percent for IrfanView, a popular freeware image editing package for Windows.
Analysis: Helium Rising?
Helium remains in its infancy, but certainly shows promise. Beyond software vendors, any enterprise with a vested interest in optimizing legacy applications could join the effort. Helium's source code is also available under the open source MIT license, opening up the possibility for code forks and proprietary implementations.
While the initial proof-of-concept focused on Photoshop filters -- a particularly rich target given the amount of computing resources they can consume -- Helium is clearly relevant for other types of software.
"Project Helium might find applicability in distributed data-analysis platforms," says Constellation Research VP and principal analyst Doug Henschen. "In recent years, database vendors have been taking advantage of both multi-threading and vector processing capabilities built into CPUs and GPUs. As new chips introduce yet more tricks and performance boosters, it would be great if database query-optimization engines and data-transformation software could be automatically upgraded to tap into the power."
Constellation Insights is an online news service published daily to advise members of the Insights community on the significance and implications of developments in enterprise technology.
Constellation Insights is crafted by leading analysts to go beyond merely reporting on news stories to provide detailed advice to community members on how to evaluate and respond to changes in enterprise technology. Learn more about Constellation Insights.