Unlike Google's previous would-be PNG and JPEG replacement, WebP, and its RAISR photo-compression technique, the new 'perceptual' encoder doesn't veer from the JPEG standard and is compatible with existing browsers and image-processing applications.
Guetzli targets a stage of the JPEG compression process called quantization, where visual quality is traded off against file size.
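The quantization step can be illustrated with a toy sketch. This shows standard JPEG quantization, not Guetzli's perceptual search over settings; the table is the example luminance table from the JPEG specification's Annex K:

```python
import numpy as np

# Standard JPEG example luminance quantization table (Annex K).
# Larger entries (toward the bottom right) discard more detail
# in high-frequency coefficients, which the eye notices least.
QUANT_TABLE = np.array([
    [16, 11, 10, 16, 24, 40, 51, 61],
    [12, 12, 14, 19, 26, 58, 60, 55],
    [14, 13, 16, 24, 40, 57, 69, 56],
    [14, 17, 22, 29, 51, 87, 80, 62],
    [18, 22, 37, 56, 68, 109, 103, 77],
    [24, 35, 55, 64, 81, 104, 113, 92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103, 99],
])

def quantize(dct_block, table):
    """Round each DCT coefficient to the nearest multiple of its table entry."""
    return np.round(dct_block / table).astype(int)

def dequantize(q_block, table):
    """Approximate reconstruction; small coefficients come back as zero."""
    return q_block * table

# Dummy 8x8 block of DCT coefficients, all 30.0, for illustration only.
block = np.full((8, 8), 30.0)
q = quantize(block, QUANT_TABLE)
recon = dequantize(q, QUANT_TABLE)
print(q[0, 0], recon[0, 0])   # small table entry: coefficient survives
print(q[7, 7], recon[7, 7])   # large table entry: coefficient zeroed out
```

The lost coefficients are exactly the "sacrifice" the article describes; an encoder's choice of table entries decides which details disappear.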
The new open-source encoder "strikes a balance between minimal loss and file size", according to Google, making these sacrifices less noticeable to users, while also trimming down increasingly large files on the web.
Google hopes the new encoder will be adopted by image-heavy websites, allowing operators to offer a smoother browsing experience and cutting bandwidth costs for mobile users. It may also point toward improvements in video compression.
ZDNet sister site CNET explains that a second and equally important component of Guetzli's compression gains comes from a tool called Butteraugli, which helps automate the testing of different compression settings and can compare two different compression methods directly.
"Butteraugli is a project that estimates the psychovisual similarity of two images. It gives a score for the images that is reliable in the domain of barely noticeable differences. Butteraugli not only gives a scalar score, but also computes a spatial map of the level of differences," Google notes in its GitHub page for Butteraugli.
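Butteraugli's psychovisual model is far more sophisticated than any short sketch can capture, but its interface of "a scalar score plus a spatial map" can be loosely illustrated with a crude per-pixel difference (this is not Butteraugli's metric, just a stand-in to show the shape of the output):

```python
import numpy as np

def diff_score_and_map(img_a, img_b):
    """Crude stand-in for a perceptual metric: returns a scalar score
    and a spatial map of per-pixel differences. Butteraugli's real model
    instead weights differences by their psychovisual significance."""
    diff_map = np.abs(img_a.astype(float) - img_b.astype(float))
    score = diff_map.max()  # worst-case difference as the scalar score
    return score, diff_map

# Two dummy 4x4 "images" differing at a single pixel.
a = np.zeros((4, 4))
b = np.zeros((4, 4))
b[1, 2] = 3.0

score, dmap = diff_score_and_map(a, b)
print(score)       # 3.0 — one number summarizing the difference
print(dmap[1, 2])  # 3.0 — the map localizes where the difference is
```

An encoder can use the scalar score to rank candidate compression settings and the spatial map to see where an image degrades first.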
The one downside of Guetzli is that its use of search algorithms makes compression "significantly longer" than existing methods.
On the other hand, research with human raters found they "consistently preferred" Guetzli-processed images over comparably sized and even slightly larger libjpeg files, so Google judges the slower compression worth the trade-off.