One of the biggest reasons websites with lots of images and video load so slowly is the size of those media files. JPEGs are a particular pain point because they are so widely used. Now, Google is making changes meant to address this issue, which could help websites load faster.
The new development comes courtesy of an algorithm called Guetzli, which targets JPEG file sizes, Ars Technica reports. This open-source encoder can reportedly shrink JPEG files by up to 35% without a visible loss of quality. Given how much bandwidth that saves, it can provide a considerable boost to the loading times of pages containing JPEG images.
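To put that 35% figure in perspective, here is some rough, illustrative arithmetic. The 2 MB payload is a hypothetical example, not a measured benchmark:

```python
# Illustrative arithmetic only: estimate the payload saved on a page
# whose JPEG images total 2 MB, using the reported upper-bound figure.

jpeg_bytes = 2_000_000   # hypothetical total JPEG payload for one page
reduction = 0.35         # upper-bound savings reported for Guetzli

saved = jpeg_bytes * reduction
remaining = jpeg_bytes - saved
print(f"saved {saved / 1e6:.2f} MB, {remaining / 1e6:.2f} MB left to transfer")
# -> saved 0.70 MB, 1.30 MB left to transfer
```

Real pages will land somewhere below that ceiling, since 35% is the best case reported.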
Along with shrinking existing files, the algorithm can also be run the other way: improving perceived image quality while keeping file sizes the same. It is an undeniably useful capability, and one the internet as a whole would benefit from adopting.
As for how it actually manages this, the secret lies in quantization: mapping fine-grained image data onto a smaller set of values so that it compresses far more easily. With the help of a psychovisual model called Butteraugli, the encoder can judge which colors and details are worth keeping and which can be scrapped without the viewer noticing.
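The quantization idea can be sketched in a few lines. This is a deliberately simplified illustration of the general technique, not Guetzli's actual code or its psychovisual weighting; the pixel values and step size are made up:

```python
# Illustrative sketch of lossy quantization (not Guetzli's implementation):
# snapping fine-grained values to a coarser grid leaves fewer distinct
# symbols, which makes the data much easier for an entropy coder to compress.

def quantize(values, step):
    """Round each value to the nearest multiple of `step`.

    A larger step discards more detail but produces fewer distinct
    values, i.e. better compressibility.
    """
    return [round(v / step) * step for v in values]

# Hypothetical 8-bit pixel intensities (0-255).
pixels = [12, 14, 13, 200, 198, 201, 77, 79]

coarse = quantize(pixels, 8)
print(coarse)  # -> [16, 16, 16, 200, 200, 200, 80, 80]
print(len(set(pixels)), "->", len(set(coarse)))  # -> 8 -> 3
```

Where a naive encoder would apply one step size everywhere, a psychovisual model lets the encoder quantize aggressively in regions the eye cannot distinguish and gently where it can.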
As for how much of a speed boost users will see in website loading times, it is difficult to say without running comprehensive tests. However, mobile users on capped data plans are practically guaranteed to see a dip in their consumption if they are fond of browsing data-heavy sites on their devices, BGR reports. As with loading speed, the amount saved is likely to vary.