Image Squeezing: A Must for Good Web Developers, Part 2

In the first part we saw that there are various image formats that make it easier for web developers to optimize a website for performance, particularly download time. Continuing from there, we will explore more tricks and techniques for effective optimization through image squeezing. Web developers generally want to seize image optimization opportunities without compromising quality. Using the jpegtran and pngcrush utilities, web developers can run lossless optimizations on JPEGs and PNGs, as well as conversion to lossless WebP. The results are impressive:

  • JPEG, EXIF removal: 6.6% reduction
  • JPEG, EXIF removal + optimized Huffman tables: 13.3% reduction
  • JPEG, EXIF removal + optimized Huffman tables + conversion to progressive: 15.1% reduction
  • PNG8, pngcrush: 2.6% reduction
  • PNG8, lossless WebP: 23% reduction
  • PNG24, pngcrush: 11% reduction
  • PNG24, lossless WebP: 33.1% reduction
  • PNG24a, pngcrush: 14.4% reduction
  • PNG24a, lossless WebP: 42.5% reduction

These lossless optimization techniques save about 12.75% of image data, which is 101 KB for a page! For browsers that support lossless WebP, the savings climb to 18.2% of overall image data, or up to 144 KB.
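The lossless passes above map to straightforward command lines for each tool. As a sketch, the function below builds the argv lists rather than running them, so the file names are purely illustrative; each list can be handed to subprocess.run() to execute.

```python
# Build the lossless-optimization command lines discussed above.
# The flags are the standard ones for jpegtran, pngcrush, and cwebp;
# the source/destination paths are illustrative placeholders.

def lossless_commands(src, dst):
    if src.lower().endswith((".jpg", ".jpeg")):
        # Strip EXIF, rebuild Huffman tables, convert to progressive.
        return [["jpegtran", "-copy", "none", "-optimize",
                 "-progressive", "-outfile", dst, src]]
    if src.lower().endswith(".png"):
        return [
            # Losslessly recompress the PNG and drop ancillary chunks.
            ["pngcrush", "-rem", "alla", "-reduce", src, dst],
            # Alternative for supporting browsers: lossless WebP.
            ["cwebp", "-lossless", src, "-o", dst + ".webp"],
        ]
    return []
```

Returning the argv lists keeps the sketch self-contained and makes it easy to plug into any build script.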

Now let’s see what happens to the data savings when we are willing to compromise quality slightly. To get an objective idea of the trade-off between quality and byte size, we use the SSIM (structural similarity) index: the lower the SSIM, the bigger the visible difference between the original and the compressed image.
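For reference, here is a minimal single-window version of the SSIM formula in plain Python, treating each image as a flat list of 0–255 grayscale values. Real tools compute SSIM over sliding windows and average the results; this sketch only illustrates the formula itself.

```python
# Simplified global SSIM between two equal-size grayscale images.
# x and y are flat lists of 0-255 pixel values. Identical images
# score 1.0; larger visible differences push the score lower.

def ssim(x, y, L=255, k1=0.01, k2=0.03):
    assert len(x) == len(y) and x
    n = len(x)
    c1, c2 = (k1 * L) ** 2, (k2 * L) ** 2   # stabilizing constants
    mx, my = sum(x) / n, sum(y) / n          # means
    vx = sum((p - mx) ** 2 for p in x) / n   # variances
    vy = sum((q - my) ** 2 for q in y) / n
    cov = sum((p - mx) * (q - my) for p, q in zip(x, y)) / n
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
```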

JPEG

We compressed the JPEGs to several quality levels using ImageMagick, then applied the lossless optimizations described above to squeeze the images some more. We also compressed the images using imgmin, a utility that performs a binary search to find the ideal quality level for each image. Finally, we ran a JPEG-to-WebP conversion to see if the savings match Google’s reported 30% data reduction.
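The quality search imgmin performs can be sketched as a binary search over JPEG quality levels. The quality_score callback below is a stand-in for whatever perceptual metric is used to compare the recompressed image against the original (SSIM would work); it is not imgmin’s actual API.

```python
# Binary search for the lowest JPEG quality whose perceptual score
# still meets a threshold. quality_score(q) is assumed to return a
# 0..1 similarity score for the image recompressed at quality q,
# and to increase (roughly) with q. Falls back to hi if nothing
# meets the threshold.

def find_quality(quality_score, threshold=0.95, lo=30, hi=95):
    best = hi
    while lo <= hi:
        mid = (lo + hi) // 2
        if quality_score(mid) >= threshold:
            best = mid        # good enough: try an even lower quality
            hi = mid - 1
        else:
            lo = mid + 1      # too lossy: raise the quality
    return best
```

Because each probe halves the search range, only about log2(hi - lo) recompressions are needed per image instead of trying every quality level.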

The results were striking:

  • Quality 75: 50% data reduction, 96.22% SSIM
  • Quality 50: 64.6% data reduction, 92.28% SSIM
  • Quality 30: 73.3% data reduction, 89.13% SSIM
  • imgmin: 38.6% data reduction, 97.52% SSIM
  • WebP, quality 75: 68% data reduction, 95.28% SSIM

According to the data above, WebP lets web developers reach compression levels close to “quality 30” while keeping “quality 75” image quality. Looked at from another perspective, WebP files are 37% smaller than JPEGs of equivalent quality.

After examining this data, you might wonder why web developers don’t compress their images. The apparent reason is that no automated process exists to do it cheaply and quickly, but compression can be automated at several levels:

  • At build time: this ensures no image makes it into the site without compression.
  • At upload time: for images added dynamically through a CMS, web developers can hook into the upload process to compress them.
  • Before serving: as a last option, web developers can compress images right before serving them to end users.
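A build-time hook, the first option above, can be as simple as walking the assets directory and queueing every image for compression. The compress callback here is a placeholder for whichever tool chain the build uses (e.g. the jpegtran/pngcrush/cwebp passes discussed earlier).

```python
# Minimal build-time sketch: find every image under an assets
# directory and hand each one to a compression callback, so that
# nothing ships unoptimized. Returns the list of queued paths.

import os

def optimize_assets(root, compress, exts=(".jpg", ".jpeg", ".png")):
    queued = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(exts):
                path = os.path.join(dirpath, name)
                compress(path)   # placeholder: run the real tool here
                queued.append(path)
    return queued
```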