Lepton Image Compression Achieves 22% Lossless Compression of JPEG Images (on Average)

Dropbox stores billions of images on its servers, most of them JPEGs, so reducing picture size can have a big impact on its storage requirements. The company therefore developed Lepton image compression, which achieves 22% lossless compression, on average, on the images stored in its cloud.

Compression and decompression speed also matters, since files are compressed when uploaded and decompressed on the fly when downloaded, so that the whole process is transparent to end users, who only ever see JPEG photos. The company claims 5 MB/s compression and 15 MB/s decompression, again on average.

The good news is that the company has released the Lepton implementation on GitHub, so in theory it could also be used to increase the capacity of a NAS holding lots of pictures. So I’ve given it a try in a terminal window in Ubuntu 14.04, but it can also be built on Windows with Visual Studio:
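The terminal commands were stripped from this copy of the article; based on the project’s README, building on Ubuntu goes roughly like this (the dependency package list is my assumption):

```shell
# Build dependencies (package names assumed for Ubuntu 14.04)
sudo apt-get install git build-essential autoconf automake

# Fetch, build, and run the test suite
git clone https://github.com/dropbox/lepton
cd lepton
./autogen.sh
./configure
make
make check
```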


If everything went well, the last step should report that all tests passed:


I also installed it in my path with:
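The exact command was not preserved; a typical way to put the freshly built binary on the path would be (install location is an assumption):

```shell
# Copy the binary to a directory that is usually on $PATH
sudo cp lepton /usr/local/bin/
```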


Now let’s go to some directory with photos I took with a DSLR camera:
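The original listing was stripped; the counts below could be obtained with something like this (the directory name is hypothetical):

```shell
# Count the JPEGs and total their size (directory name is hypothetical)
cd ~/Pictures/DSLR
ls -1 *.JPG | wc -l        # number of pictures
du -ch *.JPG | tail -n 1   # total size
```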


44 pictures totaling 264 MB. I’ll compress them all, but first, let’s try with one to check the size difference and see if it is indeed lossless.
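The command itself is missing from this copy; per Lepton’s usage, compressing a single file looks like this (the file name is hypothetical):

```shell
# lepton <input.jpg> <output.lep> — file names are hypothetical
lepton IMG_0001.JPG IMG_0001.lep
```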


The Lepton file is indeed smaller, by 21.66%:
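The size listing was stripped; a simple side-by-side comparison (file names hypothetical) would be:

```shell
# Compare input and output sizes side by side (file names hypothetical)
ls -l IMG_0001.JPG IMG_0001.lep
```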


Now let’s uncompress the file and see if there’s any difference:
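The commands were stripped here; a round-trip check along these lines would do it (file names hypothetical):

```shell
# Decompress, then compare byte-for-byte; diff prints nothing if identical
lepton IMG_0001.lep IMG_0001_restored.jpg
diff IMG_0001.JPG IMG_0001_restored.jpg
```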


The diff did not generate any output, so the compression is indeed lossless.

Time to compress all 44 photos:
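The batch command is missing from this copy; a plain loop works since lepton is itself multi-threaded (the `.JPG` extension is an assumption from the single-file example):

```shell
# Compress every JPEG in the directory, timing the whole run;
# ${f%.JPG}.lep swaps the extension for the output name
time for f in *.JPG; do lepton "$f" "${f%.JPG}.lep"; done
```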


I did so on a machine with an AMD FX8350 processor, and 8 cores were used during compression. The command took 5 minutes and 8 seconds, or about 7 seconds per picture. What about the size?:
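The output was stripped; totaling the compressed files could be done with:

```shell
# Total size of the compressed files
du -ch *.lep | tail -n 1
```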


That’s 207 MB down from 264 MB, or about 21.6% compression.
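As a sanity check on the arithmetic, the saving can be computed from the two totals:

```shell
# Percentage saved, given original and compressed sizes in the same unit
saved_pct() { awk -v o="$1" -v c="$2" 'BEGIN { printf "%.2f\n", (o - c) * 100 / o }'; }

saved_pct 264 207   # the article's totals in MB → 21.59
```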

Via Phoronix


Comments

tkaiser:
The blog post explaining the algorithm is interesting, since it works with predictions, so I would assume it’s optimized for JPEGs spit out by mobile phones and DCams (these will make up 99.99% of images stored on Dropbox). I will test that later with JPEG images created with the optimized encoders we use in various workflows (it’s always a trade-off between speed, image quality and size).

BTW: It should read just ‘./configure’ (without .sh here) and compilation fails on ARM:
In file included from src/vp8/decoder/boolreader.hh:33:0,
from src/vp8/decoder/boolreader.cc:15:
src/vp8/decoder/../model/numeric.hh:11:23: fatal error: smmintrin.h: No such file or directory
compilation terminated.
Makefile:1814: recipe for target ‘src/vp8/decoder/boolreader.o’ failed
Since lepton can be streamed, I wanted to check whether I can switch to that and add transparent decompression/conversion on the fly on my web server (A20 Lime2 soon).

Member:

That was very informative. I’ve been hoarding so many images, that I’m compelled to try it out as well! Many thanks.

tkaiser:

Should now work on other platforms than x86 (SSE4 instructions used) when doing