This worries me. Because presumably, changing the compression algorithm will break backwards compatibility, which means we'll start to see "png" files that aren't actually png files.
It'll be like USB-C but for images.
The first bit of our research is "What can we already make use of that requires no spec update?" There are plenty of PNG optimizers. How much of that work should go into the typical PNG libraries?
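To make that concrete, here's a rough sketch (my own illustration, not how any particular library does it) of the kind of brute-force search tools like optipng and zopflipng do: try several deflate settings on the filtered scanline data and keep the smallest result. The output is still an ordinary, spec-compliant PNG stream.

    import zlib

    def smallest_idat(filtered_scanlines: bytes) -> bytes:
        """Try a few deflate settings and keep the smallest output.
        This is the spirit of PNG optimizers: search harder at encode
        time while the file stays a plain, spec-compliant PNG."""
        candidates = []
        for level in (6, 9):
            for strategy in (zlib.Z_DEFAULT_STRATEGY, zlib.Z_FILTERED):
                comp = zlib.compressobj(level=level, strategy=strategy)
                candidates.append(comp.compress(filtered_scanlines) + comp.flush())
        return min(candidates, key=len)

Real optimizers also search over the per-scanline filter choices, which is where most of the wins come from, but the principle is the same: spend more encode time, change nothing about the format.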
Same goes for parallel encoding & decoding. An older image viewer will be able to decode the file on one thread without ever knowing parallel decoding was an option.
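As an illustration of how that can work (a sketch of the general technique, not the actual proposal): a deflate full flush resets the compressor's state and byte-aligns the stream, so an encoder can insert flush points every N rows and record their byte offsets in a new ancillary chunk. A minimal sketch in Python:

    import zlib

    def compress_with_restart_points(rows, rows_per_slice):
        """Emit ONE valid zlib stream, but insert Z_FULL_FLUSH points
        every `rows_per_slice` rows. The flush resets the deflate
        dictionary and byte-aligns the output, so a decoder that knows
        the offsets can start inflating at any slice independently."""
        comp = zlib.compressobj()
        out, offsets = bytearray(), []
        for i in range(0, len(rows), rows_per_slice):
            offsets.append(len(out))  # would be recorded in a new ancillary chunk
            out += comp.compress(b"".join(rows[i:i + rows_per_slice]))
            out += comp.flush(zlib.Z_FULL_FLUSH)
        out += comp.flush()  # Z_FINISH: terminate the stream
        return bytes(out), offsets

An old, single-threaded decoder just runs zlib.decompress() over the whole thing and never notices the flush points. A new decoder that reads the offsets chunk can inflate each slice on its own thread.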
Here's the worry-a-little part: everybody immediately jumps to file size as the measure of whether one image compression is better than another. That isn't the best take, but it is what it is. So there is pressure to adopt newer compression technologies.
We often do have a way to maintain some degree of backwards compatibility even when we do this. For example, we can store a downsampled image for old viewers. Then extra, new chunks tell newer decoders "mix that base with this full-scale data, using a different compression".
As you can imagine, this mixing complicates things. It might not be the best option. Sooooo we're researching it :)
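For a feel of what that mixing means, here is a toy base-plus-residual sketch (again my own illustration, not the working group's design): the downsampled base lives where old viewers already look, and the correction needed to rebuild the full-resolution image rides along in a new chunk that old viewers skip.

    import numpy as np

    def encode_layered(img, factor=2):
        """Split an image into a legacy-visible base plus a residual.
        Old viewers would show only `base`; the residual would sit in
        a new ancillary chunk, possibly under a different compression."""
        base = img[::factor, ::factor]                    # naive downsample
        up = np.repeat(np.repeat(base, factor, 0), factor, 1)
        up = up[:img.shape[0], :img.shape[1]]             # crop to original size
        residual = img.astype(np.int16) - up              # signed correction
        return base, residual

    def decode_layered(base, residual, factor=2):
        """A new decoder mixes the two layers back together, losslessly."""
        up = np.repeat(np.repeat(base, factor, 0), factor, 1)
        up = up[:residual.shape[0], :residual.shape[1]]
        return (up + residual).astype(np.uint8)

Even in this toy version you can see the complication: you pay to store both layers, and the residual only gets cheap if you compress it cleverly. That trade-off is part of why it might not be the best option.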