
A new PNG spec

(www.programmax.net)
618 points | bluedel | 5 comments
ksec | No.44375919
It is just a spec for something that is already widely implemented.

Assuming a next-gen PNG will still require a new decoder, they could just call it PNG2.

JPEG-XL already provides everything most people ask of a lossless codec. If it has any problems, they are its encoding/decoding speed and resource usage.

The current champion among lossless image codecs is HALIC: https://news.ycombinator.com/item?id=38990568

1. HakanAbbas | No.44379896
I don't really understand what the new PNG does better; elements such as speed and compression ratio are not mentioned. Thanks also for your kind words, ksec.

Apart from widespread codec support, there are three important factors: processing speed, compression ratio, and memory usage. All of them are weighed together when making a decision (the Pareto limit). In other words, being the fastest or compressing the best on its own does not matter; judging a codec by a single metric suggests insufficient knowledge and experience with the subject.
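
As a minimal illustration of this Pareto-limit idea (not anything from the thread itself): a codec/setting is only interesting if no other candidate beats it on every axis at once. The numbers below are placeholders loosely shaped like the figures in this thread, not measurements.

    from dataclasses import dataclass

    @dataclass
    class Result:
        name: str
        encode_s: float  # encode time in seconds (lower is better)
        size_mb: float   # compressed size in MB (lower is better)
        mem_mb: float    # peak memory in MB (lower is better)

    def dominates(a, b):
        """a dominates b if it is no worse on every axis and strictly better on at least one."""
        pairs = [(a.encode_s, b.encode_s), (a.size_mb, b.size_mb), (a.mem_mb, b.mem_mb)]
        return all(x <= y for x, y in pairs) and any(x < y for x, y in pairs)

    def pareto_front(results):
        """Keep only the results that no other result dominates."""
        return [r for r in results if not any(dominates(o, r) for o in results if o is not r)]

    if __name__ == "__main__":
        candidates = [
            Result("fast codec, fast mode",   3.5, 382.0,   20.0),
            Result("fast codec, normal mode", 5.1, 369.4,   20.0),
            Result("strong codec, -e1",      17.8, 414.7, 1100.0),
            Result("strong codec, -e4",     261.2, 357.7, 1100.0),
        ]
        for r in pareto_front(candidates):
            print(r)

Here the "-e1" setting drops out because another candidate is faster, smaller, and lighter on memory at the same time, while the remaining three each win on at least one axis.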

HALIC is very good at lossless image compression in terms of speed versus compression ratio, and it uses a comically small amount of memory. Nobody has said whether that is actually necessary, and low memory usage works against both processing speed and compression ratio. You only see HALIC's real performance on large (20+ MPixel) images, single- and multi-threaded. An example of a recent test is below (a rough sketch of how such timings can be reproduced follows the table). During these operations HALIC uses only about 20 MB of memory, while JXL uses more than 1 GB.

https://www.dpreview.com/sample-galleries/6970112006/fujifil...

June 2025, i7-3770K, single-thread results

----------------------------------------------------
First 4 JPG images to PPM, total 1,100,337,479 bytes

HALIC NORMAL:     5.143s    6.398s   369,448,062 bytes
HALIC FAST:       3.481s    5.468s   381,993,631 bytes
JXL 0.11.1 -e1:  17.809s   28.893s   414,659,797 bytes
JXL 0.11.1 -e2:  39.732s   26.195s   369,642,206 bytes
JXL 0.11.1 -e3:  81.869s   72.354s   371,984,220 bytes
JXL 0.11.1 -e4: 261.237s   80.128s   357,693,875 bytes

----------------------------------------------------
First 4 RAW images to PPM, total 1,224,789,960 bytes

HALIC NORMAL:     5.872s    7.304s   400,942,108 bytes
HALIC FAST:       3.842s    6.149s   414,113,254 bytes
JXL 0.11.1 -e1:  19.736s   32.411s   457,193,750 bytes
JXL 0.11.1 -e2:  42.845s   29.807s   413,731,858 bytes
JXL 0.11.1 -e3:  87.759s   81.152s   402,224,531 bytes
JXL 0.11.1 -e4: 259.400s   83.041s   396,079,448 bytes

----------------------------------------------------
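
For context, here is a rough, hypothetical sketch of how such timings could be reproduced for the JPEG XL side, assuming the two time columns are encode and decode time and that the libjxl CLI tools cjxl and djxl are installed; the exact HALIC command line isn't given here, so it is left out. Peak memory can be checked separately, e.g. with GNU time (/usr/bin/time -v).

    # Hypothetical timing harness, not the one used for the numbers above.
    # Losslessly encodes PPM files with cjxl (-d 0) at several effort levels,
    # decodes them back with djxl, and reports wall-clock times and sizes.
    import os
    import subprocess
    import sys
    import time

    def run_timed(cmd):
        """Run a command, discard its output, return elapsed wall-clock seconds."""
        t0 = time.perf_counter()
        subprocess.run(cmd, check=True, capture_output=True)
        return time.perf_counter() - t0

    def bench_jxl(ppm_path, effort):
        jxl_path = f"{ppm_path}.e{effort}.jxl"
        dec_path = f"{ppm_path}.e{effort}.dec.ppm"
        enc_s = run_timed(["cjxl", ppm_path, jxl_path, "-d", "0", "-e", str(effort)])
        dec_s = run_timed(["djxl", jxl_path, dec_path])
        return enc_s, dec_s, os.path.getsize(jxl_path)

    if __name__ == "__main__":
        for ppm in sys.argv[1:]:
            for effort in (1, 2, 3, 4):
                enc_s, dec_s, size = bench_jxl(ppm, effort)
                print(f"{os.path.basename(ppm)} -e{effort}: "
                      f"enc {enc_s:.3f}s  dec {dec_s:.3f}s  {size:,} bytes")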

I had a very busy time with HALAC, and now I've given it a break too. Maybe I can go back to HALIC, which I left unfinished, and do better: stronger compression and/or faster. Or I could make it work much better on synthetic images, and I could also add a near-lossless mode. But I don't know whether it is worth the time I would have to spend on it.

2. account42 | No.44385537
> In other words, being the fastest or compressing the best on its own does not matter.

Strictly true, but for archival, or for content delivered to many users, compression speed and the memory needed for compression are an afterthought compared to the compressed size.

3. HakanAbbas | No.44387324
Storage is cheaper than it used to be, and bandwidth is also cheaper than it used to be (though not as cheap as storage). So high-quality lossy and lossless techniques can be adopted more widely than low-quality lossy compression. Meanwhile, processor cores are not getting much faster, and energy is still not cheap. That is why, in all my work, processing speed (i.e. energy consumption) is a much higher priority for me.
4. boogerlad | No.44387492
You're right, but aren't you forgetting that the encode cost is paid just once per image, while the decode cost is paid many, many times? That's why I think it's important to optimize for size and decode time.
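
A back-of-the-envelope way to see this point, with purely illustrative numbers (nothing here is taken from the benchmark above): the total CPU cost of an image is its encode time plus decode time multiplied by the number of views, so once an image is viewed often enough, a slower encoder whose files decode faster comes out ahead.

    # Illustrative amortization of encode-once / decode-many; hypothetical numbers only.
    def total_cpu_seconds(encode_s, decode_s, views):
        """Encode is paid once; decode is paid once per view."""
        return encode_s + decode_s * views

    # Codec X: fast encode, slower decode. Codec Y: slow encode, faster decode.
    x_enc, x_dec = 5.0, 0.50
    y_enc, y_dec = 60.0, 0.30

    # Break-even number of views, where Y's slower encode has paid for itself.
    break_even = (y_enc - x_enc) / (x_dec - y_dec)
    print(f"break-even at {break_even:.0f} views")  # 275 views

    for views in (10, 1_000, 100_000):
        print(views, total_cpu_seconds(x_enc, x_dec, views), total_cpu_seconds(y_enc, y_dec, views))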
5. HakanAbbas | No.44390084
HALIC already decodes much faster than the other codecs, and when you look at the compression ratios they are almost the same, so this doesn't seem to be a problem. There are also cases where encode speed is especially important. But I don't think it's worth spending a lot more energy to gain a few percent more compression and then to decode it.