
A new PNG spec

(www.programmax.net)
616 points | bluedel | 28 comments
1. ksec ◴[] No.44375919[source]
It is just a spec for something that is already widely implemented.

Assuming next-gen PNG will still require a new decoder, they could just call it PNG2.

JPEG XL already provides everything most people ask for in a lossless codec. If there is any problem, it is its encoding and decoding speed and resource usage.

The current champion of lossless image codecs is HALIC. https://news.ycombinator.com/item?id=38990568

replies(9): >>44376276 #>>44377213 #>>44377942 #>>44378201 #>>44378431 #>>44378924 #>>44379251 #>>44379537 #>>44379896 #
2. klabb3 ◴[] No.44376276[source]
What about transparency? That’s the main benefit of PNG imo.
replies(1): >>44376995 #
3. cmiller1 ◴[] No.44376995[source]
Yes, JPEG XL has an alpha channel.
4. illiac786 ◴[] No.44377213[source]
> If there is any problem, it is its encoding and decoding speed and resource usage.

And this will improve over time, like JPEG encoders and decoders did.

replies(2): >>44377499 #>>44385515 #
5. ksec ◴[] No.44377499[source]
I hope I am very wrong, but this isn't a given. In the past, reference encoders and decoders did not concern themselves with speed and resources, but the last 10 years have shown that most reference encoders and decoders already put considerable effort into speed optimisation. And it seems people are already looking at hardware JPEG XL implementations. (I hope and guess this is for lossless only.)
replies(1): >>44378208 #
6. thesz ◴[] No.44377942[source]
HALIC discussion page [1] says otherwise.

[1] https://encode.su/threads/4025-HALIC-(High-Availability-Loss...

It looks like LEA 0.5 is the champion.

And HALIC is not even close to the top ten in this [2] lossless image compression benchmark.

[2] https://github.com/WangXuan95/Image-Compression-Benchmark

replies(1): >>44380612 #
7. voxleone ◴[] No.44378201[source]
I'm using PNG in a computer vision image annotation tool[0]. The idea is to store the class labels directly in the image (dispensing with the sidecar text files), taking advantage of PNG's beautiful metadata capabilities. The next step is to build a specialized extension of the format for this kind of task.

[0]https://github.com/VoxleOne/XLabel
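
For illustration, a minimal sketch of the general idea (not the XLabel code itself), using Pillow's PNG text chunks. The "labels" key, file names, and annotation format here are made up:

    import json
    from PIL import Image
    from PIL.PngImagePlugin import PngInfo

    # hypothetical annotation for one image
    labels = [{"class": "cat", "bbox": [10, 20, 100, 120]}]

    img = Image.open("frame.png")
    meta = PngInfo()
    meta.add_text("labels", json.dumps(labels))  # stored as a tEXt chunk
    img.save("frame_annotated.png", pnginfo=meta)

    # reading it back: Pillow exposes text chunks via the .text mapping
    print(Image.open("frame_annotated.png").text["labels"])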

8. illiac786 ◴[] No.44378208{3}[source]
I would agree we will see fewer improvements than when comparing modern JPEG implementations to the reference one.

When it comes to hardware encoding/decoding, I am not following your point, I think. The fact that some are already looking at hardware implementations of JPEG XL means that…?

I just know JPEG hardware acceleration is quite common, hence I am trying to understand how that makes JPEG XL different/better/worse?

replies(1): >>44380985 #
9. yyyk ◴[] No.44378431[source]
When it comes to metadata, a format not being widely implemented (yet) is not that big a problem. Select tools will do for metadata, so this is an advancement for PNG.
10. bla3 ◴[] No.44378924[source]
WebP lossless is close to state of the art and widely available. It's also not widely used. The takeaway seems to be that absolute best performance for lossless compression isn't that important, or at least it won't get you widely adopted.
replies(4): >>44379059 #>>44379693 #>>44379884 #>>44385510 #
11. adzm ◴[] No.44379059[source]
Only downside is that WebP lossless requires an RGB colorspace, so you can't, for example, save direct YUV frames from a video losslessly. AVIF lossless does support this, though.
12. Aloisius ◴[] No.44379251[source]
I'll be honest, I ignored JPEG XL for a couple years because I assumed that it was merely for extra large images.
13. ChrisMarshallNY ◴[] No.44379537[source]
Looks like it's basically reaffirming what a lot of folks have been doing, unofficially.

For myself, I use PNG only for computer-generated still images. I tend to use good ol' JPEG for photos.

14. mchusma ◴[] No.44379693[source]
I don't know that I have ever used JPEG or PNG lossless in practical usage (e.g. I don't think 99.9% of mobile app or web use cases are for lossless). WebP lossy performance is just not worth it in practice, which is why WebP never took off, IMO.

Are there use cases for lossless other than archival?

replies(2): >>44381619 #>>44388048 #
15. ProgramMax ◴[] No.44379884[source]
WebP maxes out at 8 bits per channel. For HDR, you really need 10 or 12 bits.

WebP is amazing. But if I were going to label something "state of the art", I would go with JPEG XL :)

16. HakanAbbas ◴[] No.44379896[source]
I don't really understand what the new PNG does better. Elements such as speed or compression ratio are not mentioned. Thanks also for your kind thoughts, ksec.

Apart from widespread codec support, there are 3 important elements: processing speed, compression ratio and memory usage. These are weighed together when making a decision (the Pareto limit). In other words, being the fastest or compressing the best alone does not matter. Seeing it otherwise can be interpreted as insufficient knowledge and experience of the subject.

HALIC is very good at lossless image compression in terms of speed/compression ratio. It also uses a comically small amount of memory. No one mentioned whether this was necessary or not. However, low memory usage negatively affects both the processing speed and the compression ratio. You can only see the real performance of HALIC on large (20 MPixel+) images (single- and multi-threaded). An example of a current test is below. During these operations, HALIC uses only about 20 MB of memory, while JXL uses more than 1 GB.

https://www.dpreview.com/sample-galleries/6970112006/fujifil...

June 2025, i7-3770K, Single-Thread Results
(columns: encode time, decode time, compressed size)

----------------------------------------------------

First 4 JPG images converted to PPM, total 1,100,337,479 bytes

HALIC NORMAL:     5.143s    6.398s   369,448,062 bytes
HALIC FAST:       3.481s    5.468s   381,993,631 bytes
JXL 0.11.1 -e1:  17.809s   28.893s   414,659,797 bytes
JXL 0.11.1 -e2:  39.732s   26.195s   369,642,206 bytes
JXL 0.11.1 -e3:  81.869s   72.354s   371,984,220 bytes
JXL 0.11.1 -e4: 261.237s   80.128s   357,693,875 bytes

----------------------------------------------------

First 4 RAW images converted to PPM, total 1,224,789,960 bytes

HALIC NORMAL:     5.872s    7.304s   400,942,108 bytes
HALIC FAST:       3.842s    6.149s   414,113,254 bytes
JXL 0.11.1 -e1:  19.736s   32.411s   457,193,750 bytes
JXL 0.11.1 -e2:  42.845s   29.807s   413,731,858 bytes
JXL 0.11.1 -e3:  87.759s   81.152s   402,224,531 bytes
JXL 0.11.1 -e4: 259.400s   83.041s   396,079,448 bytes

----------------------------------------------------
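
For context, here is a minimal sketch (not the actual harness used above) of how the JXL side of such a run could be scripted, assuming the reference cjxl/djxl tools from libjxl are installed; the PPM file names are placeholders. It reports encode time, decode time and compressed size per effort level:

    import os
    import subprocess
    import time

    FILES = ["img1.ppm", "img2.ppm", "img3.ppm", "img4.ppm"]  # hypothetical names

    for effort in (1, 2, 3, 4):
        enc = dec = size = 0
        for src in FILES:
            jxl = src + ".jxl"
            t0 = time.perf_counter()
            # -d 0 selects lossless (distance 0), -e sets the effort level
            subprocess.run(["cjxl", src, jxl, "-d", "0", "-e", str(effort)],
                           check=True, capture_output=True)
            enc += time.perf_counter() - t0
            size += os.path.getsize(jxl)
            t0 = time.perf_counter()
            subprocess.run(["djxl", jxl, src + ".roundtrip.ppm"],
                           check=True, capture_output=True)
            dec += time.perf_counter() - t0
        print(f"-e{effort}: encode {enc:.3f}s  decode {dec:.3f}s  {size:,} bytes")

(The numbers above were measured single-threaded; cjxl is multi-threaded by default, so the thread count would need to be pinned for a like-for-like comparison.)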

I had a very busy time with HALAC. Now I've given it a break, too. Maybe I can go back to HALIC, which I left unfinished, and do better, that is, compress more and/or run faster. Or I can make it work much better on synthetic images. I could also add a near-lossless mode. But I don't know if it's worth the time I would have to spend on it.

replies(1): >>44385537 #
17. poly2it ◴[] No.44380612[source]
It looks like HALIC offers very impressive decode speeds within its compression range.
replies(1): >>44381031 #
18. ksec ◴[] No.44380985{4}[source]
In terms of PC usage, JPEG, and most other image codecs, are decoded in software, not hardware. AFAIK even AVIF decoding is done in software in browsers.

Hardware acceleration for lossless makes more sense for JPEG XL because it is currently very slow. As the author of HALIC posted in some results below, JPEG XL is about 20-50x slower while requiring lots of memory even after memory optimisation, and about 10-20x slower than other lossless codecs. JPEG XL is already used by cameras and stored as DNG, but encoding resources are limiting its reach. Hence a hardware encoder would be great.

For lossy JPEG XL, not so much. Just like with video codecs, hardware encoders tend to focus on speed, and it takes multiple iterations, or 5-10 years, before they catch up on quality. JPEG XL is relatively new, with so many tools and usage optimisations that even the current software encoder is far from reaching the codec's potential. And I don't want a crappy-quality JPEG XL hardware encoder, hence I much prefer an upgradeable software encoder for lossy JPEG XL and a hardware encoder for lossless JPEG XL.

replies(1): >>44387852 #
19. ksec ◴[] No.44381031{3}[source]
And not just decoding speed but also encoding speed, with a difference of an order of magnitude. There are some new results further down in the comments in this thread. Had it not been verified, I would have thought it was a scam.
20. Inityx ◴[] No.44381619{3}[source]
Asset pipelines for media creation benefit greatly from better compression of lossless images and video.
21. account42 ◴[] No.44385510[source]
Last I checked, cwebp does not preserve PNG color space information properly, so the result isn't actually visually lossless.
22. account42 ◴[] No.44385515[source]
Or it won't, like JPEG 2000 encoders didn't.
replies(1): >>44386595 #
23. account42 ◴[] No.44385537[source]
> In other words, being the fastest or compressing the best alone does not matter.

Strictly true, but e.g. for archival, or for content delivered to many users, the compression speed and memory needed for compression are an afterthought compared to the compressed size.

replies(1): >>44387324 #
24. illiac786 ◴[] No.44386595{3}[source]
I mean, if JXL becomes mainstream, of course.
25. HakanAbbas ◴[] No.44387324{3}[source]
Storage is cheaper than it used to be. Bandwidth is also cheaper than it used to be (though not as cheap as storage). So high quality lossy techniques and lossless techniques can be adopted more than low quality lossy compression techniques. Today, processor cores are not getting much faster. And energy is still not cheap. So in all my work, processing speed (energy consumption) is a much higher priority for me.
replies(1): >>44387492 #
26. boogerlad ◴[] No.44387492{4}[source]
You're right, but aren't you forgetting that for each image, the encode cost needs to be paid just once, but the decode time must be paid many many times? Therefore, I think it's important to optimize size and decode time.
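
A toy cost model (all numbers hypothetical) of why the per-view terms dominate once an image is viewed many times:

    # one encode, then N downloads + decodes; the numbers below are made up
    def total_seconds(encode_s, decode_s, size_mb, views, s_per_mb=0.02):
        return encode_s + views * (decode_s + size_mb * s_per_mb)

    # a slow encoder that shaves decode time and size wins at high view counts
    print(total_seconds(encode_s=30.0, decode_s=0.05, size_mb=1.2, views=1_000_000))  # ~74,030 s
    print(total_seconds(encode_s=1.0,  decode_s=0.08, size_mb=1.5, views=1_000_000))  # ~110,001 s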
27. spider-mario ◴[] No.44387852{5}[source]
Lossless JPEG XL encoding is already fast in software and scales very well with the number of cores. With a few cores, it can easily compress 100 megapixels per second or more. (The times you see in the comment with the DPReview samples are single-threaded and cover a total of about 400 MP, since each image is 101.8 MP.)
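
A quick back-of-the-envelope check against the RAW-set numbers above, assuming the first column is single-threaded encode time:

    total_mp = 4 * 101.8              # four images of 101.8 MP each
    encode_s = 19.736                 # JXL 0.11.1 -e1 from the table above
    print(total_mp / encode_s)        # ~20.6 MP/s on one core, so a handful
                                      # of cores lands around 100 MP/s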
28. kbolino ◴[] No.44388048{3}[source]
I definitely noticed when the Play Store switched to lossy icons. I can still notice it to this day, though they did at least make it harder to notice (it was especially apparent on low-DPI displays). Fortunately, the apps once installed still seem to use lossless icons.

A lot of images should be lossless. Icons/pictograms/emoji, diagrams and line drawings (when rasterized), screenshots, etc. You can sometimes get away with large-resolution lossy for some of these if you scale it down, but that doesn't necessarily translate into a smaller file size than a lossless image at the intended resolution.

There's another problem with lossy images, which is re-encoding. Any app/site that lets you upload/share an image but also insists on re-encoding it can quickly turn it into pixelated mush.
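
A small sketch of that re-encoding problem (file names hypothetical): round-tripping the same image through lossy JPEG a few times accumulates error, which is most visible on sharp-edged content like screenshots and icons:

    import io
    from PIL import Image, ImageChops

    def reencode(img, rounds=10, quality=70):
        """Simulate a pipeline that re-encodes the image as JPEG on every share."""
        for _ in range(rounds):
            buf = io.BytesIO()
            img.save(buf, format="JPEG", quality=quality)
            buf.seek(0)
            img = Image.open(buf).convert("RGB")
        return img

    original = Image.open("screenshot.png").convert("RGB")
    mangled = reencode(original)

    # average per-pixel error after 10 generations
    diff = ImageChops.difference(original, mangled).convert("L")
    print(sum(diff.getdata()) / (diff.width * diff.height))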