
A new PNG spec

(www.programmax.net)
616 points by bluedel | 4 comments
ksec ◴[] No.44375919[source]
It is just a spec for something that is already widely implemented.

Assuming next-gen PNG will still require a new decoder, they could just call it PNG2.

JPEG XL already provides everything most people have asked for in a lossless codec. If it has any problem, it is its encoding and decoding speed and resource usage.
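
For anyone who wants to see that trade-off concretely, here is a minimal sketch, assuming the reference cjxl encoder is on PATH (-d 0 means lossless and -e is the effort level; the file names are placeholders):

    import subprocess, time, os

    SRC = "input.png"  # placeholder test image

    # Sweep cjxl effort levels: higher effort means a denser file but a slower encode.
    for effort in (1, 4, 7, 9):
        out = f"out_e{effort}.jxl"
        t0 = time.perf_counter()
        subprocess.run(["cjxl", SRC, out, "-d", "0", "-e", str(effort)],
                       check=True, capture_output=True)
        dt = time.perf_counter() - t0
        print(f"effort {effort}: {dt:.2f}s, {os.path.getsize(out)} bytes")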

The current champion among lossless image codecs is HALIC. https://news.ycombinator.com/item?id=38990568

replies(9): >>44376276 #>>44377213 #>>44377942 #>>44378201 #>>44378431 #>>44378924 #>>44379251 #>>44379537 #>>44379896 #
illiac786 ◴[] No.44377213[source]
> If there are any problems it is its encoding and decoding speed and resources.

And this will improve over time, like JPEG encoders and decoders did.

replies(2): >>44377499 #>>44385515 #
1. ksec ◴[] No.44377499[source]
I hope I am very wrong, but this isn't a given. In the past, reference encoders and decoders did not concern themselves with speed and resources, but the last 10 years have shown that most reference encoders and decoders already put considerable effort into speed optimisation. And it seems people are already looking at hardware JPEG XL implementations. (I hope and guess this is for lossless only.)
replies(1): >>44378208 #
2. illiac786 ◴[] No.44378208[source]
I would agree we will see smaller improvements than we did between the reference JPEG implementation and modern ones.

When it comes to hardware encoding/decoding, I don't think I am following your point. The fact that some are already looking at a hardware implementation for JPEG XL means that…?

I just know that JPEG hardware acceleration is quite common, hence I am trying to understand what makes JPEG XL different/better/worse here.

replies(1): >>44380985 #
3. ksec ◴[] No.44380985[source]
In terms of PC usage, JPEG decoding, like that of most image codecs, is done in software, not hardware. AFAIK even AVIF decoding is done in software in browsers.

Hardware acceleration for lossless makes more sense for JPEG XL because it is currently very slow. As the author of HALIC posted in some results below, JPEG XL is about 20-50x slower than HALIC while still requiring lots of memory even after memory optimisation, and about 10-20x slower than other lossless codecs. JPEG XL is already used by cameras and stored as DNG, but encoding resources are limiting its reach. Hence a hardware encoder would be great.
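
If you want to sanity-check the speed and memory claims on your own images, here is a rough probe, assuming cjxl is installed and a POSIX system (the resource module is POSIX-only, and ru_maxrss is KiB on Linux but bytes on macOS):

    import subprocess, time, resource

    # Time one lossless encode and read the peak RSS of the child process.
    t0 = time.perf_counter()
    subprocess.run(["cjxl", "input.png", "out.jxl", "-d", "0"],
                   check=True, capture_output=True)
    dt = time.perf_counter() - t0
    peak_kib = resource.getrusage(resource.RUSAGE_CHILDREN).ru_maxrss
    print(f"encode: {dt:.2f}s, peak child RSS ~{peak_kib / 1024:.0f} MiB (Linux units)")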

For lossy JPEG XL, not so much. Just like with video codecs, hardware encoders tend to focus on speed, and it takes multiple iterations, or 5-10 years, before they catch up on quality. JPEG XL is relatively new, with so many tools and usage optimisations that even the current software encoder is far from reaching the codec's potential. And I don't want a crappy-quality JPEG XL hardware encoder, hence I much prefer an upgradeable software encoder for lossy JPEG XL and a hardware encoder for lossless JPEG XL.

replies(1): >>44387852 #
4. spider-mario ◴[] No.44387852{3}[source]
Lossless JPEG XL encoding is already fast in software and scales very well with the number of cores. With a few cores, it can easily compress 100 megapixels per second or more. (The times you see in the comment with the DPReview samples are single-threaded and for a total of about 400 MP, since each image is 101.8 MP.)
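
A quick way to check that scaling yourself, as a minimal sketch assuming cjxl and Pillow are installed (--num_threads is cjxl's thread-count flag; the file name is a placeholder):

    import subprocess, time
    from PIL import Image  # Pillow, used only to get the image dimensions

    SRC = "sample.png"  # placeholder large test image
    w, h = Image.open(SRC).size
    mp = w * h / 1e6  # megapixels

    # Measure lossless encode throughput at different thread counts.
    for threads in (1, 2, 4, 8):
        t0 = time.perf_counter()
        subprocess.run(["cjxl", SRC, "out.jxl", "-d", "0",
                        f"--num_threads={threads}"],
                       check=True, capture_output=True)
        dt = time.perf_counter() - t0
        print(f"{threads} threads: {mp / dt:.1f} MP/s")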