
A new PNG spec

(www.programmax.net)
614 points | bluedel
qwertox ◴[] No.44373847[source]
> Officially supports Exif data

Probably the best news here. While you can already write custom data into a custom chunk, having standardized Exif support is good.

BTW: Does Exif have magnetometer (rotation) and acceleration (gravity) fields? I often wonder why Google isn't saving this information in the images the camera app saves. It could help so much with post-processing, like leveling the horizon or creating panoramas.
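
For what it's worth, baseline Exif does have GPSImgDirection for compass heading, but as far as I know there's no standard accelerometer/gravity tag; vendors tend to bury that in MakerNote blobs. Here's a rough sketch of writing Exif into a PNG with Pillow (assuming a version recent enough to write the eXIf chunk; the tag numbers and values below are just examples):

    # Hypothetical example: embed an Exif block in a PNG via Pillow.
    from PIL import Image

    im = Image.new("RGB", (640, 480), "gray")   # stand-in for a real photo

    exif = Image.Exif()
    exif[274] = 1                      # Orientation: 1 = pixels stored upright
    exif[306] = "2025:06:25 12:00:00"  # DateTime

    im.save("photo.png", exif=exif)    # written as an eXIf chunk

    with Image.open("photo.png") as reread:
        print(dict(reread.getexif()))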

replies(7): >>44373955 #>>44373957 #>>44373987 #>>44375555 #>>44376496 #>>44382700 #>>44384608 #
Aardwolf ◴[] No.44373955[source]
Exif can also cause confusion about how to render the image: should its rotation be applied or not?

Old and new decoders could now render an image with Exif rotation differently, since it's an optional chunk that can be ignored. Even for new decoders, the spec gives no recommendation on how the Exif rotation should be applied.

It does say "It is recommended that unless a decoder has independent knowledge of the validity of the Exif data, the data should be considered to be of historical value only.", so hopefully renderers won't apply the rotation. But it's only a vague recommendation; there's no strict "don't rotate the image", which would be the only backwards-compatible rule.

With JPEG's Exif there have also been bugs where the rotation gets applied twice, e.g. the desktop environment and the underlying library each doing it independently.
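
The usual way around the double-rotation bug is to apply the orientation exactly once and then drop the tag. A minimal sketch with Pillow's ImageOps.exif_transpose (file names are just examples):

    from PIL import Image, ImageOps

    with Image.open("input.jpg") as im:
        # Rotates/flips the pixels according to Exif tag 274 (Orientation)
        # and removes the tag from the returned copy.
        upright = ImageOps.exif_transpose(im)
        upright.save("normalized.jpg")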

replies(1): >>44374585 #
DidYaWipe ◴[] No.44374585[source]
The stupid thing is that any device with an orientation sensor is still writing images the wrong way and then setting a flag, expecting every viewing application to rotate the image.

The camera knows which way it's oriented, so it should just write the pixels out in the correct order. Write the upper-left pixel first. Then the next one. And so on. WTF.

replies(4): >>44374826 #>>44375663 #>>44376252 #>>44378679 #
Someone ◴[] No.44378679[source]
> The camera knows which way it's oriented, so it should just write the pixels out in the correct order. Write the upper-left pixel first. Then the next one. And so on. WTF.

The hardware is likely optimized for the common case, so I would think that could be a lot slower. It wouldn’t surprise me, for example, if there are image sensors out there that can only be read out in top-to-bottom, left-to-right order.

Also, with RAW images and sensors that aren’t rectangular grids, I think that would complicate RAW parsing. Code for it could have to support up to four different layouts, depending on how the sensor is designed.

replies(2): >>44382729 #>>44385564 #
account42 ◴[] No.44385564[source]
Sensors aren't read out directly into a JPEG; they're read into intermediate memory first. The encoding step can then handle any needed rotation.

RAW images aren't JPEGs, so they're not relevant to the discussion.
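
As a rough illustration of "deal with it at encode time" (names like sensor_frame and device_rotation are placeholders, not anything from a real camera pipeline):

    import numpy as np
    from PIL import Image

    sensor_frame = np.zeros((3000, 4000, 3), dtype=np.uint8)  # landscape readout from the sensor
    device_rotation = 90   # degrees, e.g. from the phone's orientation sensor

    # Rotate the in-memory buffer instead of setting an Orientation flag.
    # (Sign convention depends on how the sensor is mounted.)
    k = (device_rotation // 90) % 4
    upright = np.rot90(sensor_frame, k=k)

    Image.fromarray(upright).save("photo.jpg")  # pixels already upright, no flag needed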