I'm always surprised to see bugs like this, where an extremely easy-to-test part of the spec seemingly just isn't tested, and the result is a bug that doesn't get fixed until many years later.
> On-Topic: Anything that good hackers would find interesting. That includes more than hacking and startups. If you had to reduce it to a sentence, the answer might be: anything that gratifies one's intellectual curiosity.
Someone probably thought it was interesting, and based on the fact it's on the front page and receiving comments, at least some other people agree.
I do, however, think there are quite a few bugs that might be triaged as "easy" but, if worked on, would reveal much more serious problems. That's why some random selection of "easy" issues should make it into work queues.
The way I've seen it implemented at a small company I worked at before was to explicitly endorse the "20% time" idea that Google made famous, where you may choose your own priorities for a fraction of your working time regardless of the bug tracker priority order. Even if in practice you don't actually have that spare time allocated in your schedule, it does give you some cover to tell your manager why you are prioritizing little UI papercuts over product features this week.
It gets even worse when ingesting images into Apple Photos, where you have to confront papercut bugs that you know will never be fixed.
I love ExifTool. It’s one of the great utilities. It works for almost every file I throw at it. But reading its output can be unsettling. It’s like getting a glimpse of eudaimonia, only to have it rudely interrupted by the reality of Apple Photos misreading every lens in your collection.
If the EXIF data specifies a 180° rotation, then start at 0° and gradually increase the rotation by 1° per day until full spec compliance is reached.
There are also other situations where this is useful, like a primarily hardware pipeline that doesn't support rotation itself but can mark the rotation at the end. Although this is probably less of an issue for PNG than for formats that typically come out of cameras and scanners.
Well, it could be many reasons. "Priorities" is the one I see most often for things like that not being fixed immediately, rather than "we looked into it and it was hard". The second most popular reason is "a workaround exists", and after that probably something like "looks easy but isn't".
I think the solution would be to stop considering "easy-but-isn't" bugs as easy, even if they might appear so. The "easy bugs" team would have its worklog, and if they discover one of those bugs wasn't actually easy and would need large changes, they reject it, push it somewhere else, and start working on something that actually is easy instead.
Like, if you had millions of images you needed to rotate on a server in a batch job, then OK.
But if you're just rotating one photo, or even a hundred, that you've just taken, it's plenty fast enough.
You're a new coder and would like to help a project, ideally a big one for your resume? Here's something to get you started.
This is only true for cross-origin images, no? Which is expected: you can't access data loaded from another origin unless it's been loaded with CORS.
JPEG rotation only has to be lossy when the image is not evenly divisible into macroblocks. Rather than transcoding, you just rotate the macroblocks themselves and rearrange where they're placed.
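That's essentially what jpegtran does: it shuffles DCT blocks instead of re-encoding them. A minimal sketch wrapping the jpegtran CLI from Python (file names hypothetical); -perfect makes it refuse the transform rather than silently trim or recompress edge blocks when the dimensions aren't a multiple of the MCU size:

    import subprocess

    def rotate_lossless(src: str, dst: str, degrees: int = 90) -> None:
        # jpegtran rearranges the compressed blocks instead of re-encoding;
        # with -perfect it exits non-zero if the rotation can't be done
        # losslessly, i.e. exactly the "not divisible into blocks" case.
        with open(dst, "wb") as out:
            subprocess.run(
                ["jpegtran", "-copy", "all", "-perfect",
                 "-rotate", str(degrees), src],
                stdout=out, check=True,
            )

    rotate_lossless("photo.jpg", "photo_rot90.jpg")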
Not really. It's hard to see the difference from the outside without actually digging into it first, but in my experience while there's plenty of "easy" bugs that aren't actually easy, there's also plenty of easy bugs that are actually easy and that apparently everyone else assumed they're not, or else they would have been fixed already :P
You cannot just push high prio stuff on people.
The business gets its predictable workload done; bonus stuff, like things the team wants to fix, of course takes a back seat, but it has its place.
This is a perfect expression of something I like to call Chesterton's Inertia. It's exactly the same as Chesterton's Fence.
What happens is that there is a mess on the floor, and somebody walks around it, maybe just because they were in a hurry, or maybe they didn't even see it. Somebody else walks in, maybe doesn't even notice the mess, just notices the faint trail that the last person left, and follows it. The next person walks in, sees a mess, and sees a trod path around the mess, and follows the path.
Years later, the path has been paved, has signage posted that doesn't refer to the mess, and has walls blocking the mess from sight. The mess has fused with the ground it used to just sit on, and is partially being used to support the wall. Every once in a while, someone asks why there's a weird bend in the path that makes no sense, and an old hand who's been around since the beginning tells them that the bend is a fence, and not for them to understand.
Makes sense. I have to imagine there is a performance impact to waiting until you've downloaded the entire image _just in case_ there's some metadata telling you to rotate it right at the end of the stream.
So yeah, I think "Stripping all EXIF metadata doesn't change an image" deserves an entry as a "falsehood programmers believe about...".
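A quick way to see it, as a minimal sketch with Pillow (file name hypothetical): the stored pixels don't change when you drop the tag, but what a compliant viewer shows does.

    from PIL import Image, ImageOps

    ORIENTATION = 274  # EXIF Orientation tag number

    img = Image.open("photo.jpg")
    print("EXIF orientation:", img.getexif().get(ORIENTATION, 1))

    # What a viewer that honors the tag would actually display:
    displayed = ImageOps.exif_transpose(img)
    print("stored size:", img.size, "displayed size:", displayed.size)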
It's a shame that after so many years of development we ended up with such horrible formats as JPEG and MP4.
Or tragic, but I'd expect to see more drama than joy with this approach. The main thing with bugfixing is that it can affect a whole lot of other areas, or introduce completely new bugs. So then both teams end up fighting over changes...
Now, a really trivial bug with no side effects, sure thing, no issue, but as a sibling commenter has said, the really trivial bugs are usually fixed already. And quick fixes of seemingly trivial things can cause a world of pain for someone else.
In other words, I think project management and prioritising things remain hard, with no magic-bullet solutions available. (But I would also prefer a stronger emphasis on quality control in general vs. new features.)
It’s the same with rotation. Both are essential information on how to interpret the pixel data for display, but we’re so very used to assuming certain defaults that it’s easy to forget about this.
And it has its own forums with tens of thousands of posts!
"It is recommended that unless a decoder has independent knowledge of the validity of the Exif data, the data should be considered to be of historical value only."
Instead of either saying "yes, you must rotate it" or "no, you shall not rotate it", so that everyone does the same thing. And if the answer were yes, they should also have made this a mandatory chunk, since as it stands they made it optional to read.
Thankfully Finder in macOS has a way to remove the flag:
How to remove orientation from portrait photo from iPhone on macOS https://youtu.be/lWOlfjVyes4
I couldn’t find a way to do it in Preview, but Finder could do it.
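If you'd rather script it than click through Finder, a minimal sketch with Pillow (file names hypothetical): bake the rotation into the pixels and save without the Orientation flag, so every viewer agrees on which way is up. Note this re-encodes the JPEG, unlike the lossless block-shuffling route mentioned elsewhere in the thread.

    from PIL import Image, ImageOps

    img = Image.open("portrait.jpg")
    upright = ImageOps.exif_transpose(img)  # applies and removes tag 274
    upright.save("portrait_upright.jpg", quality=95)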
Working on two or more big features at the same time is not possible. But throw in some pebbles, and a dev can take them on.
Cameras should have just rotated the actual image pixels when saving, instead of cheating. If that's too slow, implement it in hardware, or schedule a deferred process and don't let the images be exported until that's done.
No, it was an elegant hack given all the constraints which mostly no longer exist on modern hardware (although I wouldn't be so sure about really small embedded systems).
Sure, modern cameras will have no issues loading the full JPEG into memory, but how would you have implemented this in cameras that only had enough memory for exactly one line's worth of compression blocks?
> or schedule a deferred process and don't let the images be exported until that's done.
Good luck doing this on a battery-powered camera writing directly to an SD card that's expected to be mountable as soon as it's removed from the camera, without an opaque postprocessing step.
Whether or not you fix a bug weighs on the scale against the cost of all of the above things, the cost of time, the cost of these people's attention, and the opportunity cost of them doing something else. And these costs tend to not scale with the size of the pull request. They're fixed costs that have to be paid no matter how small an issue is.
I work at a BigCo, and occasionally get comments from developer friends about "Hey, why doesn't BigCo fix this obvious bug I reported! It's simple! Why are you guys so incompetent??" I look at the bug internally, and it's either 1. got a huge internal comment chain showing it's not as simple as an outsider would think, or 2. indeed trivial, but the benefit of fixing it does not outweigh the costs I outlined above.
https://issues.chromium.org/issues/40448628
When it got fixed, some sites were still depending on the old behavior of not rotating JPEGs, and had to add "image-orientation:none" to explicitly ignore EXIF:
I should probably just give up and let it all be a mess. Not sure I'll be able to though. The only thing that freed me from metadata obsession when it came to my music collection is that I switched to streaming services.
But in general, I do believe that teams should be split on the priority issue in some way. If all you are doing is chasing the highest priority stuff, you're going to miss important things because priority isn't an exact science either.
Would love to see a good rundown of when you should rely on the different approaches. Another thread pointed out that you should also use the color space metadata.
IIRC in other cases you have to cut the edge of the image off (rounded to the nearest block) or recompress those edge blocks
What if I want to rotate an image by 90 degrees because my camera didn't correctly detect up & down?
To my understanding rotation is lossless, whereas moving the data will incur quality loss (with certain exceptions).
Other cameras and phones and apps produce images where the device adjusts the aspect ratio and order of the array of pixels in the image regardless of the way the sensor was pointed, such that the EXIF orientation is always the default 0-degree rotation. I'd argue that this is simpler, it's the way that people ignorant of the existence of the metadata method would expect the system to work. That method always works on any device or browser, rotating with EXIF only works if your whole pipeline is aware of that method.
I think this is what you meant by "some systems" there. But I would expect that of every sensor system? I legit never would have considered that they would try the transpose on saving the image off the sensor.
I've yet to find a business that really, truly knows what it wants. Whatever is "good for the business case" today could change overnight after the President reads some cockamamie article in Harvard Business Review, and again in two weeks after the CEO spends a weekend in Jackson Hole.
It's true that automatic handling of all input images is difficult, but imo it's important to document.
An example I recently encountered is that in neurological imaging, the axes are the patient's right, anterior, superior, whereas in radiology they are the patient's left, anterior, superior. Tricky to get right...
http://www.grahamwideman.com/gw/brain/orientation/orientterm...
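If the two conventions really are mirror images of each other along the left-right axis, the conversion itself is trivial; the hard part is knowing which convention a given volume is in. A toy sketch with numpy, assuming the first array axis runs along the patient's left-right direction:

    import numpy as np

    ras_volume = np.zeros((256, 256, 180), dtype=np.float32)  # hypothetical scan
    las_volume = ras_volume[::-1, :, :]  # flip the left-right axis: RAS -> LAS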
I suppose it's no coincidence that the native output format of many sensors (or ISPs, to be precise) is divisible by 16 in both width and height.
Oh, this seems to be more or less completely ignored when selecting subtitles, though some players will at least list "English (forced)" or "English (default)" &c. when selecting a subtitle. Quite a pain with dubbed foreign films when the subtitles are used for translating on-screen text; you really want the forced subtitle in that case!
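If you want to see what a player actually has to work with, here's a minimal sketch wrapping ffprobe from Python (file name hypothetical) that lists each subtitle stream's language along with its default/forced disposition flags:

    import json, subprocess

    def subtitle_tracks(path):
        # Ask ffprobe for the subtitle streams as JSON and pull out the
        # language tag and the default/forced disposition flags.
        out = subprocess.run(
            ["ffprobe", "-v", "error", "-select_streams", "s",
             "-show_streams", "-print_format", "json", path],
            capture_output=True, text=True, check=True,
        ).stdout
        for s in json.loads(out).get("streams", []):
            yield (s["index"],
                   s.get("tags", {}).get("language", "und"),
                   bool(s["disposition"]["default"]),
                   bool(s["disposition"]["forced"]))

    for track in subtitle_tracks("movie.mkv"):
        print(track)  # e.g. (2, 'eng', False, True) for a forced English track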