
361 points Tomte | 6 comments
ChrisMarshallNY ◴[] No.43609745[source]
Raw decoding is not as simple as you might think.

It’s the best place to add “signature steps.” Things like noise reduction, chromatic aberration correction, and one-step HDR processing.

I used to work for a camera manufacturer, and our Raw decoder was an extremely intense pipeline step. It was treated as one of the biggest secrets in the company.

Third-party demosaicers could not exactly match ours, although they could get very good results.
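Those "signature steps" amount to mosaic-domain passes that run before demosaicing. A minimal, purely illustrative Python sketch of that ordering (the blur-based denoise and nearest-neighbour demosaic are toy stand-ins, not any manufacturer's algorithm):

```python
import numpy as np

def denoise(bayer):
    # Toy noise reduction: a light blur on the raw mosaic interior.
    # Real pipelines use sensor-specific models; this is illustrative only.
    out = bayer.copy()
    out[1:-1, 1:-1] = (bayer[1:-1, 1:-1] * 4 +
                       bayer[:-2, 1:-1] + bayer[2:, 1:-1] +
                       bayer[1:-1, :-2] + bayer[1:-1, 2:]) / 8.0
    return out

def demosaic_nearest(bayer):
    # Crude nearest-neighbour demosaic for an assumed RGGB mosaic.
    h, w = bayer.shape
    rgb = np.zeros((h, w, 3))
    rgb[..., 0] = bayer[0::2, 0::2].repeat(2, 0).repeat(2, 1)[:h, :w]  # R
    rgb[..., 1] = bayer[0::2, 1::2].repeat(2, 0).repeat(2, 1)[:h, :w]  # G
    rgb[..., 2] = bayer[1::2, 1::2].repeat(2, 0).repeat(2, 1)[:h, :w]  # B
    return rgb

def decode(bayer):
    # The "signature steps" run on the mosaic, before demosaicing.
    return demosaic_nearest(denoise(bayer))
```

The point of the ordering is that every filter before `demosaic_nearest` sees original sensor samples, not interpolated ones.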

replies(6): >>43609759 #>>43610604 #>>43611686 #>>43615373 #>>43615559 #>>43623272 #
koiueo ◴[] No.43609759[source]
Can you share which company you worked for?
replies(2): >>43609772 #>>43609884 #
ChrisMarshallNY ◴[] No.43609772[source]
Not publicly. It’s not difficult to figure out, but I make it a point not to post stuff that would show up in their search algorithms.

But it was a pretty major one, and I ran their host image pipeline software team.

[Edited to Add] It was one of the “no comment” companies. They won’t discuss their Raw format in detail, and neither will I, even though it has been many years since I left that company, and my knowledge is likely dated.

replies(2): >>43610379 #>>43616006 #
Zak ◴[] No.43610379[source]
> They won’t discuss their Raw format in detail

Can you share the reason for that?

It seems to me that long ago, camera companies thought they would charge money for their proprietary conversion software. It has been obvious for nearly as long that nobody is going to pay for it, and delayed compatibility with the software people actually want to use will only slow down sales of new models.

With that reasoning long-dead, is there some other competitive advantage they perceive to keeping details of the raw format secret?

replies(1): >>43612116 #
1. ChrisMarshallNY ◴[] No.43612116[source]
The main reason is that image quality is the core of the corporation's identity. They felt it was a competitive advantage, a sort of "secret ingredient," like you will hear about from master chefs.

They feel that their images have a "corporate fingerprint," and are always concerned that images that don't demonstrate it might get out.

This often made it difficult to get sample images.

Also, for things like chromatic aberration correction, you could add metadata that describes the lens that took the picture, and use that to inform the correction algorithm.

In many cases, a lens that displays chromatic aberration is an embarrassment. It's one of those "dirty little secrets" that camera manufacturers don't want to admit exists.

As they started producing cheaper lenses with less glass, they would get more ChrAb, and they didn't want people to see that.

Raw files are where you can compensate for that with the least impact on image quality. You can apply ChrAb correction after the demosaic, but it will be "lossy." If you can apply it before, you can minimize data loss. Same with noise reduction.

Many folks here would absolutely freak if they saw the complexity of our deBayer filter. It was a pretty massive bit of code.
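One way to picture pre-demosaic ChrAb correction is to model lateral chromatic aberration as a per-channel magnification error, then resample the red and blue sub-mosaics before interpolation ever mixes misregistered channels. A hedged sketch, assuming an RGGB layout and a pure radial-scale model (a deliberate simplification; real lens models are far richer):

```python
import numpy as np

def correct_lateral_ca(plane, scale):
    # Resample one colour plane by a radial scale factor about its centre.
    # Hypothetical simplification: lateral CA as pure magnification error.
    h, w = plane.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys = np.clip(((np.arange(h) - cy) * scale + cy).round().astype(int), 0, h - 1)
    xs = np.clip(((np.arange(w) - cx) * scale + cx).round().astype(int), 0, w - 1)
    return plane[np.ix_(ys, xs)]

def correct_bayer_rggb(bayer, red_scale, blue_scale):
    # Correct the R and B sub-mosaics in place on the raw data, so the
    # later demosaic interpolates already-registered channels.
    out = bayer.copy()
    out[0::2, 0::2] = correct_lateral_ca(bayer[0::2, 0::2], red_scale)
    out[1::2, 1::2] = correct_lateral_ca(bayer[1::2, 1::2], blue_scale)
    return out
```

The scale factors would come from the lens metadata the comment describes; doing this on the mosaic means the correction touches original samples rather than interpolated RGB, which is the "less lossy" ordering being argued for.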

replies(2): >>43612418 #>>43616115 #
2. Zak ◴[] No.43612418[source]
Thanks for the explanation. I have to question how reality-based that thinking is. I do not, of course, expect you to defend it.

It seems to me that nearly all photographers who are particularly concerned with image quality shoot raw and use third-party processing software. Perhaps that's a decision not rooted firmly in reality, but it would take a massive effort focused on software UX to get very many to switch to first-party software.

> Raw files are where you can compensate for that, with the least impact on image quality. You can have ChrAb correction, applied after the demosaic, but it will be "lossy."

Are you saying that they're baking chromatic aberration corrections into the raw files themselves so that third-party software can't detect it? I know the trend lately is to tolerate more software-correctable flaws in lenses because it allows for gains elsewhere (often sharpness or size, not just price), but I'm used to seeing those corrections as a step in the raw development pipeline which software can toggle.

replies(1): >>43612559 #
3. ChrisMarshallNY ◴[] No.43612559[source]
I think we're getting into the stuff that I don't want to elaborate on. They would probably get cranky that I've said what I've said, but that's pretty common knowledge.

If the third-party stuff has access to the raw Bayer format, they can do pretty much anything. They may not have the actual manufacturer data on lenses, but they may be able to do a lot.

Also, 50MP, lossless-compressed (or uncompressed) 16-bit-per-channel images tend to be big. It takes a lot to process them, especially if you have time constraints (like video). Remember that these devices have their own low-power processors, and they need to handle the data. If we wrote host software to provide matching processing, we needed to mimic what the device firmware did. You don't necessarily have that issue with third-party pipelines, as no one expects them to match.

replies(1): >>43616230 #
4. porphyra ◴[] No.43616115[source]
I am very skeptical that chromatic aberration correction can be applied before a demosaic and the result then stored in a Bayer array again. There seems to be no advantage in storing the result of chromatic aberration correction in a raw Bayer array, which has less information than a full array with three RGB values per pixel. Perhaps I am not understanding it correctly?
replies(1): >>43617522 #
5. Zak ◴[] No.43616230{3}[source]
Thanks for sharing what you could. I wasn't really thinking about video; the storage requirements to work with raw video are indeed big.
6. ChrisMarshallNY ◴[] No.43617522[source]
It's not stored. It's applied to the raw Bayer data, every time, before demosaicing. Same with noise reduction.

What you can store is metadata that informs these "first step" filters, like lens data and maybe other sensor readings.

One of the advantages of proprietary data storage is that you can have company-proprietary filters that produce a "signature" effect. Third-party filters may get close to it (and may actually get "better" results), but it won't be the same, and won't look like what you see in the viewfinder.
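The apply-on-every-decode idea can be sketched as a decoder that reads stored metadata and dispatches the matching mosaic-domain filters, never writing the corrected pixels back. All field names and filters below are invented for illustration:

```python
import numpy as np

def decode(bayer, metadata, filters, demosaic):
    # Each "first step" filter runs on the mosaic only when the file's
    # metadata carries its parameters; the raw data itself is untouched.
    mosaic = bayer
    for key, fn in filters.items():
        if key in metadata:
            mosaic = fn(mosaic, metadata[key])
    return demosaic(mosaic)

# Illustrative filters keyed by hypothetical metadata fields:
filters = {
    "black_level": lambda m, level: np.clip(m - level, 0, None),
    "gain": lambda m, g: m * g,
}

demosaic = lambda m: m  # stand-in; a real pipeline interpolates to RGB here
```

Because the filters are keyed off metadata rather than baked into the pixels, a first-party decoder can reproduce the "signature" on every open, while a third-party decoder reading the same mosaic gets its own, different rendering.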