kookamamie ◴[] No.43609045[source]
DSLRs dropped off the wagon a long time ago when it comes to software, and especially meaningful UX innovation.

As an anecdote, I have a Sony a7r and operating it via its mobile app is one of the worst user experiences I have had in a while.

Same goes for the surrounding software ecosystem. E.g. Adobe's Lightroom is full of obsolete paradigms and weird usability choices.

wongarsu ◴[] No.43609221[source]
Most hardware companies are just terrible at software in general. Camera makers are pretty average in that regard.

Usability of the camera hardware and software ecosystem is another matter. I think the common wisdom is that most paying users don't want beginner-friendly, they want powerful and familiar. So everything emulates the paradigms of what came before. DSLRs try to provide an interface that would be familiar to someone used to a 50 year old SLR camera, and Lightroom tries to emulate a physical darkroom. Being somewhat hostile to the uninitiated might even be seen as a feature.

kookamamie ◴[] No.43609691[source]
Yes, fully agreed. However, the way the companies currently approach this, catering to an ever-shrinking niche, will end up killing DSLRs over time. They just don't offer enough over phones, and the UX/SW being so crappy alienates the potential new userbase completely.
throwanem ◴[] No.43610096[source]
> They just don't offer enough over phones

You can achieve maybe a quarter of the kinds of shots on a phone that an interchangeable-lens camera will let you make.

That's an extremely important quarter! For most people it covers everything they ever want a camera to do. But if you want to get into the other 75%, you're never going to be able to do it under the enormous constraints imposed by a phone camera's strict optical limits, which arise from the tight physical envelope that camera has to fit into.
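
To put rough numbers on those optical limits, here is a back-of-the-envelope sketch; the sensor dimensions are nominal, typical values assumed for illustration, not the specs of any particular model in this thread:

    # Rough light-gathering comparison: full-frame ILC vs. a large phone sensor.
    # Sensor dimensions are nominal/typical values (assumptions), not measured
    # from any specific camera discussed here.

    full_frame_mm2 = 36.0 * 24.0        # ~864 mm^2
    phone_1_1p33in_mm2 = 9.6 * 7.2      # ~69 mm^2, a common "1/1.33-inch" class sensor

    area_ratio = full_frame_mm2 / phone_1_1p33in_mm2
    print(f"Full frame collects roughly {area_ratio:.0f}x the light for the same framing and f-number")
    # Prints roughly 12x; against older or smaller phone sensors the gap is far larger.

That order-of-magnitude gap in raw light is the part no amount of app polish can paper over.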

genewitch ◴[] No.43613220[source]
I had two phones with 108MP sensors, and while you can zoom in on the resulting image, the details are suggestions rather than what I would consider pixels.

Whereas a $1500 Nikon 15MP from 20 years ago is real crisp, and I can put a 300mm lens on it if I want to "zoom in".

Even my old Nikon 1 V1 with its cropped-sensor 12MP takes "better pictures" than the two 108MP phone cameras.

But there are uses for the pixel density, and I enjoyed having 108MP for certain shots, though otherwise I didn't use that mode in general.
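
A quick pixel-pitch comparison shows why those 108MP read like suggestions; this is a sketch using nominal sensor sizes and pixel counts (assumptions in the ballpark of the cameras mentioned, not exact specs):

    # Approximate photosite pitch = sensor width / horizontal pixel count.
    # The dimensions and pixel counts are nominal assumptions, not exact specs.

    def pitch_um(sensor_width_mm: float, horizontal_pixels: int) -> float:
        return sensor_width_mm * 1000.0 / horizontal_pixels

    phone_108mp = pitch_um(9.6, 12000)   # ~0.8 um photosites on a 1/1.33" sensor
    dslr_15mp = pitch_um(23.6, 4800)     # ~4.9 um photosites on an APS-C sensor

    print(f"108MP phone: {phone_108mp:.2f} um per photosite")
    print(f"15MP APS-C:  {dslr_15mp:.2f} um per photosite")
    print(f"Each APS-C photosite has ~{(dslr_15mp / phone_108mp) ** 2:.0f}x the light-gathering area")

That per-photosite gap is also part of why such phones usually bin down to roughly 12MP output by default.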

throwanem ◴[] No.43613657[source]
Yeah, that's the exact tradeoff. 108MP (or even whatever the real photosite count is that they're shift-capturing or otherwise trick-shooting to get that number) on a sensor that small is genuinely revolutionary. But when that sensor is only given as much light to work with as a matchhead-sized lens can capture for it, there's no way to avoid relying very heavily on the ISP to yield an intelligible image. Again, that does an incredible job for what little it's given to work with - but doing so requires it to be what we could fairly call "inventive," with the result that at anywhere near 100% zoom, "suggestions" are exactly what you're seeing. The detail is as much computational as "real."
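
For a sense of how little real signal the ISP has per pixel, here is a toy shot-noise sketch; the photon counts are illustrative assumptions, and the only physics used is that photon shot noise is Poisson, so SNR scales with the square root of the photons collected:

    import math

    # Photon shot noise is Poisson: SNR ~ sqrt(photons collected per photosite).
    # The counts below are illustrative assumptions scaled by photosite area,
    # not data from any real sensor.

    photons_large_site = 20000                    # e.g. a ~5 um photosite in decent light
    area_ratio = (0.8 / 5.0) ** 2                 # ~0.8 um photosite vs ~5 um photosite
    photons_tiny_site = photons_large_site * area_ratio

    print(f"large photosite SNR ~ {math.sqrt(photons_large_site):.0f}:1")
    print(f"tiny photosite  SNR ~ {math.sqrt(photons_tiny_site):.0f}:1")
    # With that little real signal per pixel, the pipeline leans on binning,
    # denoising, and multi-frame merging (computed detail) to fill the gap.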

People make much of whatever Samsung it was a couple of years back that got caught copy-pasting a sharper image of Luna into that one shot everyone takes and then gets disappointed with the result because, unlike the real thing, our brain doesn't make the moon seem bigger in pictures. But they all do this and they have for years. I tried taking pictures of some Polistes exclamans wasps with my phone a couple of years back, in good bright lighting with a decent CRI (my kitchen, they were houseguests). Now if you image search that species name, you'll see these wasps are quite colorful, with complex markings in shades ranging from bright yellow through orange, "ferruginous" rust-red, and black.

In the light I had in the kitchen, I could see all these colors clearly with my eyes, through the glass of the heated terrarium that was serving as the wasps' temporary enclosure. (They'd shown a distinct propensity for the HVAC registers, and while I find their company congenial, having a dozen fertile females exploring the ductwork might have been a bit much even for me...) But as far as I could get the cameras on this iPhone 13 mini to report, from as close as their shitty minimum working distance allows, these wasps were all solid yellow from the flat of their heart-shaped faces to the tip of their pointy butts. No matter what I did, even pulling a shot into Photoshop to sample pixels and experimentally oversaturate, I couldn't squeeze more than a hint of red out of anything without resorting to hue adjustments, i.e. there is no red there to find.
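
That "sample pixels and oversaturate" check can be approximated in a few lines of Python with Pillow. A sketch of the idea, where the filename and the hue window for "red through rust" are placeholder assumptions:

    from PIL import Image

    # Check whether any genuinely red/rust-hued pixels exist in a shot,
    # independent of how saturated they are. Filename and hue thresholds
    # are placeholders.

    img = Image.open("wasp.jpg").convert("HSV")
    hue, sat, val = img.split()

    red_ish = 0
    total = 0
    for h, s in zip(hue.getdata(), sat.getdata()):
        total += 1
        # Pillow maps hue to 0-255; roughly 0-20 and 235-255 is the red end.
        if (h <= 20 or h >= 235) and s > 40:
            red_ish += 1

    print(f"{100.0 * red_ish / total:.2f}% of pixels are red-ish")
    # If this comes out near zero, no saturation boost will recover the rust
    # markings: the hue information simply isn't in the file.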

So all I can conclude is the frigging thing made up a wasp - oh, not in the computer vision, generative AI sense we would mean that now, or even in the Samsung sense that only works for the one subject anyway, but in the sense that even in the most favorable of real-world conditions, it's working from such a total approximation of the actual scene that, unless that scene corresponds closely enough to what the ISP's pipeline was "trained on" by the engineers who design phones' imaging subsystems, the poor hapless thing really can't help but screw it up.

This is why people who complain about discrete cameras' lack of brains are wrongheaded to do so. I see how they get there, but there are some aspects of physics that really can't be replaced by computation, including basically all the ones that matter, and the physical, optical singlemindedness of the discrete camera's sole design focus is what liberates it to excel in that realm. Just as with humans, all cramming a phone in there will do is give the poor thing anxiety.

genewitch ◴[] No.43614204[source]
I generally judge a camera by how accurately it can capture a sunset, relative to what I actually see. On a Samsung Galaxy Note 20 I can mess with the white balance a bit to get it "pretty close", but it tends to clamp color values so the colors are more uniform than they are in real life. I've seen orange dreamsicle, strawberry sherbet, and lavender at the same time, at different intensities, in the same section of sky. No phone camera seems to be able to capture that. http://projectftm.com/#noo2qor_GgyU1ofgr0B4jA was captured last month; it wasn't so "pastel", it was much richer. The lightening at the "horizon" is also common with phone cameras, and has been since the iPhone 4 and the Nexus series of phones. It looks awful and I don't get why people put up with it.
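
One plausible mechanism for that clamping, sketched as a toy example; the sky samples and the white-balance gains below are made-up numbers for illustration, not data pulled from any phone:

    import numpy as np

    # Toy illustration: per-channel white-balance gain followed by 8-bit
    # clipping pulls distinct sunset colors toward the same values.
    # All numbers below are invented for illustration.

    sky_samples = np.array([
        [230, 140,  95],   # orange-dreamsicle-ish
        [250, 120, 135],   # strawberry-sherbet-ish
        [205, 150, 215],   # lavender-ish
    ], dtype=float)

    wb_gain = np.array([1.25, 1.05, 0.90])              # hypothetical warm white balance
    processed = np.clip(sky_samples * wb_gain, 0, 255).astype(int)

    print(processed)
    # All three red channels clip to 255, so the once-distinct hues converge
    # into a flatter, more uniform wash than what the eye saw.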
throwanem ◴[] No.43623300[source]
I think we see, or more properly perceive, though only weakly, some higher-order color harmonics that cameras don't capture and displays don't (intentionally) reproduce, and I think the pinky-magenta-purplish region of the gamut might be the easiest place to notice the difference.

I think people mostly put up with it because on the one hand it doesn't matter all that often (sunset is a classic worst-case test for imaging systems!) and, on the other, well, "who are you going to believe? Fifty zillion person-centuries of image engineering and more billions of phones than there are living humans, or your own lyin' eyes?"

genewitch ◴[] No.43626296[source]
I've wanted a de-Bayered sensor camera for a decade and a half, but I'm not willing to pay RED or ARRI prices for a real monochrome cine camera. I had a Huawei Honor 8 that had a real-honest-to-goodness monochrome sensor on it. The phone used it for focusing, but one could take images straight from that sensor. It was around the time the Asus ZenFone was using IR lasers to do focusing, and other phones had other depth sensors.
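
For anyone wondering what the appeal of a de-Bayered sensor is: under a Bayer filter each photosite records a single color channel and the missing channels are interpolated from neighbors, while a filterless monochrome sensor records a real luminance measurement at every site. A toy sketch of that interpolation (the standard textbook idea, not any specific camera's pipeline):

    import numpy as np

    # 4x4 RGGB Bayer mosaic: green is sampled at only half the sites; green at
    # red/blue sites has to be estimated from neighbors. A monochrome sensor
    # would have a true measurement at every site instead.

    rng = np.random.default_rng(0)
    scene_luma = rng.integers(0, 256, size=(4, 4))       # stand-in for "true" detail

    green_mask = np.zeros((4, 4), dtype=bool)
    green_mask[0::2, 1::2] = True
    green_mask[1::2, 0::2] = True

    est = scene_luma.astype(float)
    for y in range(4):
        for x in range(4):
            if not green_mask[y, x]:
                nbrs = [scene_luma[ny, nx]
                        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                        if 0 <= ny < 4 and 0 <= nx < 4 and green_mask[ny, nx]]
                est[y, x] = sum(nbrs) / len(nbrs)

    print("interpolation error at non-green sites:")
    print(np.abs(est - scene_luma) * ~green_mask)
    # A monochrome sensor has zero interpolation error by construction, which
    # is a big part of why its per-pixel sharpness looks so much better.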

I still have to focus manually (by tapping the screen where I want it to focus), but on newer phones the focus tries to "track" what you touched, which is... why would they change that? I tilt the phone down to interact with it; I know where in the frame I want it to focus, because before I tilted the phone down I was looking at the frame! Rule of thirds: I can reframe the image to put the focus exactly in one of the areas it ought to be, zoom in or out, whatever. But no, apparently it has been decided that I want the focus to wander around as it sees fit.

I just unplugged the Honor 8 to take a picture, and apparently the battery has gone kaput since the last time I used it. Sad day, indeed.