
42 points skadamat | 5 comments
jus3sixty ◴[] No.45187107[source]
It's surprising to see the iPhone 17 Pro Max specs compared to those of the Galaxy S25 Ultra.

Comparisons show the S25 Ultra leading in several areas, especially the cameras. The difference in megapixel count is significant.

For years, Apple's flagships were considered superior, but Samsung appears to be pushing boundaries with the S25 Ultra.

Is Apple truly behind, or does their optimization and ecosystem integration make up for it?

replies(1): >>45187191 #
1. ghusto ◴[] No.45187191[source]
Their cameras have been terrible for a long time (comparatively speaking). I switched from a Pixel 6a to an iPhone 16 and was shocked at how bad my pictures are now. I get the feeling that iPhone users just don't know any better because they've always had an iPhone.
replies(1): >>45187494 #
2. DiggyJohnson ◴[] No.45187494[source]
Seems like it just has to do with what you're expecting from a smartphone camera. I feel like the Google Pixel is doing even more photo-editing magic than the iPhone.
replies(1): >>45188718 #
3. hbn ◴[] No.45188718[source]
They're straight up using generative AI on zoomed photos now to turn details into AI slop artifacts.

https://whatever.scalzi.com/2025/08/29/pictures-not-photos-w...

replies(1): >>45189889 #
4. dmayle ◴[] No.45189889{3}[source]
Which is literally what Apple announced in this video:

"and the 2x telephoto has an updated photonic engine, which now uses machine learning to capture the lifelike details of her hair and the vibrant color of her jacket"

"like the 2x telephoto, the 8x also utilizes the updated photonic engine, which integrates machine learning into even more parts of the image pipeline. we apply deep learning models for demosaicing"

replies(1): >>45192638 #
5. hbn ◴[] No.45192638{4}[source]
They've been using that terminology for like a decade. They take multiple photos and use ML to figure out how to layer them together into a final image where everything is adequately exposed, and apply denoising. Google has done the same thing on Pixels since they've existed.
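
A minimal sketch of that burst-stacking idea (flat toy scene and made-up noise level; alignment, ghost rejection, and tone mapping omitted): averaging N aligned exposures cuts noise by roughly 1/sqrt(N) without inventing any pixel content, which is the key difference from generative upscaling.

    # Toy burst merge: average several aligned noisy exposures of the same scene.
    # Real pipelines (Deep Fusion, HDR+) also align frames, reject motion, and
    # tone-map, and may use ML to pick per-pixel merge weights, but the core
    # effect is this: noise drops with the square root of the frame count.
    import numpy as np

    def merge_burst(frames):
        """frames: (N, H, W) stack of aligned exposures. Returns merged (H, W)."""
        return frames.mean(axis=0)

    rng = np.random.default_rng(0)
    scene = np.full((64, 64), 0.5)                        # flat gray "scene"
    burst = scene + rng.normal(0, 0.1, size=(8, 64, 64))  # 8 noisy shots of it

    print(f"single frame noise: {burst[0].std():.3f}")              # ~0.100
    print(f"merged noise:       {merge_burst(burst).std():.3f}")    # ~0.035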

That's very different from taking that final photo and then running it through generative AI to guess what objects are. Look at the images in that article. It made the stop sign into a perfectly circular shiny button. I've never seen artifacting like that on a photo before.