
Gemini 2.5 Flash Image

(developers.googleblog.com)
1092 points | meetpateltech
Unfortunately, it suffers from the same safetyism than other many releases. Half of the prompts get rejected. How can you have character consistency if the model is forbidden from editing any human. And most of my photo editing involves humans, so basically this is just a useless product. I get that Google doesn't want to be responsible for deep fake advances, but that seems inevitable, so this is just slightly delaying progress. Eventually we will have to face it and allow for society to adapt.

This trend of tools that point a finger at you and set guardrails is quite frustrating. We might need a new OSS movement to regain our freedom.

replies(4): >>45032782 #>>45032955 #>>45034758 #>>45035312 #
Workaccount2 ◴[] No.45032955[source]
I have an old photo of my girlfriend with her cousin when they were young, wearing Christmas dresses in front of the tree, taken not long before they were separated to opposite sides of the world, where they have now been for decades. The photo is low quality to begin with, and the print itself is physically beat up.

So far no model is willing to clean it up :/
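(For reference, a restoration request like this against the model in the headline would look roughly like the sketch below, using the google-genai Python SDK. This is illustrative only, not from the thread: the model id "gemini-2.5-flash-image-preview", the file paths, and the prompt wording are assumptions, and refusals may surface differently depending on the API version.)

```python
# Sketch only: model id, file names, and prompt are illustrative.
from io import BytesIO

from google import genai
from PIL import Image

client = genai.Client()  # expects GEMINI_API_KEY in the environment

old_photo = Image.open("christmas_photo.jpg")  # placeholder path

response = client.models.generate_content(
    model="gemini-2.5-flash-image-preview",  # assumed image-edit model id
    contents=[
        "Restore this damaged photo: remove the scratches and creases, "
        "reduce the noise, and keep both people exactly as they are.",
        old_photo,
    ],
)

# An edit comes back as an inline image part; a refusal typically comes
# back as text (or a blocked candidate) instead.
for part in response.candidates[0].content.parts:
    if part.inline_data is not None:
        Image.open(BytesIO(part.inline_data.data)).save("restored.png")
    elif part.text:
        print("Model replied with text instead of an image:", part.text)
```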

replies(4): >>45033699 #>>45035531 #>>45036718 #>>45041194 #
boppo1 ◴[] No.45041194[source]
If you are not personally offended by looking at CRAZY pornography, you could start digging into the ComfyUI ecosystem. It's not all porn; there are lots of professional photo manipulators doing SFW stuff, but the community overlap with NSFW is basically borderless, so you'll probably bump into it.

However, the results the ComfyUI people get are light-years ahead of any one-shot prompt model. You can either find someone to do the cleanup for you (it should be trivial; I wouldn't pay more than $10-15) or, if you have good hardware for inference, learn to do it yourself.
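(For anyone going the ComfyUI route: once a restoration graph is built in the UI, it can be exported with "Save (API Format)" and queued programmatically. A minimal sketch follows, assuming a default local server on port 8188; the workflow file name is a placeholder.)

```python
# Sketch only: the workflow file name is a placeholder; export your own
# graph from ComfyUI with "Save (API Format)".
import json
import uuid

import requests

COMFYUI_URL = "http://127.0.0.1:8188"  # ComfyUI's default local address

with open("restore_photo_api.json", "r", encoding="utf-8") as f:
    workflow = json.load(f)

# Queue the graph; ComfyUI runs it and writes results to its output/ dir.
payload = {"prompt": workflow, "client_id": str(uuid.uuid4())}
resp = requests.post(f"{COMFYUI_URL}/prompt", json=payload)
resp.raise_for_status()
print("Queued prompt:", resp.json().get("prompt_id"))
```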