
I'm absolutely right

(absolutelyright.lol)
648 points | yoavfr | 1 comment
pflenker No.45138538
Gemini tells me "you've hit a common frustration/issue/topic/..." so often that it is actively pushing me away from using it. Either I ask a stupid question and it pretends - probably to spare my feelings - that everyone has the same problem, or I feel smart for asking a super-duper edge-case question no one has probably ever asked before, and it tells me everyone is wondering the same thing. Either way I feel stupid.
replies(2): >>45138612, >>45139336
blinding-streak No.45138612
I don't think that's Gemini's problem necessarily. You shouldn't be so insecure.
replies(3): >>45138891, >>45139028, >>45141070
pflenker No.45138891
Not only is that a weird presumption on your end about my ostensible insecurities, it's also weird that the state of my own mental resilience should play any role at all in interacting with a tool.

If all other things are equal and one LLM is consistently, vaguely annoying, for whatever reason, and the other isn't, I choose the other one.

Leaving myself aside, LLMs are broadly available and aggressively pushed onto everyone for day-to-day use, including vulnerable and insecure groups. These groups should not have to adapt to the tool; the tool should adapt to its users.

replies(1): >>45139178
zahlman No.45139178
> Not only is that a weird presumption about my ostensible insecurities on your end

I'm not GP, but I agree that it isn't universal, nor especially healthy or productive, to respond the way you describe to being told that your issue is common. It would make sense if you could, e.g., hear the insincerity in a person's tone of voice, but Gemini outputs text, and the concept of sincerity is irrelevant to a computer program.

Focusing on the informational content seems to me like a good idea, so as to avoid https://en.wikipedia.org/wiki/ELIZA_effect.

> it's also weird that the state of my own mental resilience should play any role at all when interacting with a tool.

When I was a university student, my own mental resilience was absolutely instrumental to deciphering gcc error messages.

> LLMs are broadly available and strongly forced onto everyone for day-to-day use

They say this kind of thing about cars and smartphones, too. Somehow I endure.

replies(2): >>45139322, >>45141081