646 points bradgessler | 1 comment
1. socalgal2 No.44012205
I'm clearly not Dustin Curtis. For me, so far, LLMs let me check my assumptions in a way that is far more effective than before — which is to say, I rarely or never checked before. I'd have an opinion on a topic. I'd hold that opinion based on intuition/previous experience/voodoo. Someone might challenge it. I'd probably mostly shrug off their challenge. I'm not saying I'd dismiss it; I'd just not really look into it. Now I type something into ChatGPT/Gemini and it gives me back the pros and cons of the positions. It links to studies. Etc... I'm not saying I believe it point-blank, but at least it gives me much more than I had before.