
263 points by itzlambda | 1 comment
jameslk ◴[] No.44608826[source]
Maybe I’ve been missing some important stuff, but it seems the most relevant updates eventually just bubble up to the front page of HN or get mentioned in the comments.
replies(3): >>44608862 #>>44608867 #>>44609245 #
SoftTalker ◴[] No.44608867[source]
Trying to keep up is like jumping on a 90 mph treadmill. I have decided to opt out. I think AI (and currently LLMs) is more than a fad and not going away, but it's in a huge state of churn right now. I'm not investing a ton of time into anything until I have to. In another few years the landscape will hopefully be clearer. Or not, but at least I won't have spent a lot of time on stuff that has quickly become irrelevant.

I'm currently not using AI or LLMs in any of my day-to-day work.

replies(4): >>44609014 #>>44609019 #>>44609147 #>>44609741 #
HellDunkel ◴[] No.44609014[source]
This. When has early adoption paid off lately? Remember prompt engineering?
replies(2): >>44609294 #>>44610439 #
twelve40 ◴[] No.44610439{3}[source]
What do you mean, remember? It didn't go anywhere. I try to understand how to make this useful for my daily programming, and every credible-looking piece of advice begins with "tell the LLM to program in style ABC and avoid antipatterns like XYZ", sometimes pages and pages long. It seems like without this prompt sorcery you cannot produce good code with an LLM: it will make the same stupid mistakes over and over unless you try to pre-empt them with a carefully engineered upfront prompt. Aside from stupid "influencers" who bullshit that they produced a live commercial app from a one-liner English sentence, getting anything useful really requires a lot of prompt work, whatever you want to call it.
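
To make the "style ABC / antipatterns XYZ" point concrete, here is a minimal sketch of what that upfront prompt work tends to look like in code, assuming the OpenAI Python SDK; the model name and the specific style rules are illustrative placeholders, not a recommendation:

    # Minimal sketch of "prompt engineering" as a reusable upfront system prompt.
    # Assumes the OpenAI Python SDK (pip install openai) and an API key in the
    # OPENAI_API_KEY environment variable; model name and rules are illustrative.
    from openai import OpenAI

    client = OpenAI()

    # The "pages and pages" of guidance get front-loaded here once, so the model
    # is nudged away from repeating the same mistakes on every request.
    SYSTEM_PROMPT = """\
    You are a senior Python developer.
    Style rules:
    - Prefer small, pure functions; avoid global mutable state.
    - Use type hints and docstrings on every public function.
    Antipatterns to avoid:
    - Bare `except:` clauses.
    - Re-implementing functionality available in the standard library.
    """

    def ask_for_code(task: str) -> str:
        """Send one coding task with the reusable system prompt attached."""
        response = client.chat.completions.create(
            model="gpt-4o",  # illustrative; any chat-capable model works
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": task},
            ],
        )
        return response.choices[0].message.content

    print(ask_for_code("Write a function that merges two sorted lists."))

The point of the pattern is only that the guidance lives in one place and is sent with every request, rather than being retyped per task.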