
399 points nomdep | 3 comments
socalgal2 ◴[] No.44296080[source]
> Another common argument I've heard is that Generative AI is helpful when you need to write code in a language or technology you are not familiar with. To me this also makes little sense.

I'm not sure I get this one. When I'm learning new tech I almost always have questions. I used to google them. If I couldn't find an answer I might try posting on Stack Overflow. Sometimes, as I was typing the question, their search would finally kick in and find the answer (similar questions). Other times I'd post the question and, if it didn't get closed, maybe get an answer a few hours or days later.

Now I just ask ChatGPT or Gemini and more often than not it gives me the answer. That alone and nothing else (agent modes, AI editing or generating files) is enough to increase my output. I get answers 10x faster than I used to. I'm not sure what that has to do with the point about learning. Getting answers to those questions is learning, regardless of where the answer comes from.

replies(13): >>44296120 #>>44296159 #>>44296324 #>>44296351 #>>44296416 #>>44296810 #>>44296818 #>>44297019 #>>44297098 #>>44298720 #>>44299945 #>>44300631 #>>44301438 #
socalgal2 ◴[] No.44296120[source]
To add, another experience I had. I was using an API I'm not that familiar with. My program was crashing, and looking at the stack trace I didn't see why. Maybe if I had many months of experience with this API it would have been obvious, but it certainly wasn't to me. For fun I copy-pasted the stack trace into Gemini: ~60 frames' worth of C++. It immediately pointed out the likely cause given the API I was using. Once I had that clue from the AI, I fixed the bug with a 2-line change. That seems pretty useful to me. I'm not sure how long it would have taken me to find it otherwise since, as I said, I'm not that familiar with that API.
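Part of that workflow can be done mechanically before involving a search box or an LLM at all: filtering a long backtrace down to the frames that mention your own source tree usually isolates the call site worth staring at. A minimal sketch, with a hypothetical gdb-style trace and a made-up `src/myapp/` path marker:

```python
# Keep only the frames of a gdb-style backtrace that mention our own
# source tree, so a 60-frame C++ trace shrinks to the few lines worth
# reading (or pasting into a search box or an LLM prompt).
def our_frames(backtrace: str, marker: str = "src/myapp/") -> list[str]:
    return [line for line in backtrace.splitlines()
            if line.startswith("#") and marker in line]

# Hypothetical trace: library internals above, our code at the bottom.
trace = """\
#0  0x00007f... in abort () from /lib/libc.so.6
#1  0x00005a... in api::Buffer::commit() at third_party/api/buffer.cc:88
#2  0x00005a... in render_frame() at src/myapp/renderer.cc:142
#3  0x00005a... in main () at src/myapp/main.cc:31
"""

for frame in our_frames(trace):
    print(frame)
```

This only narrows the haystack; the LLM's advantage in the anecdote was explaining *why* the library frame crashed, which a filter can't do.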
replies(2): >>44296616 #>>44329586 #
nottorp ◴[] No.44296616[source]
You remember when Google used to do the same thing for you way before "AI"?

Okay, maybe sometimes the post about the stack trace was in Chinese, but a plain search used to be capable of giving the same answer as a LLM.

It's not that LLMs are better, it's that search got enshittified.

replies(7): >>44296642 #>>44296691 #>>44296762 #>>44296763 #>>44297244 #>>44297283 #>>44298113 #
nsonha ◴[] No.44297244[source]
> when Google used to do the same thing for you way before "AI"?

Which is never? Do you often just lie to win arguments? An LLM gives you a synthesized answer; a search engine only returns what already exists. By definition it cannot give you anything that is not an obvious match.

replies(1): >>44297384 #
nottorp ◴[] No.44297384[source]
> Which is never?

In my experience it was "a lot", because my stack traces were mostly from hardware-related problems on ARM Linux in that period.

But I suppose your stack traces were much different and superior and no one can have stack traces that are different from yours. The world is composed of just you and your project.

> Do you often just lie to win arguments?

I do not enjoy being accused of lying by someone stuck in their own bubble.

When you said "Which is never" did you lie consciously or subconsciously btw?

replies(2): >>44299629 #>>44305949 #
SpaceNugget ◴[] No.44299629[source]
According to a quick search on Google, which is not very useful these days, the maximum query length is either 32 words or about 2000 characters, depending on which answer you trust.

Whatever it is specifically, the idea that you could paste a 600-line stack trace unmodified into Google, especially "way before AI", and get pointed to the relevant bit for your exact problem is obviously untrue.

replies(1): >>44307490 #
nottorp ◴[] No.44307490[source]
"is" not "was" though.

Pasting stack traces and kernel oopses hasn't worked in quite a while, I think. It's very possible that the maximum query length was longer in the past.

2000 characters is also more than a double-spaced manuscript page as defined by the book industry (which seems to be about 1,500 characters). You can fit the top of a stack trace in there. And if you're dealing with talking to hardware, the top can be enough.
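"Fitting the top of a stack trace" under those limits is a mechanical clip. A quick sketch, assuming the 32-word and 2000-character caps mentioned above are roughly right (the real limits vary by engine and era):

```python
# Trim a stack trace so it fits within a search engine's query limits.
# The 32-word / 2000-character caps are the figures from this thread,
# not verified numbers from Google.
MAX_WORDS = 32
MAX_CHARS = 2000

def clip_for_search(trace: str) -> str:
    words = trace.split()[:MAX_WORDS]   # keep the top of the trace
    query = " ".join(words)
    return query[:MAX_CHARS]            # enforce the character cap too

# A long, repetitive trace stands in for a real one here.
trace = "Exception in thread main example.NullPointerException at frame " * 40
print(clip_for_search(trace))
```

The word cap bites first for normal traces; the character cap only matters for traces full of very long mangled symbol names.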

replies(1): >>44310651 #
SpaceNugget ◴[] No.44310651[source]
I'm really struggling to see why you would assume that Google decreases the maximum query length over time, when that's generally the opposite of how things develop.

And indeed, in the early days the maximum query length was 10 words. So no, you have never been able to paste an entire stack trace into google and magically get a concise summary.

If you are changing the original claim you were responding to into "I can do my job without LLMs if I have Google search", then sure, of course anyone can. But you can't use that to dismiss the fact that some people find it quite convenient to dump an entire stack trace into a text chat and get a decent summary of what matters without having to read any of it.