
399 points nomdep | 1 comment | source
socalgal2 ◴[] No.44296080[source]
> Another common argument I've heard is that Generative AI is helpful when you need to write code in a language or technology you are not familiar with. To me this also makes little sense.

I'm not sure I get this one. When I'm learning new tech I almost always have questions. I used to google them. If I couldn't find an answer, I might try posting on Stack Overflow. Sometimes, as I was typing the question, their search would finally kick in and surface the answer (via similar questions). Other times I'd post the question and, if it didn't get closed, maybe I'd get an answer a few hours or days later.

Now I just ask ChatGPT or Gemini, and more often than not it gives me the answer. That alone, and nothing else (agent modes, AI editing or generating files), is enough to increase my output. I get answers 10x faster than I used to. I'm not sure what that has to do with the point about learning. Getting answers to those questions is learning, regardless of where the answer comes from.

replies(13): >>44296120 #>>44296159 #>>44296324 #>>44296351 #>>44296416 #>>44296810 #>>44296818 #>>44297019 #>>44297098 #>>44298720 #>>44299945 #>>44300631 #>>44301438 #
plasticeagle ◴[] No.44296416[source]
ChatGPT and Gemini only know the answer because they read Stack Overflow, and Stack Overflow only exists because it has visitors.

What do you think will happen when everyone is using AI tools to answer their questions? We'll be back in the world of encyclopedias, in which central authorities spent large amounts of money manually collecting information and publishing it. And then they spent a good amount of time finding ways to sell that information to us, which was only fair, because they spent all that time collating it. The internet pretty much destroyed that business model, and in some sense the AI "revolution" is trying to bring it back.

Also, he's specifically talking about having a coding tool write the code for you, he's not talking about using an AI tool to answer a question, so that you can go ahead and write the code yourself. These are different things, and he is treating them differently.

replies(8): >>44296713 #>>44296870 #>>44297074 #>>44299662 #>>44300158 #>>44300604 #>>44300688 #>>44301747 #
olmo23 ◴[] No.44296870[source]
Where does the knowledge come from? People can only post to SO if they've read the code or the documentation. I don't see why LLMs couldn't do that.
replies(1): >>44297034 #
nobunaga ◴[] No.44297034[source]
ITT: people who think LLMs are AGI and can produce output out of thin air or by doing original research. Go ask someone who is actually an expert in this field how LLMs work and why the training data is so important. I'm amazed that people in the CS industry talk like they know everything about a technology after merely using it, without ever writing a line of code for an LLM. Our industry is doomed with people like this.
replies(3): >>44297145 #>>44297677 #>>44298679 #
raincole ◴[] No.44298679[source]
> LLM has come up with out of thin air

People don't think that. Especially not the commenter you replied to. You're human-hallucinating.

People think LLMs are trained on raw documents and code besides Stack Overflow. Which is very likely true.

replies(1): >>44298899 #
nobunaga ◴[] No.44298899[source]
Read the parent of the parent. It's about being able to answer questions that aren't in the training data. People are talking about LLMs making scientific discoveries that humans haven't. A ridiculous take. It's not possible, and with the current state of the tech it never will be. I know what LLMs are trained on. That's not the topic of conversation.