
323 points by timbilt | 1 comment
wcfrobert ◴[] No.42131165[source]
Lots of interesting debates in this thread. I think it is worth placing writing/coding tasks into two buckets. Are you producing? Or are you learning?

For example, I have zero qualms about relying on AI at work to write progress reports and code up some scripts. I know I can do it myself but why would I? I spent many years in college learning to read and write and code. AI makes me at least 2x more efficient at my job. It seems irrational not to use it. Like a farmer who tills his land by hand rather than relying on a tractor because it builds character or something. But there is something to be said about atrophy. If you don't use it, you lose it. I wonder if my coding skill will deteriorate in the years to come...

On the other hand, if you are a student trying to learn something new, relying on AI requires walking a fine line. You don't want to over-rely on AI because a certain degree of "productive struggle" is essential for learning something deeply. At the same time, if you under-rely on AI, you drastically decrease the rate at which you can learn new things.

In the old days, people were fit because of physical labor. Now people are fit because they go to the gym. I wonder if there will be an analog for intellectual work. Will people be going to "mental" gyms in the future?

replies(9): >>42131209 #>>42131502 #>>42131788 #>>42132365 #>>42133145 #>>42133517 #>>42133877 #>>42134499 #>>42136622 #
mav3ri3k ◴[] No.42132365[source]
A current 3rd-year college student here. I really want LLMs to help me learn, but so far my success rate is zero.

They often cannot generate relatively trivial code, and when they do, they cannot explain that code. For example, I was trying to learn socket programming in C. Claude generated the code, but when I started asking about specifics, it regressed hard. The code is also often more complex than it needs to be. When I'm learning a topic, I want that topic in isolation, not the most common vaguely relevant code from GitHub with all the spaghetti included.
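For contrast, here is roughly what I was hoping an LLM would give me: a bare-bones sketch of a TCP server in C, just the core socket/bind/listen/accept/read/write calls and nothing else. (The port number and the echo-one-message behavior are placeholders I picked for illustration, not anything Claude produced.)

    /* Minimal TCP server: accept one connection, echo one message, exit. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>

    int main(void) {
        /* 1. Create an IPv4 TCP socket. */
        int server_fd = socket(AF_INET, SOCK_STREAM, 0);
        if (server_fd < 0) { perror("socket"); exit(1); }

        /* 2. Bind it to port 8080 on all local interfaces. */
        struct sockaddr_in addr = {0};
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(8080);
        if (bind(server_fd, (struct sockaddr *)&addr, sizeof addr) < 0) {
            perror("bind"); exit(1);
        }

        /* 3. Listen and block until one client connects. */
        if (listen(server_fd, 1) < 0) { perror("listen"); exit(1); }
        int client_fd = accept(server_fd, NULL, NULL);
        if (client_fd < 0) { perror("accept"); exit(1); }

        /* 4. Read one message and echo it back. */
        char buf[1024];
        ssize_t n = read(client_fd, buf, sizeof buf);
        if (n > 0) write(client_fd, buf, (size_t)n);

        close(client_fd);
        close(server_fd);
        return 0;
    }

Something that small (test it with `echo hi | nc localhost 8080`) would be a far better starting point for follow-up questions than the sprawling version I actually got.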

For other subjects, like DBMS and computer networks, you had better double-check anything conceptual, because they still make things up. I asked ChatGPT to solve a previous year's DBMS exam question, and it gave a long answer that looked good on the surface. But when I actually read through it, because I needed to understand what it was doing, there were glaring flaws. When I pointed them out, it made other mistakes.

So: LLMs struggle to generate concise, to-the-point code. They cannot explain that code. They regularly make things up. This is after trying Claude, ChatGPT, and Gemini, with their paid versions, in various capacities.

My bottom line is that I should NEVER use an LLM to learn. There is no fine line here. I have tried again and again, because tech bros keep preaching about sparks of AGI and building startups with zero coding skills. They are either fools or geniuses.

LLMs are useful strictly if you already know what you are doing. That's when the productivity gains actually materialize.

replies(4): >>42132578 #>>42132722 #>>42134012 #>>42134414 #
fragmede ◴[] No.42132578[source]
Care to share any of these chats?