    395 points by pseudolus | 15 comments
    1. ilrwbwrkhv ◴[] No.43633555[source]
    The AI bubble seems close to collapsing. God knows how many billions have been invested, and we still don't have an actual use case for AI that is good for humanity.
    replies(4): >>43633642 #>>43633992 #>>43634036 #>>43640570 #
    2. boredemployee ◴[] No.43633642[source]
    I think I understand what you're trying to say.

    We certainly improve productivity, but that is not necessarily good for humanity. It could even make things worse.

    i.e.: my company already expects tasks to take less time, given that they _know_ I'll probably use some AI to do them. Which means I can humanly handle more context in a given week if the metric is "labour", but you end up with your brain completely melted.

    replies(2): >>43633701 #>>43633863 #
    3. bluefirebrand ◴[] No.43633701[source]
    > We certainly improve productivity

    I think this is really still up for debate

    We certainly produce more output, but if it's of lower overall quality than before, is that really "improved productivity"?

    There has to be a tipping point somewhere, where faster output of low-quality work actually decreases productivity because of the effort now required to keep the tower of garbage from toppling.

    replies(1): >>43634003 #
    4. DickingAround ◴[] No.43633863[source]
    I think the core of the 'improved productivity' question will ultimately be impossible to answer. We would want to know whether productivity improved over the lifetime of a society, perhaps hundreds of years. We will have no clear A/B test from which to draw causal relationships.
    replies(1): >>43634040 #
    5. amiantos ◴[] No.43633992[source]
    Your comment appears to be composed almost entirely of vague and ambiguous statements.

    "AI bubble seems close to collapsing" in response to an article about AI being used as a study aid. Does not seem relevant to the actual content of the post at all, and you do not provide any proof or explanation for this statement.

    "God knows how many billions have been invested", I am pretty sure it's actually not that difficult to figure out how much investor money has been poured into AI, and this still seems totally irrelevant to a blog post about AI being used as a study aid. Humans 'pour' billions of dollars into all sorts of things, some of which don't work out. What's the suggestion here, that all the money was wasted? Do you have evidence of that?

    "We still don't have an actual use case for AI which is good for humanity"... What? We have a lot of use cases for AI, some of which are good for humanity. Like, perhaps, as a study aid.

    Are you just typing random sentences into the HN comment box every time you are triggered by the mention of AI? Your post is nonsense.

    6. fourseventy ◴[] No.43634003{3}[source]
    It's not up for debate. Ask any programmer if LLMs improve productivity and the answer is 100% yes.
    replies(3): >>43634049 #>>43634190 #>>43643260 #
    7. papichulo2023 ◴[] No.43634036[source]
    It is helping me do, in just a few minutes, projects that would otherwise take me hours, soooo, shrug.
    replies(1): >>43634586 #
    8. AlexandrB ◴[] No.43634040{3}[source]
    This is exactly right. It also depends on how all the AGI promises shake out. If AGI really does emerge soon, it might not matter anymore whether students have any foundational knowledge. On the other hand, if you still need people to know stuff in the future, we might be creating a generation of citizens incapable of doing the job. That could be catastrophic in the long term.
    9. AlexandrB ◴[] No.43634049{4}[source]
    Meanwhile in this article/thread you have a bunch of programmers complaining that LLMs don't improve overall productivity: https://news.ycombinator.com/item?id=43633288
    10. bluefirebrand ◴[] No.43634190{4}[source]
    I am a programmer and my opinion is that all of the AI tooling my company is making me use gets in the way about as often as it helps. It's probably a net negative overall, because any code it produces takes me longer to review and check for correctness than it would take to just write it myself.

    Does my opinion count?

    11. user432678 ◴[] No.43634586[source]
    What kind of projects are those? I am genuinely curious. I was excited by AI, Claude specifically, since I am an avid procrastinator and would love to finish the tens of projects I have in mind. Most of those projects are games with specific constraints. I got disenchanted pretty quickly when I started actually using AI to help with different parts of the game programming. The majority of the problems I had were related to a poor understanding of the generated code. I mean yes, I read the code and fixed minor issues, but it always feels like I haven't really internalised those parts of the game, which slows me down quite significantly in the long run, when I need to plan major changes. Probably a skill issue, but for now the only thing AI is helpful for is populating Jira descriptions for my "big picture refactoring" work. That's basically it.
    replies(2): >>43634689 #>>43640298 #
    12. noman-land ◴[] No.43634689{3}[source]
    I was able to use llama.cpp and whisper.cpp to help me build a transcription site for my favorite podcast[0]. I'm a total Python noob and hadn't really used SQLite before, or really used AI before, but with these tools, completely offline, llama.cpp helped me write a bunch of Python and SQL to get the job done. It was incredibly fun and rewarding and, most importantly, it got rid of the dread of not knowing.

    0 - https://transcript.fish
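
    As a rough illustration (not the commenter's actual code), the kind of pipeline described above can be sketched in a few dozen lines of Python: shell out to the whisper.cpp CLI to transcribe an episode, then store the text in SQLite for later lookup. The binary path, model file, CLI flags, table schema, and function names here are all assumptions to adapt to your own setup.

        # Minimal sketch: transcribe podcast audio with whisper.cpp and keep the
        # results in SQLite. Paths, flags, and schema are illustrative assumptions.
        import sqlite3
        import subprocess
        from pathlib import Path

        WHISPER_BIN = "./whisper.cpp/main"    # hypothetical path to the whisper.cpp CLI
        MODEL = "./models/ggml-base.en.bin"   # hypothetical model file
        DB_PATH = "transcripts.db"

        def init_db(path: str = DB_PATH) -> sqlite3.Connection:
            conn = sqlite3.connect(path)
            conn.execute(
                "CREATE TABLE IF NOT EXISTS episodes ("
                "id INTEGER PRIMARY KEY, title TEXT NOT NULL, transcript TEXT NOT NULL)"
            )
            return conn

        def transcribe(audio: Path) -> str:
            # Ask whisper.cpp to write a plain-text transcript next to the audio file;
            # the exact output filename depends on your whisper.cpp version and flags.
            subprocess.run(
                [WHISPER_BIN, "-m", MODEL, "-f", str(audio), "-otxt"],
                check=True,
            )
            return Path(str(audio) + ".txt").read_text()

        def add_episode(conn: sqlite3.Connection, title: str, audio: Path) -> None:
            conn.execute(
                "INSERT INTO episodes (title, transcript) VALUES (?, ?)",
                (title, transcribe(audio)),
            )
            conn.commit()

        if __name__ == "__main__":
            conn = init_db()
            add_episode(conn, "Episode 1", Path("episode1.wav"))
            # Naive substring search over everything transcribed so far.
            for title, snippet in conn.execute(
                "SELECT title, substr(transcript, 1, 80) FROM episodes "
                "WHERE transcript LIKE ?", ("%example%",)
            ):
                print(title, "->", snippet)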

    13. protocolture ◴[] No.43640298{3}[source]
    AI is really good at coming up with solutions to already-solved problems. Which, if you look at the Unity store, is something in incredibly high demand.

    This frees you up to work on the crunchy unsolved problems.

    14. azemetre ◴[] No.43640570[source]
    We must create God in order to enslave it and force it to summarize our emails.
    15. globnomulous ◴[] No.43643260{4}[source]
    > It's not up for debate. Ask any programmer if LLMs improve productivity and the answer is 100% yes.

    Programmer here. The answer is 100% no. The programmers who think they're saving time are racking up debts they'll pay later.

    The debts will come due when they find they've learned nothing about a problem space and failed to become experts in it, despite having "written" the feature that deals with it and despite owning it.

    Or they'll come due as their failure to hone their skills in technical problem solving catches up to them.

    Or they'll come due when they have to fix a bug that the LLM produced: either they'll have no idea how, or they'll manage to fix it but then have to explain, to a manager or customer, that they committed code to the codebase that they didn't understand.