
1480 points sandslash | 2 comments
darqis ◴[] No.44317373[source]
When I started coding at the age of 11 in machine code and assembly on the C64, the dream was to create software that creates software. Nowadays it's almost reality; almost, because the devil is always in the details. When you're used to writing code, writing code is relatively fast, and you need that knowledge to debug issues with generated code. Yet now you're telling the AI to fix the bugs in its own generated code. I see it kind of like layers: machine code gets overlaid with assembly, which gets overlaid with C or some other higher-level language, which then adopts methodologies like MVC, and on top of all that there is now the AI input and generation layer.

But it's not widely available. Affording more than one computer is a luxury; many households are struggling just to get by. When you see those setups with five or seven Mac Minis, which average Joe can afford that, or even has the knowledge to set up an LLM at home? I don't. This is a toy for rich people, just like public clouds such as AWS and GCP, which I left out because the cost is too high; running my own is also too expensive, and there are cheaper alternatives that not only cost less but also have far less overhead.

What would be interesting to see is what those kids produced with their vibe coding.

replies(5): >>44317396 #>>44317699 #>>44318049 #>>44319693 #>>44321408 #
infecto ◴[] No.44318049[source]
This is most definitely not a toy for rich people. Depending on your country it might count as expensive, but I would comfortably say that for most of the developed world the cost of these tools is absolutely attainable; there is a reason ChatGPT has such a large subscriber base.

There's also a disconnect for me when I think back on the cost of electronics: prices for a given level of compute have generally fallen significantly over time. The C64 launched at around the $500-600 price level, not adjusted for inflation (close to $2,000 in today's dollars). You can go and buy a Mac mini for that launch price today.

replies(2): >>44318720 #>>44320337 #
1. bawana ◴[] No.44318720[source]
I suspect that economies of scale are different for software and hardware. With hardware, iteration results in optimization of the supply chain, volume discounts since the marginal cost is so much lower than the fixed cost, and lower prices over time; the purpose of the device stays fixed. With software, the product becomes ever more complex and accrues technical debt: featuritis, patches, bugs, vulnerabilities, and an evolution of purpose that tries to pull more disparate functions under one environment in order to capture and lock in users. Prices tend to increase over time. (This trajectory, incidentally, is the opposite of the Unix philosophy of multiple small, fast, independent tools that can be chained together to achieve a purpose.) At equilibrium this yields ever-increasing profits for software and decreasing profits for hardware.

In the development of AI we are already seeing this: first we had GPT, then chatbots, then agents, now integration with existing software architectures. Not only is each model ever larger and more complex (RNN -> transformer -> multi-head attention -> add fine-tuning/LoRA -> add MCP), but the bean counters will find ways to make you pay for each added feature. And bugs will multiply: prompt injection attacks are already a concern, so now another layer is needed to mitigate them.

For the general public, these increasing costs will be subsidized by advertising. I can't wait for ads to start appearing in ChatGPT; it will be very insidious, as the advertising will be commingled with the output, so there will be no way to avoid it.

replies(1): >>44326906 #
2. infecto ◴[] No.44326906[source]
I’m struggling to follow your argument; it feels more speculative than evidence-based. Runtime costs have consistently fallen.

As for advertising, it’s possible, but with so many competitors and few defensible moats, there’s real pressure to stay ad-free. These tools are also positioned to command pricing power in a way search never was, given search has been free for decades.

The hardware vs. software angle seems like a distraction. My original point was in response to the claim that LLMs are “toys for the rich.” The C64 was a rich kid’s toy too—and far less capable.