
152 points GavinAnderegg | 6 comments
1. cmrdporcupine ◴[] No.44455776[source]
I get a lot of value out of Claude Max at $100 USD/month. I use it almost exclusively for my personal open source projects. For work, I'm more cautious.

I worry that, with an article like this floating around, with this as the competition, and with the economics of all this stuff generally, major price increases are on the horizon.

Some businesses can afford this; after all, it's still just a fraction of a SWE salary (though $1000/month is getting up there). But open source developers cannot.

I worry about this trend, and about when the other shoe will drop, at least on Anthropic's products.

replies(4): >>44455817 #>>44456022 #>>44456364 #>>44456374 #
2. mring33621 ◴[] No.44455817[source]
Those market forces will push the thriftier devs to find better ways to use the lesser models. And they will probably share their improvements!

I'm very bullish on the future of smaller, locally-run models, myself.

replies(1): >>44455844 #
3. cmrdporcupine ◴[] No.44455844[source]
I have not invested time in locally-run models; I'm curious whether they could even come close to the value of Sonnet 4 or Opus.

That said, I suspect a lot of the value in Claude Code is hand-rolled, fine-tuned heuristics built into the tool itself, not coming from the LLM. It does a lot of managing TODO lists, backtracking through failed paths, etc., which looks more like old-school symbolic AI than something the LLM is doing on its own.

Replicating that will also be required.
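A minimal sketch of that kind of harness logic, with a hypothetical `attempt` callback standing in for the LLM call. All names here are illustrative, not Claude Code's actual internals: the point is that the loop, not the model, keeps the explicit TODO list, remembers failed paths, and decides when to retry or give up.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """One TODO item plus a record of its failed attempts."""
    description: str
    failures: list = field(default_factory=list)

def run_todo_list(tasks, attempt, max_retries=2):
    """Drive tasks to completion, feeding failure history back on retries.

    `attempt(task)` returns a result on success or raises on failure; in a
    real tool it would be an LLM call whose prompt includes task.failures.
    The retry/backtrack bookkeeping here is plain symbolic control flow.
    """
    done, abandoned = [], []
    for task in tasks:
        for _ in range(max_retries + 1):
            try:
                done.append((task.description, attempt(task)))
                break
            except Exception as exc:
                task.failures.append(str(exc))  # remember the dead end
        else:
            abandoned.append(task.description)  # budget exhausted
    return done, abandoned

# Toy demo: "write tests" fails once before succeeding.
flaky = {"write tests": 1}
def attempt(task):
    if flaky.get(task.description, 0) > len(task.failures):
        raise RuntimeError("compile error")
    return "ok"

done, abandoned = run_todo_list([Task("write tests"), Task("fix lint")], attempt)
```

Replicating the model alone wouldn't reproduce this; the bookkeeping layer has to be rebuilt too.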

4. csomar ◴[] No.44456022[source]
If it weren't for the Chinese, the prices would be 10x what they are.
5. barrkel ◴[] No.44456364[source]
Where do you see the major price increases coming from?

The underlying inference is not super expensive. All the tricks they're pulling to make it smarter certainly multiply the cost, but the price being charged almost certainly covers it. Basic inference on tuned base models is extremely cheap. But it certainly looks like Anthropic > OpenAI > Google in terms of inference cost structure.

Prices will only go up if there's a profit opportunity, i.e. if one of the vendors gains a clear edge and substantial pricing power. I don't think that's the case at this point. This article is already equivocating between o3 and Opus.

6. stpedgwdgfhgdd ◴[] No.44456374[source]
Just a matter of time before AI coding becomes a commodity and prices drop. 2027