
2127 points bakugo | 4 comments
simonw ◴[] No.43175433[source]
I got this working with my LLM tool (new plugin version: llm-anthropic 0.14) and figured out a bunch of things about the model in the process. My detailed notes are here: https://simonwillison.net/2025/Feb/25/llm-anthropic-014/

One of the most exciting new capabilities is that this model has a 128,000 token output limit - up from just 8,192 for the previous Claude 3.5 Sonnet model and way higher than any other model in the space.

It seems to be able to use that output limit effectively. Here's my longest result so far, though it did take 27 minutes to finish! https://gist.github.com/simonw/854474b050b630144beebf06ec4a2...

replies(3): >>43175527 #>>43175552 #>>43183287 #
Citizen_Lame ◴[] No.43175527[source]
How much did it cost?
replies(1): >>43175562 #
1. mrbonner ◴[] No.43175562[source]
$1.80
replies(1): >>43176004 #
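As a rough sanity check on that figure, here is a back-of-the-envelope sketch assuming Claude 3.7 Sonnet's published pricing of $3 per million input tokens and $15 per million output tokens (thinking tokens are billed as output). The token counts below are illustrative guesses, not numbers from the actual run:

```python
# Claude 3.7 Sonnet list pricing (USD per million tokens).
INPUT_PRICE_PER_M = 3.00
OUTPUT_PRICE_PER_M = 15.00

def cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated API cost in dollars for one request."""
    return (input_tokens / 1_000_000 * INPUT_PRICE_PER_M
            + output_tokens / 1_000_000 * OUTPUT_PRICE_PER_M)

# A long-output run is dominated by the output side: ~115k output
# tokens plus a modest prompt lands close to the $1.80 quoted above.
print(f"${cost(10_000, 115_000):.2f}")
```

At these prices the output tokens account for nearly all of the cost, which is why a maximum-length response runs to dollars rather than cents.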
2. rvnx ◴[] No.43176004[source]
I have a similarly long request to run. Did you use a specific CLI tool? (Thank you in advance.)
replies(1): >>43176599 #
3. simonw ◴[] No.43176599[source]
I used my own CLI tool, LLM, which handles these long requests in streaming mode (Anthropic won't let you make a non-streaming request for long output replies like this).

  uv tool install llm
  llm install llm-anthropic
  llm keys set anthropic
  # paste in your Anthropic API key when prompted
  # -o thinking 1 enables the model's extended thinking mode
  llm -m claude-3.7-sonnet -o thinking 1 'your prompt goes here'
replies(1): >>43177581 #
4. rvnx ◴[] No.43177581{3}[source]
Thank you very much