
196 points zmccormick7 | 1 comment
1. heyrhett No.45393215
If context is the bottleneck, MCP is dead.

MCP tool definitions alone can eat 10k tokens of context, and everything good happens in the first 100k tokens.

It's more context-efficient to write a custom binary and prompt the LLM on how to use it when needed.
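
To make that concrete, here is a minimal sketch of the idea (the file name search.py, its flags, and the stubbed backend are all hypothetical): a single-purpose CLI the model can shell out to on demand, instead of a full MCP tool schema carried in every request.

    #!/usr/bin/env python3
    # search.py -- hypothetical single-purpose CLI the model invokes via shell
    # when it needs to look something up, instead of a persistent MCP tool schema.
    import argparse
    import json
    import sys

    def search(query: str, limit: int) -> list[dict]:
        # Placeholder backend; swap in whatever the MCP server was wrapping.
        return [{"title": f"result for {query}", "rank": i} for i in range(limit)]

    def main() -> None:
        parser = argparse.ArgumentParser(
            description="Search the internal index and print JSON results."
        )
        parser.add_argument("query", help="search terms")
        parser.add_argument("--limit", type=int, default=5, help="max results")
        args = parser.parse_args()
        json.dump(search(args.query, args.limit), sys.stdout, indent=2)

    if __name__ == "__main__":
        main()

The only thing the prompt then needs is a one-line hint along the lines of: when you need to look something up, run `python search.py "<query>" --limit 5` and read the JSON it prints. That hint costs a few dozen tokens, versus the thousands a full set of tool definitions can occupy.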