
566 points | PaulHoule | 1 comment
amelius No.44490824
Damn, that is fast. But it's faster than I can read, so hopefully they can turn that speed into better output quality, because otherwise I honestly don't see the practical advantage over existing LLMs. It's like a TV with a 200Hz refresh rate when 100Hz is just fine.
replies(2): >>44491035 >>44491573
pmxi No.44491035
There are plenty of LLM use cases where the output isn’t meant to be read by a human at all, e.g.:

parsing unstructured text into structured formats like JSON (a sketch follows below)

translating between natural or programming languages

serving as a reasoning step in agentic systems

So even if it’s “too fast to read,” that speed can still be useful.
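
A minimal sketch of the first case in Python, assuming a generic HTTP completion endpoint — the URL and the call_llm helper are hypothetical stand-ins, not any particular vendor's API. The point is that the output goes straight to a program, so generation speed is the only latency that matters:

  import json
  import urllib.request

  def call_llm(prompt: str) -> str:
      # Hypothetical fast-model endpoint; substitute a real API here.
      req = urllib.request.Request(
          "https://example.com/v1/completions",
          data=json.dumps({"prompt": prompt}).encode(),
          headers={"Content-Type": "application/json"},
      )
      with urllib.request.urlopen(req) as resp:
          return json.load(resp)["text"]

  raw = "Order #123: 2x widgets, ship to Berlin by Friday"
  out = call_llm("Extract JSON with keys id, qty, city, deadline from: " + raw)
  order = json.loads(out)  # consumed by code, never read by a human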

replies(2): >>44491329 >>44495081
amelius No.44491329
Sure, but I was talking about the chat interface, sorry if that was not clear.