
321 points distantprovince | 1 comment | source
zer00eyz ◴[] No.44617467[source]
This is an interesting take.

Because all an LLM is, is a reflection of its input.

Garbage in, garbage out.

If we're going to have this rule about AI, maybe we should have it about... everything. From your mom's last Facebook post, to what is said by influencers to this post...

Say less. Do more.

replies(3): >>44617499 #>>44617528 #>>44617550 #
1. z3c0 ◴[] No.44617528[source]
1. z3c0 ◴[] No.44617528[source]
An LLM's output being a reflection of its input would imply determinism, which is the opposite of their value prop. "Garbage in, garbage out" is an adage born from traditional data pipelines. "Anything in, generic slop, possibly garbage, out" is the new status quo.