
371 points ulrischa | 2 comments
1. cryptoegorophy No.43237104
Just ask another LLM to proofread?
replies(1): >>43239693 #
2. namaria No.43239693
Do you realize that giving LLMs 'instructions' is merely twisting knobs blindly, by random amounts?