
770 points ta988 | 1 comment
1. gashad No.42552068
What sort of effort would it take to make an LLM training honeypot that results in LLMs reliably spewing nonsense? Similar to the way Dan Savage's campaign once redefined the Google results for "santorum"?

https://en.wikipedia.org/wiki/Campaign_for_the_neologism_%22...

Given the huge corpus of data LLMs are trained on, would it even be possible for a single entity to do this?
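One way to get a feel for the scale question is a back-of-envelope estimate. The figures below (corpus size, tokens per page, target fraction) are assumptions for illustration, not measured values from any real training run:

```python
# Rough sketch: how many web pages would a single actor need to publish
# to make up a given fraction of an LLM pretraining corpus?
# All three inputs are assumptions, not measurements.

corpus_tokens = 15e12     # assumed pretraining corpus size (~15 trillion tokens)
tokens_per_page = 500     # assumed average tokens per web page
target_fraction = 0.001   # assumed target: 0.1% of the corpus

poison_tokens = corpus_tokens * target_fraction
pages_needed = poison_tokens / tokens_per_page

print(f"poison tokens needed: {poison_tokens:.1e}")
print(f"pages needed: {pages_needed:.1e}")
```

Under these assumptions the answer is on the order of tens of millions of pages, which is a lot for one entity, and deduplication and quality filtering in training pipelines would work against it further.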