
454 points nathan-barry | 3 comments

thatguysaguy No.45645680
Back when BERT came out, everyone was trying to get it to generate text. These attempts generally didn't work; here's one for reference, though: https://arxiv.org/abs/1902.04094

This doesn't have an explicit diffusion tie-in, but Savinov et al. at DeepMind figured out that doing two steps at training time and randomizing the masking probability is enough to get it to work reasonably well.
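
Roughly how I understand that trick (this is just my sketch, not their code; the checkpoint, the uniform(0.1, 0.9) range, and the loop structure are all my own assumptions):

    import random
    import torch
    from transformers import BertForMaskedLM, BertTokenizerFast

    # Sketch of two-step ("unrolled") denoising with a randomized masking rate.
    tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
    model = BertForMaskedLM.from_pretrained("bert-base-uncased")
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

    def training_step(texts):
        batch = tokenizer(texts, return_tensors="pt", padding=True)
        input_ids, attention_mask = batch["input_ids"], batch["attention_mask"]
        labels = input_ids.clone()

        # Randomize the masking probability instead of fixing it at 15%.
        p = random.uniform(0.1, 0.9)
        special = (input_ids == tokenizer.cls_token_id) | (input_ids == tokenizer.sep_token_id)
        maskable = attention_mask.bool() & ~special
        mask = (torch.rand(input_ids.shape) < p) & maskable
        labels[~mask] = -100  # only score the masked positions
        corrupted = input_ids.masked_fill(mask, tokenizer.mask_token_id)

        # Step 1: ordinary masked-LM denoising loss.
        out1 = model(input_ids=corrupted, attention_mask=attention_mask, labels=labels)

        # Step 2: feed the model's own guesses back in and denoise again.
        preds = out1.logits.argmax(-1).detach()
        refed = torch.where(mask, preds, input_ids)
        out2 = model(input_ids=refed, attention_mask=attention_mask, labels=labels)

        loss = out1.loss + out2.loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
        return loss.item()

    print(training_step(["the quick brown fox jumps over the lazy dog"]))

The second pass on the model's own predictions is what I take "two steps" to mean: the model learns to clean up its own guesses, which is what you ask of it at generation time.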

1. thatjoeoverthr No.45649173
I'm just learning this from your comment, after spending the last week trying to get a BERT model to talk.

https://joecooper.me/blog/crosstalk/

I've still got a few ideas to try, though, so I'm not done having fun with it.

2. Anon84 No.45655874
The trick is to always put the [MASK] at the end:

"The [MASK]" "The quick [MASK]" etc

3. thatjoeoverthr No.45668581
I've saved this and will study it when I come back to it. Thanks!