
454 points nathan-barry | 1 comment | source
thatguysaguy No.45645680
Back when BERT came out, everyone was trying to get it to generate text. These attempts generally didn't work, here's one for reference though: https://arxiv.org/abs/1902.04094
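The linked paper (Wang & Cho, arXiv:1902.04094) treats BERT as a Markov random field and generates by Gibbs sampling: start from an all-[MASK] sequence and repeatedly re-sample one position at a time conditioned on the rest. A minimal sketch of that sampling loop, with a hypothetical `dummy_mlm_predict` standing in for a real masked LM's conditional distribution:

```python
import random

VOCAB = ["the", "cat", "sat", "on", "mat", "dog", "ran"]  # toy vocabulary
MASK = "[MASK]"

def dummy_mlm_predict(tokens, pos):
    """Hypothetical stand-in for a masked LM: a real model would sample
    from p(token at pos | all other tokens) using BERT's logits."""
    return random.choice(VOCAB)

def gibbs_generate(length=6, sweeps=3, seed=0):
    """Start from an all-masked sequence and repeatedly re-sample one
    position at a time, Gibbs-sampling style."""
    random.seed(seed)
    tokens = [MASK] * length
    for _ in range(sweeps):
        for pos in range(length):
            tokens[pos] = dummy_mlm_predict(tokens, pos)
    return tokens

print(" ".join(gibbs_generate()))
```

With a real model in place of the stub, quality tends to depend heavily on the number of sweeps, which is part of why these attempts were slow and underwhelming.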

This doesn't have an explicit diffusion tie-in, but Savinov et al. at DeepMind figured out that unrolling two denoising steps at training time and randomizing the masking probability is enough to get it to work reasonably well.
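The two ingredients above can be sketched as a toy training-time corruption routine. This is only an illustrative sketch: `dummy_denoiser` is a hypothetical stand-in for the trained network (the real method would predict every position from context and apply a reconstruction loss at each step), and the toy character vocabulary is invented:

```python
import random

VOCAB = list("abcde")  # toy vocabulary (assumption, for illustration)
MASK = "_"

def dummy_denoiser(tokens):
    """Hypothetical stand-in for the network: fill each masked slot with a
    random guess, and occasionally revise an unmasked token to mimic a
    model that re-predicts every position."""
    return [random.choice(VOCAB) if (t == MASK or random.random() < 0.1) else t
            for t in tokens]

def training_step(clean, seed=0):
    """One training example: corrupt with a masking probability drawn
    uniformly at random, then unroll two denoising steps; the (omitted)
    loss would score both steps' predictions against `clean`."""
    random.seed(seed)
    rate = random.random()  # randomized masking probability
    corrupted = [MASK if random.random() < rate else t for t in clean]
    step1 = dummy_denoiser(corrupted)  # first denoising pass
    step2 = dummy_denoiser(step1)      # second pass, fed its own output
    return corrupted, step1, step2

corrupted, step1, step2 = training_step(list("abcab"))
```

Feeding the model its own (imperfect) first-step output is what lets it learn to refine samples at inference time, which is the informal connection to diffusion-style iterative refinement.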

1. binarymax No.45648159
Interesting, as I was in the (very large) camp that never considered it for generation and saw it as a pure encoder for tasks like semantic similarity, with an easy jump to classification, etc.