There seems to be a convincing debunking thread on Twitter, but I definitely don't have the chops to evaluate either claim:
https://twitter.com/Ethan_smith_20/status/175306260429219874...
replies(3):
I think what would happen if this problem were fixed is that the VAE would produce less appealing, blurrier images. This is a classic problem with VAEs: more mathematically correct, but less visually appealing.
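For concreteness, here's a minimal sketch of the usual VAE objective in PyTorch (the function name and the beta weight are my own labels, not anything from the thread). The pixel-wise reconstruction term is the source of the blur, and beta controls how hard the posterior is pulled toward the prior:

    import torch
    import torch.nn.functional as F

    def vae_loss(recon_x, x, mu, logvar, beta=1.0):
        # Pixel-based reconstruction term: averaging over plausible
        # outputs under this loss is what produces blurry samples
        recon = F.mse_loss(recon_x, x, reduction="sum")
        # Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        return recon + beta * kl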
With an appropriate GAN loss, you will instead get a plausible, sharp image that differs more and more from the original the more you weight the KL loss term. A classic GAN that samples from the normal distribution in fact has the best possible KL divergence loss, and none of the blurriness from a VAE's pixel-based loss.
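To make that last claim concrete, here's a small sketch (my own illustration, assuming diagonal-Gaussian posteriors): a VAE encoder's outputs only approximate N(0, I), so its KL term is positive, while a GAN draws its latent from N(0, I) exactly, which drives the same KL to zero:

    import torch

    def kl_to_standard_normal(mu, logvar):
        # Closed-form KL(N(mu, diag(exp(logvar))) || N(0, I)) per sample
        return -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1)

    # A VAE encoder's mu/logvar only approximate the prior, so KL > 0
    mu, logvar = 0.3 * torch.randn(4, 16), 0.1 * torch.randn(4, 16)
    print(kl_to_standard_normal(mu, logvar))   # small positive values

    # A GAN samples z from N(0, I) directly: mu = 0, logvar = 0
    zeros = torch.zeros(4, 16)
    print(kl_to_standard_normal(zeros, zeros)) # exactly 0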