
142 points | markisus | 2 comments

LiveSplat is a system for turning RGBD camera streams into Gaussian splat scenes in real time. It works by passing all the RGBD frames into a feed-forward neural net that outputs the current scene as Gaussian splats, which are then rendered in real time. I've put together a demo video at the link above.
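
A minimal PyTorch sketch of the kind of feed-forward pipeline described above. Everything here is an assumption for illustration (the SplatNet name, the tiny conv stack, and the 11-channel per-pixel splat parameterization); LiveSplat's actual architecture is not described in this thread:

    # Hypothetical sketch: map an RGBD frame to per-pixel Gaussian splat
    # parameters with a small convolutional net. Not LiveSplat's real network.
    import torch
    import torch.nn as nn

    class SplatNet(nn.Module):
        """Maps an RGBD frame to per-pixel splat parameters (assumed layout:
        3 position offsets, 3 log-scales, 4 rotation quaternion, 1 opacity;
        color can be taken from the RGB input)."""
        def __init__(self, feat=32):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(4, feat, 3, padding=1), nn.ReLU(),
                nn.Conv2d(feat, feat, 3, padding=1), nn.ReLU(),
                nn.Conv2d(feat, 11, 3, padding=1),
            )

        def forward(self, rgbd):        # rgbd: (B, 4, H, W) = RGB + depth
            return self.net(rgbd)       # (B, 11, H, W) splat parameters

    net = SplatNet()
    frame = torch.rand(1, 4, 480, 640)  # one RGBD frame from the camera
    params = net(frame)
    print(params.shape)                 # torch.Size([1, 11, 480, 640])

In a real-time system the decoded splats would then be rasterized every frame, replacing the scene rather than accumulating it.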
1. whywhywhywhy (No.43995751)
Would be good to see how it differs from just applying the depth channel to the Z of the RGB pixels, because it looks very similar to that.
2. markisus (No.43995841)
The application has this feature and lets you switch back and forth. What you're describing is the standard pointcloud rendering algorithm. I have an older video where I display the corresponding pointcloud [1] in a small picture-in-picture frame so you can compare.

I actually started with pointclouds for my VR teleoperation system, but I hated how ugly they looked. You end up seeing through objects, and objects become unparseable if you get too close. Textures present in the RGB frame also become very hard to make out because everything gets "pointilized". In the linked video you can make out the wood grain direction in the splat rendering, but not in the pointcloud rendering.
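
For reference, the "standard pointcloud rendering" baseline mentioned above amounts to back-projecting each RGBD pixel through the camera intrinsics into a colored 3D point. A minimal NumPy sketch, assuming pinhole intrinsics fx, fy, cx, cy and metric depth (the function name is mine, not from LiveSplat):

    # Baseline: unproject each pixel (u, v, depth) to a colored 3D point.
    import numpy as np

    def rgbd_to_pointcloud(rgb, depth, fx, fy, cx, cy):
        """rgb: (H, W, 3) uint8, depth: (H, W) float meters
        -> (N, 6) array of xyz + normalized rgb."""
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        z = depth
        x = (u - cx) * z / fx           # pinhole back-projection
        y = (v - cy) * z / fy
        pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
        cols = rgb.reshape(-1, 3).astype(np.float32) / 255.0
        valid = pts[:, 2] > 0           # drop pixels with no depth reading
        return np.hstack([pts[valid], cols[valid]])

Rendering these points directly is what produces the artifacts described above: points don't occlude like surfaces, so you see through objects, and gaps open up between points as you move closer.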

[1] https://youtu.be/-u-e8YTt8R8?si=qBjYlvdOsUwAl5_r&t=14