
142 points markisus | 1 comment | HN request time: 0.217s | source

LiveSplat is a system for turning RGBD camera streams into Gaussian splat scenes in real-time. The system works by passing the RGBD frames into a feed-forward neural net that outputs the current scene as Gaussian splats, which are then rendered in real-time. I've put together a demo video at the link above.
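The per-frame lifting step can be sketched roughly as follows. This is a minimal NumPy back-projection standing in for the feed-forward net: where the real system would predict per-splat attributes, this sketch just back-projects pixels to splat centers and uses fixed scale and opacity. All names, the intrinsics layout, and the fixed attributes are assumptions for illustration, not LiveSplat's actual code.

```python
import numpy as np

def rgbd_to_splats(rgb, depth, intrinsics):
    """Hypothetical sketch: lift each RGBD pixel to a 3D Gaussian splat.

    A feed-forward net would predict per-splat scale, rotation, and
    opacity; here we only back-project pixels to 3D means and attach
    fixed placeholder attributes.
    """
    h, w = depth.shape
    fx, fy, cx, cy = intrinsics
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (us - cx) * z / fx                       # pinhole back-projection
    y = (vs - cy) * z / fy
    means = np.stack([x, y, z], axis=-1).reshape(-1, 3)  # splat centers
    colors = rgb.reshape(-1, 3)                  # per-splat color from RGB
    scales = np.full((means.shape[0], 3), 0.01)  # fixed isotropic scale
    opacities = np.ones((means.shape[0], 1))     # fully opaque placeholder
    return means, colors, scales, opacities
```

In a real-time setting this runs once per incoming frame, so the splat set is regenerated continuously rather than accumulated.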
smusamashah ◴[] No.43998356[source]
The demo video does not show constructing 3D from the input. Is it possible to do something like that with this? Take a continuous feed of a static scene and keep improving the 3D view?

This is what I thought from the title, but the demo video is just a continuously changing stream of points/splats alongside the video.

1. markisus ◴[] No.43998462[source]
If the scene is static, the normal Gaussian splatting pipeline will give much better results. You take a bunch of photos and then let the optimizer run for a while to create the scene.