richdougherty
Kudos to the dev for coming up with the eye-position correction solution.

Building further on this idea, I wonder if instead of changing the image to look at the camera, we could change the "camera" to be where we're looking.

In other words, we could simulate a virtual camera somewhere on the screen, perhaps over the eyes of the person talking.

We could simulate the virtual camera by taking the image from the real camera (or cameras), constructing a 3D representation of ourselves, and re-rendering it from the virtual camera's location.
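
To make that a bit more concrete, here's a minimal sketch of the re-rendering step, assuming we already have a per-pixel depth map for the webcam frame (from a stereo pair, a depth sensor, or a monocular depth estimator). The intrinsics `K` and the virtual-camera pose `T_virt` are illustrative placeholders, not values from any real device:

```python
import numpy as np

def reproject_to_virtual_camera(rgb, depth, K, T_virt):
    """Unproject an RGB-D frame into a point cloud, then re-render it
    from a virtual camera given by the 4x4 pose T_virt (real -> virtual).

    rgb:   (H, W, 3) image from the real webcam
    depth: (H, W) per-pixel depth in metres (assumed known here)
    K:     (3, 3) camera intrinsics (assumed shared by both cameras)
    """
    H, W = depth.shape

    # Pixel grid -> homogeneous pixel coordinates
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).astype(np.float64)

    # Unproject to 3D points in the real camera's frame
    rays = pix @ np.linalg.inv(K).T          # (N, 3) viewing rays
    pts = rays * depth.reshape(-1, 1)        # scale rays by depth

    # Move the points into the virtual camera's frame
    pts_h = np.concatenate([pts, np.ones((pts.shape[0], 1))], axis=1)
    pts_virt = (pts_h @ T_virt.T)[:, :3]

    # Project back onto the virtual image plane
    proj = pts_virt @ K.T
    z = proj[:, 2:3]
    uv = proj[:, :2] / np.clip(z, 1e-6, None)

    # Naive z-buffered splat of the colours
    out = np.zeros_like(rgb)
    zbuf = np.full((H, W), np.inf)
    for (x, y), d, c in zip(uv.astype(int), z.ravel(), rgb.reshape(-1, 3)):
        if 0 <= x < W and 0 <= y < H and 0 < d < zbuf[y, x]:
            zbuf[y, x] = d
            out[y, x] = c
    return out
```

A real system would also have to fill the holes this naive splat leaves wherever the real camera couldn't see the face, which is where learned 3D reconstruction comes in.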

I think this would be really cool. It would be like there was a camera in the centre of our screen. We could stop worrying about looking at the camera and look at the person talking.

Of course this is all very tricky, but it does feel possible right now. I think the Apple Vision Pro might do something similar already?

newaccount74
There is already a lot of research on the 3D reconstruction and camera movement part, for example this SIGGRAPH 2023 paper: https://research.nvidia.com/labs/nxp/lp3d/

In order for this to work for gaze correction, you'd probably need to take into account the position of the camera relative to where the other person's eyes appear on the screen, and then correct for how the other person is holding their phone. It would probably only work for one-on-one calls, and you'd need to know the device's geometry (camera parameters, screen size, position of the camera relative to the screen).
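
As a rough sense of the scale of the problem, here's a back-of-the-envelope estimate of the gaze error you'd be correcting; every number below is made up for illustration rather than taken from a real device:

```python
import math

# Angle between "looking at the other person's eyes on screen" and
# "looking at the camera". All values are illustrative, not real specs.
camera_offset_m = 0.03     # camera sits ~3 cm above the top of the display
eyes_on_screen_m = 0.08    # remote person's eyes drawn ~8 cm below the top edge
viewing_distance_m = 0.35  # how far the user holds the phone from their face

gaze_error_rad = math.atan2(camera_offset_m + eyes_on_screen_m, viewing_distance_m)
print(f"gaze error: {math.degrees(gaze_error_rad):.1f} degrees")
# ~17 degrees with these numbers -- large enough to be obvious on a call,
# and it changes as the eyes move around the screen or as either person
# moves their phone, which is why the screen geometry matters.
```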

Would be amazing, not sure how realistic it is.