An eye-tracking technology company had a Unity application that provided a partial, physically inaccurate 3D parallax experience. We built a physically accurate virtual camera calculation system — a 6DOF SDK extension that creates a true "window into a virtual world" effect based on real-time head tracking.
The client provides an SDK that tracks head and eye position in 3D space in real time. Their existing "Spatial Parallax" Unity application provided only a partial 3D experience. Key gaps: the field of view did not adapt to observer distance, the display's physical dimensions were ignored, and there was no compensation for the observer's position and orientation relative to the display. This prevented the immersive "window into a virtual world" experience the technology is capable of delivering.
A physically accurate virtual camera calculation system integrated into the client's existing SDK:
1. Observer-equivalent virtual camera — 6DOF calculation (position and look-at vector) based on real-time head tracking data, so the rendered scene responds exactly as if the display were a physical window.
2. Dynamic field-of-view — calculated based on the intersection of the observer's visual frustum with the screen plane, updating in real time as the user moves.
3. Frame buffer rectification — a transformation matrix that corrects the rendered image for the observer's actual position and orientation relative to the physical display.
4. SDK integration — delivered as a production extension of the client's existing SDK, not a standalone prototype.
5. Example application — extended the existing Spatial Parallax Unity app to demonstrate the Virtual View capability.
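The core of items 1 and 2 can be sketched with the standard off-axis ("generalized") perspective projection: given the tracked eye position and the physical positions of three screen corners in a shared coordinate frame, it produces an asymmetric view frustum whose shape, and therefore field of view, changes as the observer moves. This is a minimal illustrative sketch, not the client SDK's actual code; the function name, corner convention, and OpenGL-style matrix layout are assumptions.

```python
import math

# Small vector helpers so the sketch stays self-contained.
def sub(a, b): return [x - y for x, y in zip(a, b)]
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]
def normalize(v):
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

def window_projection(eye, ll, lr, ul, near=0.1, far=100.0):
    """Off-axis projection making the display act like a physical window.

    eye, ll, lr, ul: observer eye position and the display's lower-left,
    lower-right, and upper-left corners, all in the same physical frame
    (e.g. metres). Returns a 4x4 OpenGL-style projection matrix whose
    frustum is asymmetric whenever the eye is off-centre.
    """
    vr = normalize(sub(lr, ll))    # screen right axis
    vu = normalize(sub(ul, ll))    # screen up axis
    vn = normalize(cross(vr, vu))  # screen normal, pointing at the viewer

    # Vectors from the eye to the screen corners.
    va, vb, vc = sub(ll, eye), sub(lr, eye), sub(ul, eye)
    d = -dot(va, vn)               # perpendicular eye-to-screen distance

    # Frustum extents projected onto the near plane; these shift and
    # scale with the observer's position and distance (dynamic FOV).
    l = dot(vr, va) * near / d
    r = dot(vr, vb) * near / d
    b = dot(vu, va) * near / d
    t = dot(vu, vc) * near / d

    # Standard asymmetric frustum matrix (as in glFrustum).
    return [
        [2*near/(r-l), 0.0,          (r+l)/(r-l),            0.0],
        [0.0,          2*near/(t-b), (t+b)/(t-b),            0.0],
        [0.0,          0.0,          -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0,          0.0,          -1.0,                   0.0],
    ]
```

With the eye centred in front of the display the matrix reduces to an ordinary symmetric frustum; moving the head sideways skews it, which is exactly the "window" effect. In Unity this kind of matrix would typically be assigned to `Camera.projectionMatrix` each frame.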
- Physically accurate 6DOF virtual camera from real-time head tracking
- Dynamic field-of-view based on observer distance and display dimensions
- Frame buffer rectification for observer position and orientation
- Real-time performance that preserves the SDK's low-latency head tracking
- Delivered as a production SDK extension
- Unity example application demonstrating Virtual View
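The dynamic field-of-view behaviour listed above reduces, for a head-on observer, to the angle the display subtends at the observer's distance: fov = 2·atan(h / 2d). A hedged one-liner (illustrative name, not an SDK API):

```python
import math

def vertical_fov_deg(display_height_m, eye_distance_m):
    """Vertical FOV subtended by a display of physical height h (metres)
    viewed head-on from distance d (metres): 2 * atan(h / 2d)."""
    return math.degrees(2.0 * math.atan2(display_height_m / 2.0,
                                         eye_distance_m))
```

This is why both the display's physical dimensions and the observer's distance are required inputs: the same pixels subtend a wider angle as the observer leans in, so the virtual camera's FOV must widen to match.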
The SDK extension enables physically accurate 3D rendering from the observer's actual viewpoint, creating the effect of looking through a real window into a virtual space. Delivered as a production-ready SDK extension, not a research prototype.