Asynchronous reprojection

From HandWiki

Asynchronous reprojection is a class of computer graphics technologies aimed at ensuring a virtual reality headset's responsiveness to user motion even when the GPU cannot keep up with the headset's target frame rate.[1] Reprojection involves the headset's driver taking one or more previously rendered frames and using newer motion information from the headset's sensors to extrapolate (often referred to as "reprojecting" or "warping") the previous frame into a prediction of what a normally rendered frame would look like.[2] "Asynchronous" refers to this process running continuously in parallel with rendering, allowing a reprojected frame to be displayed without delay whenever a regular frame is not rendered in time.[2]
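As a rough illustration of the "warping" step for a pure head rotation (a minimal sketch under an assumed pinhole-camera model; the function names and intrinsics below are illustrative, not any vendor's implementation), the reprojection reduces to a planar homography built from the camera intrinsics and the rotation that occurred since the frame was rendered:

```python
import numpy as np

def rotation_y(angle_rad):
    """Rotation matrix for a yaw (head turn) about the vertical axis."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])

def reprojection_homography(K, R_render, R_display):
    """Homography mapping pixels of the frame rendered at orientation
    R_render to where they should appear at orientation R_display.
    Derivation: p_render = K R_render d for a view direction d, hence
    p_display = K R_display R_render^T K^-1 p_render."""
    return K @ R_display @ R_render.T @ np.linalg.inv(K)

def warp_pixel(H, x, y):
    """Apply the homography to one pixel (homogeneous coordinates)."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Illustrative intrinsics: 500 px focal length, principal point (320, 240).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Head has yawed ~0.57 degrees since the frame was rendered.
H = reprojection_homography(K, np.eye(3), rotation_y(0.01))
x, y = warp_pixel(H, 320.0, 240.0)  # center pixel shifts ~5 px horizontally
```

In a real compositor this warp is applied to the whole frame on the GPU rather than per pixel, but the geometry is the same: rotation-only reprojection needs no depth information, which is why it cannot compensate for translational head movement.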

The use of these techniques lowers the video rendering hardware specifications required to achieve a given level of responsiveness.[3]

Variations

Various vendors have implemented their own variations of the technique under different names. Basic versions are referred to as asynchronous reprojection by Google and Valve,[1][4] while Oculus has two implementations, called asynchronous timewarp[2] and asynchronous spacewarp. Asynchronous timewarp uses the headset's rotational data to extrapolate a new frame from the last frame it received. Asynchronous spacewarp additionally uses depth information to help compensate for perspective and other geometric changes.[5][6][7] Valve's early variant, interleaved reprojection, would force the application to run at half frame rate and reproject every other frame.[8] A later variant by Valve, SteamVR Motion Smoothing, builds on regular asynchronous reprojection and can reproject two frames instead of one.[4]

References