Research into foveated rendering dates back at least to 1991.[5]
At TechCrunch Disrupt SF 2014, Fove unveiled a headset featuring foveated rendering.[6] This was followed by a successful Kickstarter campaign in May 2015.[7]
At CES 2016, SensoMotoric Instruments (SMI) demonstrated a new 250 Hz eye-tracking system and a working foveated rendering solution, developed in partnership with camera sensor manufacturer OmniVision, which provided the camera hardware for the new system.[8][9]
In July 2016, Nvidia demonstrated at SIGGRAPH a new method of foveated rendering claimed to be imperceptible to users.[1][10]
In February 2017, Qualcomm announced its Snapdragon 835 Virtual Reality Development Kit (VRDK), which includes foveated rendering support called Adreno Foveation.[11][12]
At CES 2019, on January 7, HTC announced an upcoming virtual reality headset, the Vive Pro Eye, featuring eye tracking and support for foveated rendering.[13][14]
In December 2019, Facebook's Oculus Quest SDK gave developers access to dynamic fixed foveated rendering, allowing the level of detail to be adjusted on the fly via an API.[15]
According to Oculus chief scientist Michael Abrash, combining foveated rendering with sparse rendering and deep learning image reconstruction could reduce the number of pixels that must be rendered by an order of magnitude compared with a full image.[18] These results were later demonstrated and published.[19]
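The scale of such savings can be illustrated with a back-of-the-envelope calculation. This is an illustrative sketch only: the frame size, foveal-region fraction, and peripheral shading rate below are assumed values chosen for the example, not figures from the cited work.

```python
# Illustrative estimate of pixel savings from foveated rendering.
# All parameter values are assumptions for illustration only.

def foveated_pixel_count(width, height, foveal_fraction, peripheral_rate):
    """Pixels shaded when a central region covering `foveal_fraction`
    of the frame is rendered at full resolution and the remainder is
    shaded sparsely at `peripheral_rate` (fraction of pixels shaded)."""
    total = width * height
    foveal = total * foveal_fraction                       # full-rate region
    peripheral = total * (1 - foveal_fraction) * peripheral_rate
    return foveal + peripheral

full = 2000 * 2000   # assumed per-eye full-resolution frame
fov = foveated_pixel_count(2000, 2000, 0.05, 0.05)
print(f"full: {full} px, foveated: {fov:.0f} px, savings: {full / fov:.1f}x")
```

With these assumed parameters, roughly 390,000 of 4,000,000 pixels are shaded per frame, about a tenfold reduction, consistent with the order-of-magnitude claim.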
Eye-tracked foveated rendering was demonstrated in products such as the Meta Quest Pro (released in 2022) and the Sony PlayStation VR2 (released in 2023) headsets.[20]