Virtual Reality in 2023: All the Technical Stuff That We Expect to Go Mainstream This Year

If technology were akin to human evolution, just about now you would likely be warming up the boiler in your steam-powered tractor to trundle to the cotton mill, or some such. Human evolution is very slow, taking perhaps hundreds of generations just to lose the coccyx – the triangular bone of three to five fused segments that is the remnant of our tails. By comparison, technological change takes place at a dizzying rate, and what was new and novel last year is a bit ‘old hat’ right now. Nowhere is this clearer than in forward-looking technology such as computing, and with the relatively new kid on the block: VR and its various guises.

Despite being fairly new to the market, VR, AR, and XR are already firm favourites with a growing number of developers, who are pushing ever greater technological advances into ever smaller packages. From such an innovative field, we can expect to see a number of new and perfected systems this year, such as:

  • Eye tracking. This feature is both simple and complex at the same time. Eye tracking normally relies on continuously measuring the distance between the centre of the pupil and the corneal reflection, which varies with the angle of the eye. The reflection is produced by an infrared light that is undetectable to the human eye, and its movements are recorded and followed by cameras. Computer-vision algorithms can then work out the direction of gaze from the angle of each eye. However, in virtual reality the eyes do not always point exactly where the person is actually looking. In the real world we see something called “vergence”, in which the angles of the two eyes converge on the object being viewed. Using a VR headset equipped with eye tracking, you can give your avatar moving eyes in a simulated conference room, allowing for a more accurate representation of the user’s mood or point of attention. It is also helpful for determining whether the user is trying to look outside the normal confines of their field of view, so that the experience can display the desired area or move through the appropriate menu with just a few button presses. There is a simplified sketch of the underlying maths after this list. Expect eye tracking to be a much bigger feature on the next iteration of headsets.
  • Field of View (FoV). Basically, this describes how much of the virtual world the user can see in any given direction. A larger FoV in a VR headset lets the user see more of their surroundings, making the experience feel more real; with a small FoV, users are perceptually limited. Each headset has a fixed FoV, normally quoted in degrees. The current Oculus Quest 2, for example, offers an FoV of around 89 degrees. The Pico Neo 3 Pro is a little more generous at 98 degrees, while the VIVE Focus 3 offers around 120 degrees, though it is also one of the most expensive VR headsets currently available. Because the user’s visual field is concentrated directly in front of the eyes, the edges of the display are rendered with less fidelity. The race, however, is on to maximise the useful FoV while making all parts of the display equally sharp, for a more immersive experience; the short worked example after this list shows how much difference those extra degrees make.

  • Hand and body tracking. Currently, most VR systems rely on handheld controllers to tell the system where the user’s hands are, so that they can interact properly with the virtual tools, weapons, and sports equipment that make the VR experience really special. We know that Meta has been toying with hand tracking on the Oculus and has made the function available as a beta, but by using the onboard cameras, developers are hoping to take tracking several steps further. The current leader in this area is thought to be Apple. The Apple VR headset, when it launches, will likely function autonomously, following the user’s movements without requiring a complex external camera rig. Instead, cameras and other sensors such as LiDAR will allow the headset to “see” the wearer: inertial sensors such as gyroscopes track the headset’s own motion, while forward-facing LiDAR can map the user’s surroundings and detect objects in their path. Tracking the user’s hands is another job normally handled by controllers, and the rumour mill is divided on whether Apple will make one; it is possible the company has enough faith in the headset’s ability to detect hands without any controller at all. We know that other developers are also working on these kinds of systems, so it’s exciting stuff indeed.
  • Haptics. When we think of haptics and feedback in gaming, we tend to picture a worn body suit that gives small electric shocks to simulate bullet strikes. While that notion has its place in VR, haptics is a much more intricate subject, and many developers are looking at how feedback could be delivered in other ways. UK developers Foldaway Haptics are well advanced with a new force-feedback joystick for VR controllers, while Go Touch VR are busily building fingertip devices that give a real sense of touch. Make no mistake: these are going to be far beyond the controller rumble that promised white-finger to hordes of Xbox and PS users, and will hopefully let VR users actually feel the virtual world. For a long time now, VR has astounded with its graphics – which will be even better in the next iteration – but has lacked real feedback. The range of new devices planned for release is going to make the virtual experience significantly more exciting.
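To make the eye-tracking description above a little more concrete, here is a minimal Python sketch of the two ideas involved: estimating a gaze angle from the vector between the pupil centre and the infrared glint, and intersecting the two gaze rays to approximate the vergence point. The linear calibration mapping and the numbers used are purely illustrative assumptions, not how any particular headset actually works.

```python
import numpy as np

def gaze_angles(pupil_center, glint_center, gain_deg_per_px=0.1, offset_deg=0.0):
    """Toy pupil/corneal-reflection gaze estimate (hypothetical calibration values).

    The infrared glint stays roughly fixed on the cornea while the pupil centre
    moves with the eye, so the vector between them tracks the gaze angle.
    Both inputs are (x, y) positions in the eye-camera image, in pixels.
    Returns an approximate (yaw, pitch) in degrees.
    """
    delta = np.asarray(pupil_center, float) - np.asarray(glint_center, float)
    return gain_deg_per_px * delta + offset_deg

def vergence_point(left_origin, left_dir, right_origin, right_dir):
    """Midpoint of the shortest segment between the two gaze rays - a rough
    stand-in for the point the eyes are converging on."""
    p0, u = np.asarray(left_origin, float), np.asarray(left_dir, float)
    q0, v = np.asarray(right_origin, float), np.asarray(right_dir, float)
    u, v = u / np.linalg.norm(u), v / np.linalg.norm(v)
    w0 = p0 - q0
    a, b, c, d, e = u @ u, u @ v, v @ v, u @ w0, v @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:          # rays are (near-)parallel: no useful vergence
        return None
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return (p0 + s * u + q0 + t * v) / 2.0

# Example: eyes 64 mm apart, both angled slightly inward at a point ~0.5 m ahead.
print(vergence_point([-0.032, 0, 0], [0.064, 0, 1.0], [0.032, 0, 0], [-0.064, 0, 1.0]))
```

Real trackers use per-user calibration and full 3D eye models rather than a single linear mapping, but the geometry is essentially this.

To put the field-of-view numbers in perspective, this tiny sketch uses the basic relationship width = 2 × d × tan(FoV/2) to show how much horizontal scene each quoted FoV takes in at a fixed viewing distance. The FoV figures are the approximate ones mentioned above, and the two-metre distance is just an arbitrary example.

```python
import math

# Approximate horizontal FoV figures quoted in the text, in degrees.
headsets = {"Oculus Quest 2": 89, "Pico Neo 3 Pro": 98, "VIVE Focus 3": 120}

distance_m = 2.0  # looking at objects two metres away
for name, fov_deg in headsets.items():
    # Horizontal extent of the scene that fits within the FoV at that distance.
    visible_width = 2 * distance_m * math.tan(math.radians(fov_deg) / 2)
    print(f"{name}: ~{fov_deg}° FoV covers about {visible_width:.1f} m at {distance_m} m")
```

Going from roughly 89 to 120 degrees nearly doubles the width of the scene in view, which is why FoV is such a hard-fought specification.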
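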

These features, together with others such as spatial audio and true mixed reality (MR) via onboard cameras and LiDAR, are going to be the basis of VR systems whose content becomes increasingly lifelike. If game and app developers can use these features to the full, it is going to be a very exciting time for virtual reality.