Insights From The Blog

New AR Foundation Features in Unity

Augmented Reality (AR) is fast becoming the technology to watch in future applications. While VR is great at creating realistic worlds to immerse ourselves in, AR builds on the real world and allows us to experience more of what we can already see.

Its sheer power and potential to impact our lives on so many levels make it highly valuable. So valuable, in fact, that from its current worth of around $10 billion, the AR market is estimated to reach a hefty global value of around $198 billion by 2025. Key to this worth is AR’s ability to superimpose content on the real world; in that way it becomes useful to many, many industries, rather than just being a gaming or educational tool.

Unity is at the forefront of AR development with AR Foundation, a framework that lets you build cross-platform AR applications in the Unity engine.

AR Foundation is a standard part of Unity and is well documented, but it is also continually being upgraded to make your workflow even easier and the end results more powerful. It supports many devices, and is now compatible with the new iPad Pro running ARKit 4.

The latest Apple iPad Pro comes fully equipped with a light detection and ranging (LiDAR) scanner that provides enhanced scene understanding and real-world depth information. LiDAR is a method for measuring distances by illuminating the target with laser light and measuring the reflection with a sensor. Differences in light return times and wavelengths can then be used to make digital 3D representations of the target.
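At its core, the distance measurement is a simple time-of-flight calculation: the sensor times how long a light pulse takes to return, and the pulse covers the distance twice. A minimal sketch in plain C# (illustrative arithmetic only, not any LiDAR API):

```csharp
using System;

class TimeOfFlight
{
    // Speed of light in metres per second.
    const double SpeedOfLight = 299_792_458.0;

    // The pulse travels out to the target and back,
    // so the one-way distance is half the round trip.
    static double DistanceMetres(double roundTripSeconds) =>
        SpeedOfLight * roundTripSeconds / 2.0;

    static void Main()
    {
        // A pulse returning after ~33 nanoseconds corresponds to
        // a surface roughly 5 metres away.
        Console.WriteLine(DistanceMetres(33.35e-9));
    }
}
```

This also shows why LiDAR timing is demanding: resolving centimetres requires timing differences on the order of tens of picoseconds.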

This system can bring a whole new level of realism to augmented reality experiences by establishing information about the real elements of the scene in front of it, and then superimposing virtual data on top of them in a seamless way. The AR Foundation 4.1 package has now been expanded with new features from ARKit 3.5 and ARKit 4 that exploit this new depth data and scene geometry. New features in the latest release include:

  • The AR Occlusion Manager, which incorporates depth information when rendering the background and takes account of real-world distances when rendering AR objects. Virtual content will inevitably overlap with real objects, and the Occlusion Manager handles this by determining where those real elements are and rendering accordingly. When virtual content is closer to the camera than real-world content, the virtual objects are rendered in front of the real world. Likewise, when the virtual content is farther away, behind real-world objects, it is not rendered, leaving the real element in front. This depth-aware rendering makes for much more realistic AR scenes.
  • An enhanced depth image function, which associates a distance with each pixel and exposes this raw data for additional CPU-based processing and rendering. A depth image gives you the “depth”, or “z” coordinate, of each point in the real world: the intensity values in the image represent the distance of the surface from the viewpoint, so the software can make sense of its position in the real world and build virtual content around it.
  • The scene reconstruction utility, which scans the environment in the camera view to create mesh geometry representing the real-world environment. Virtual elements can then be overlaid on, and made to interact with, the shapes and positions of existing real objects.
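To see how the first two features surface in code, here is a rough sketch of reading environment depth through AR Foundation 4.1’s AROcclusionManager. The component drives depth-based occlusion automatically once enabled; this hypothetical `DepthReader` script additionally pulls the raw depth image onto the CPU (exact property and method names may vary between package versions):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Attach alongside the AR camera's AROcclusionManager.
[RequireComponent(typeof(AROcclusionManager))]
public class DepthReader : MonoBehaviour
{
    AROcclusionManager occlusionManager;

    void Start()
    {
        occlusionManager = GetComponent<AROcclusionManager>();
        // Ask the provider (ARKit on a LiDAR-equipped iPad Pro)
        // for the best available environment depth.
        occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;
    }

    void Update()
    {
        // Each pixel of the depth image encodes the distance (in metres)
        // from the device to the real-world surface at that pixel.
        if (occlusionManager.TryAcquireEnvironmentDepthCpuImage(out XRCpuImage depthImage))
        {
            using (depthImage)
            {
                Debug.Log($"Depth image: {depthImage.width}x{depthImage.height}");
                // Sample or convert depthImage here for custom CPU-side logic,
                // e.g. placing content at a tapped pixel's real-world depth.
            }
        }
    }
}
```

With the manager enabled, occlusion itself requires no extra code: virtual objects behind real surfaces are simply clipped by the depth data.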
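Scene reconstruction is exposed through the ARMeshManager component. The sketch below (an illustrative `MeshLogger` script, assuming an ARMeshManager under the AR session origin with a mesh prefab assigned in the Inspector) listens for mesh chunks as LiDAR scans the room; give the prefab a MeshCollider and virtual objects can physically collide with real surfaces:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Attach next to an ARMeshManager. As the environment is scanned,
// the manager spawns and updates mesh chunks that approximate
// real-world surfaces.
[RequireComponent(typeof(ARMeshManager))]
public class MeshLogger : MonoBehaviour
{
    void OnEnable()
    {
        GetComponent<ARMeshManager>().meshesChanged += OnMeshesChanged;
    }

    void OnDisable()
    {
        GetComponent<ARMeshManager>().meshesChanged -= OnMeshesChanged;
    }

    void OnMeshesChanged(ARMeshesChangedEventArgs args)
    {
        // added/updated/removed are lists of MeshFilter chunks.
        Debug.Log($"Meshes added: {args.added.Count}, " +
                  $"updated: {args.updated.Count}, removed: {args.removed.Count}");
    }
}
```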

The latest AR Foundation 4.1 release is taking the development of AR experiences to a whole new level, and here at Unity Developers we are exploring its impressive capabilities, so stay tuned for more on this exciting range of features.