AR Foundation and MARS work together in Unity

Unity is an excellent platform for building deeply interactive and intelligent augmented reality (AR) games and apps, and things are looking even brighter with the recently announced Mixed and Augmented Reality Studio (MARS) feature.

The MARS system gives app and game developers the power to build AR experiences that fully integrate with the real world. With the systems complementing each other, it is possible to build AR apps that are context-aware and responsive to physical space, working in any location and with any kind of data imaginable. AR just got real!

Any developer will tell you that perhaps the most challenging aspect of constructing AR systems is overlaying the virtual data onto the real world in a seamless and believable way. In the past, to do this, the developer had to build a complete code model, test it in the real world, tweak it, and repeat as many times as necessary until the model actually looked authentic. That made it a time-consuming, labour-intensive process that demanded a lot of skill, which not only restricted who could do it but also made it an unappealing task, and programming should never be either of those.

MARS works directly and in tandem with the Unity AR Foundation tool, so that they complement each other. The MARS tool takes data from your input device – an Android or Apple mobile device, a HoloLens or Magic Leap head-mounted display, and so on – and quickly converts it into a virtual panorama onto which the AR content is overlaid by the AR Foundation tool. Think of MARS as the inputs for the XR element and the AR Foundation tool as the outputs for those same XR experiences.
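To make that division of labour concrete, here is a minimal sketch of the AR Foundation side of the handshake: a plain C# script (no MARS required) that checks whether the device can run an AR session before enabling it. The class name and the serialized session field are illustrative, not part of either package.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ArAvailabilityCheck : MonoBehaviour
{
    [SerializeField] ARSession session;

    IEnumerator Start()
    {
        // Ask the platform (ARCore, ARKit, etc.) whether AR is supported.
        yield return ARSession.CheckAvailability();

        if (ARSession.state == ARSessionState.Unsupported)
        {
            Debug.Log("AR is not supported on this device.");
            yield break;
        }

        // Enable the session; AR Foundation now starts feeding tracking
        // data (camera pose, planes, point clouds) to the app.
        session.enabled = true;
    }
}
```

Once the session is running, managers such as ARPlaneManager stream detected surfaces into the scene, which is exactly the kind of real-world data MARS reasons about.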

MARS has a real-time editor called Simulation View that provides a WYSIWYG development interface, allowing you to change where you place objects and how those objects interact with the real world. The MARS system is built on top of the AR Foundation layer and supports horizontal and vertical surface-plane detection, so you can give your objects parameters such as when to appear and what type, height, or size of surface to appear on. Then, if a user’s AR device detects planes or surfaces with matching characteristics – height, size, shape, location – your objects will automatically appear for them. It’s as simple as that.
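MARS expresses those rules through its own proxy and condition components, but the underlying idea can be approximated in plain AR Foundation code. The sketch below is a rough stand-in rather than the MARS API itself: it listens for newly detected planes and only spawns content on upward-facing surfaces wider than a hypothetical minimum width.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class SurfaceConditionSpawner : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;  // supplies detected planes
    [SerializeField] GameObject contentPrefab;     // the object to place
    [SerializeField] float minSurfaceWidth = 0.5f; // metres; hypothetical threshold

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
        {
            // Only react to upward-facing horizontal surfaces that are wide
            // enough, roughly analogous to a MARS proxy's plane conditions.
            bool horizontal = plane.alignment == PlaneAlignment.HorizontalUp;
            bool bigEnough  = plane.size.x >= minSurfaceWidth;

            if (horizontal && bigEnough)
                Instantiate(contentPrefab, plane.center, Quaternion.identity);
        }
    }
}
```

In MARS itself you would express the same constraints visually, by adding conditions to a proxy object, and Simulation View would show you immediately which simulated surfaces match.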

For instance, if a developer has built a faithful AR version of a Ford car engine and the user points a device running the enabled software at the corresponding car’s bonnet, then the engine components – to whatever level of detail has been created in the software – will appear overlaid on the bonnet. Create a realistic AR version of a PCB, and the correct components will appear when the device is pointed at the real PCB, making it a genuinely useful tool for carrying out checks. And it’s not just specific objects: if you build a game or app based on a flat surface, you can point your device at any flat surface and have the game or app appear there.
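Anchoring content to a specific real object like a bonnet or a PCB is typically done with image tracking. The hedged sketch below uses AR Foundation's ARTrackedImageManager and assumes you have already configured a reference image library in the editor; the EngineOverlay class name and engineModelPrefab field are invented for illustration.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class EngineOverlay : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager imageManager; // needs a reference image library
    [SerializeField] GameObject engineModelPrefab;       // hypothetical AR engine model

    void OnEnable()  => imageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => imageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var trackedImage in args.added)
        {
            // Parent the virtual engine to the recognised image (e.g. a
            // reference photo of the bonnet) so the overlay follows the
            // real object as the device moves.
            Instantiate(engineModelPrefab, trackedImage.transform);
        }
    }
}
```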

The AR Foundation tool is a recent addition to the Unity kit, and MARS is still in beta testing and has not yet been released to the general public, but when these two pieces are finally released as full products, AR is going to get very interesting indeed.