Oculus is Expanding Quest Mixed Reality Capabilities with Enhanced Developer Tools
While Meta originally developed the Quest headset as a VR device, it plainly has far more potential. We understand that the company is now looking to expand its capabilities into mixed reality (MR) too, and the Quest seems like the perfect product to demonstrate it.
On the face of it, mixed reality is easily confused with augmented reality, but there is more than a subtle difference between them. AR simply overlays virtual content on real-world scenes, like Snapchat or Facebook (Meta) face filters in camera shots, or Pokémon Go characters. Mixed reality, by contrast, places virtual content into the real world that the user can interact with and that responds to the physical environment. In short, AR content is passive, while MR content is usable.
Meta first unlocked the Quest’s mixed reality features last year via the Passthrough API, which enabled developers to access the headset’s pass-through video view for the first time. A simple double-tap on the front of the headset switched the view from VR content to a grainy real-world view. It was seen as a quick way for a user to check their surroundings, but it demonstrated that the built-in camera system had potential for much more. Now, Meta is launching a more advanced set of tools, dubbed the Presence Platform, that will enable developers to create more sophisticated mixed reality apps.
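For Unity developers, wiring the pass-through view into a project is fairly lightweight. The sketch below assumes the Oculus Integration package’s OVRPassthroughLayer component and OVRInput button mapping (names and setup steps may differ between SDK releases), and simply toggles the camera view from a controller button:

```csharp
// Minimal sketch: toggling the Quest's Passthrough view from a Unity script.
// Assumes the Oculus Integration package is installed, Passthrough is enabled on
// OVRManager, and an OVRPassthroughLayer component sits on the OVRCameraRig.
// Component and property names reflect that SDK and may vary by version.
using UnityEngine;

public class PassthroughToggle : MonoBehaviour
{
    [SerializeField] private OVRPassthroughLayer passthroughLayer;

    private void Update()
    {
        // Switch between the VR scene and the real-world camera view on the A button.
        if (OVRInput.GetDown(OVRInput.Button.One))
        {
            passthroughLayer.hidden = !passthroughLayer.hidden;
        }
    }
}
```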
Meta unveiled the new Presence Platform at Connect 2021 as part of its commitment to helping developers generate new content. The Presence Platform seeks to unify all of the development tools required to create augmented, mixed, and virtual reality experiences for Meta Quest. The developer toolkit comprises three main tools:
- Insight SDK. Its primary building block is the Passthrough functionality, which developers previously had access to only in an experimental state. This functionality will transition from experimental to general availability with the next developer release and will allow developers to employ ‘spatial anchors’ to place and lock virtual objects so that they appear in the same position at every start-up (see the spatial anchor sketch after this list). Furthermore, the Insight SDK includes a feature called ‘Scene Understanding’, which adds spatial awareness of the local environment and anything in it. Scene Understanding will be able to identify features such as walls, doors and ceilings, as well as furniture placements, and work around them.
- Interaction SDK. This feature will provide Unity developers with a pre-built collection of basic hand and controller interactions, such as poking buttons, gripping items, aiming, and selecting parts of the local virtual environment (an illustrative hand-input sketch also follows the list). The Interaction SDK’s objective is to provide standardised interaction patterns and to monitor the performance of particular interactions. The aim is that the technology will make it simpler for developers to create their own interactions and motions to be displayed through the headset.
- Voice SDK. As you might expect, the Voice SDK will make speech control available to Quest developers, which Oculus claims can power basic navigation, such as instantly launching your favourite Beat Saber song, as well as gameplay elements, such as casting a voice-activated spell inside an app (a voice-command sketch follows the list). The system is built on Meta’s open-source Wit.ai natural language framework.
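To give a flavour of the Insight SDK’s spatial anchor idea, here is a minimal sketch. It assumes the Oculus Integration exposes an OVRSpatialAnchor component that can be attached to a GameObject to lock it to a real-world position; the component name and workflow are taken from that SDK and may vary between releases:

```csharp
// Minimal sketch: pinning a virtual object to a fixed real-world position with a
// spatial anchor. Assumes the Oculus Integration's OVRSpatialAnchor component;
// anchor creation is asynchronous, and saving the anchor for persistence across
// sessions is a further SDK step not shown here.
using UnityEngine;

public class AnchoredObjectSpawner : MonoBehaviour
{
    [SerializeField] private GameObject anchoredPrefab;

    public void PlaceAnchor(Vector3 position, Quaternion rotation)
    {
        // Instantiate the object where the user pointed, then attach an anchor so
        // the headset keeps it locked to that real-world spot.
        GameObject instance = Instantiate(anchoredPrefab, position, rotation);
        instance.AddComponent<OVRSpatialAnchor>();
    }
}
```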
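The Interaction SDK itself is assembled largely from prefabs and components inside the Unity editor, so rather than guess at its component names, the sketch below illustrates the kind of hand input it standardises using the existing OVRHand pinch query from the Oculus Integration. This is a lower-level API used here for illustration, not the Interaction SDK proper:

```csharp
// Minimal sketch: detecting an index-finger pinch with the Oculus Integration's
// OVRHand component and treating it as a simple "grab" trigger. This illustrates
// the sort of gesture the Interaction SDK packages up; it is not its API.
using UnityEngine;

public class PinchGrabExample : MonoBehaviour
{
    [SerializeField] private OVRHand hand;

    private void Update()
    {
        if (hand == null || !hand.IsTracked)
        {
            return;
        }

        // True while the index finger and thumb are pinched together.
        if (hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            Debug.Log("Pinch detected - grab the object under the hand here.");
        }
    }
}
```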
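And for the Voice SDK, the sketch below shows how a voice command might launch an in-game action. It assumes the Voice SDK’s AppVoiceExperience component (built on Wit.ai); the component name, the Activate() call, and the transcription event are taken from that SDK and should be checked against the release you install, and the “fireball” phrase is purely illustrative:

```csharp
// Minimal sketch: casting a spell from a spoken phrase. Assumes the Oculus Voice
// SDK's AppVoiceExperience component and its full-transcription event; names may
// differ between SDK versions.
using Oculus.Voice;
using UnityEngine;

public class SpellCaster : MonoBehaviour
{
    [SerializeField] private AppVoiceExperience voiceExperience;
    [SerializeField] private GameObject fireballPrefab;

    private void OnEnable()
    {
        // Start listening and react when a recognised phrase arrives.
        voiceExperience.VoiceEvents.OnFullTranscription.AddListener(OnTranscription);
        voiceExperience.Activate();
    }

    private void OnDisable()
    {
        voiceExperience.VoiceEvents.OnFullTranscription.RemoveListener(OnTranscription);
    }

    private void OnTranscription(string transcription)
    {
        // "fireball" is an illustrative voice command, not a built-in intent.
        if (transcription.ToLowerInvariant().Contains("fireball"))
        {
            Instantiate(fireballPrefab, transform.position, transform.rotation);
        }
    }
}
```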
All in all, this set of developer tools will offer serious creators the real possibility of designing MR experiences within the Oculus framework. Key to this development is that the Quest is already well established as a hardware device, so users will be able to experience a growing number of MR projects via the pass-through feature, and it is likely to be the next step in the company’s move towards a full-blown Metaverse. Detailed documentation for all of the developer tools, capabilities, and example projects is one of the primary advantages creators can expect at launch. Intriguingly, Meta has developed a sample experience dubbed The World Beyond, which should be available in the next few weeks. Developers can use this example as a starting point for creating their own bespoke MR experiences. So, with the tools and a raft of background documentation available, it shouldn’t take developers long to get up and running with the system.
We at Unity Developers are very excited about this latest set of developer tools and can’t wait to start using them. Keep checking back to see how we get on.