Insights From The Blog

Cross-Platform XR Development: Challenges and Solutions

One of the big issues facing developers today is releasing a shiny new game or app on multiple devices. Consumers see upcoming releases in the popular press and instantly hope they will be available on their preferred platforms. Releasing a game or app on multiple platforms is typically essential for financial success; however, ensuring compatibility can be a nightmare for developers.

In this article, we will explore the best strategies for developing cross-platform applications and how to optimise your designs for maximum compatibility across all XR devices.

The Problem of Cross-Platform Compatibility

In its most basic form, cross-platform XR ensures that immersive experiences are accessible on a diverse range of devices, including desktops, mobile phones, VR/XR headsets, and AR eyewear. The idea is that the content remains consistent and synchronised, regardless of whether you are accessing it via a smartphone or a high-fidelity Meta Quest 3 headset, so that the play experience is the same on every device it runs on.

The term “cross-platform XR development” refers to the process of creating an app with a single codebase that can be used on several devices and platforms. Unity, together with its XR Interaction Toolkit, is one of the key game engines enabling this, alongside standards like OpenXR, which abstract device-specific inputs to guarantee hardware compatibility. By deploying the experience from a single project, developers can reach a bigger audience with far less duplicated effort.

OpenXR is a royalty-free, open standard for the development of virtual and augmented reality applications compatible with diverse hardware and platforms. It offers a unified set of APIs, streamlining development by enabling developers to write code once and execute it across various devices without the necessity for individual platform adaptations. This decreases development time and expenses while expanding the market for device manufacturers that embrace the standard.

The OpenXR specification is an essential component of the mixed reality ecosystem. Previously, application developers were required to create applications for each XR headset individually using proprietary APIs; now, they can develop for several devices simultaneously. OpenXR enables the rapid retargeting of programs for new platforms or allows them to operate on new systems without modification. This reduces the barrier to entry and facilitates the integration of new developers and concepts that will contribute to the ubiquity of mixed reality in our lives.

However, consistency has never been more crucial. In an information-abundant yet experience-deficient society, the genuine allure of XR resides in its capacity to unite individuals. Users are increasingly inclined to discuss their online experiences, regardless of the platforms they utilise. As we progress towards a comprehensive and inclusive Metaverse, shared experiences will become increasingly prevalent and significant.   

Multiplayer XR experiences create communal environments that facilitate real-time interaction among individuals, such as a virtual concert where VR and AR participants observe identical digital performers, or a worldwide team collaborating on a unified 3D prototype situated in physical space. This type of contact creates new opportunities for entertainment, collaboration, and community development. However, users expect these experiences to be seamless, which makes it essential for different devices to work together. But what does that look like in practice?

Building Cross-Platform Experiences

Currently, the XR environment is fragmented and non-cohesive. Devices such as the Meta Quest, Apple Vision Pro, HoloLens, and other mobile devices have distinct hardware capabilities, operating systems, and input modalities. This variation presents a challenge: how can developers deliver consistent, high-quality experiences without creating different apps for each platform? For organisations looking to employ an AR/VR developer, cross-platform solutions are critical to maximising reach and efficiency.

Cross-platform integration has been a hugely important part of Unity since its early development, and because of that, the engine is especially good at it. Unity provides a robust engine for augmented and virtual reality development, accommodating a diverse array of platforms, including Oculus, HoloLens, ARKit (iOS), ARCore (Android), and WebXR for browser-based applications. Let’s have a look at some of the specifics of developing cross-platform experiences in this popular software.

Cross-Platform Development in Unity

Unity’s impressive XR Interaction Toolkit is an essential part of developing for multiple platforms, and it makes the job considerably easier. Any company that develops games in Unity can benefit from its uniform API for handling inputs, interactions, and movement across AR/VR devices. With Unity, you can install the toolkit straight from the Package Manager and use pre-built components, like XR Controller and XR Grab Interactable, that abstract device-specific inputs to ensure compatibility across platforms.
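To make this concrete, here is a minimal sketch of how an object can be made grabbable with the XR Interaction Toolkit. It assumes the toolkit package is installed and that the scene already contains an XR Origin with interactors; the class name `MakeGrabbable` is just an illustrative choice.

```csharp
// A minimal sketch, assuming the XR Interaction Toolkit package is installed
// and an XR Origin with interactors already exists in the scene.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class MakeGrabbable : MonoBehaviour
{
    void Awake()
    {
        // XRGrabInteractable behaves the same whether the interactor is a
        // Quest Touch controller, a Vive wand, or tracked hands: the toolkit
        // abstracts the device-specific input behind a common interaction layer.
        var grab = gameObject.AddComponent<XRGrabInteractable>();
        grab.movementType = XRBaseInteractable.MovementType.VelocityTracking;

        // React to interaction events rather than polling device buttons.
        grab.selectEntered.AddListener(_ => Debug.Log("Object grabbed"));
        grab.selectExited.AddListener(_ => Debug.Log("Object released"));
    }
}
```

Because the event-driven interaction layer replaces per-device button polling, the same script runs unchanged on any platform the toolkit supports.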

With Unity, cross-platform development enables developers to create an application that, when targeted at either the Meta or SteamVR platform, functions seamlessly with minimal additional effort. Cross-platform input support typically includes a 6DOF HMD (a Head Mounted Display tracked in six degrees of freedom) and controllers, such as the Oculus Rift S and Touch controllers, the HTC Vive and its controllers, and the Windows Mixed Reality headset with motion controllers.

OpenXR support ships as a plug-in for Unity and can be enabled at any point in your project. To enable OpenXR in your project, follow the steps below:
1. Open the Project Settings window (menu: Edit > Project Settings), and select XR Plug-in Management.
2. Enable the OpenXR option and any Feature Groups for the runtimes you intend to target.
3. In the OpenXR > Features tab, select the interaction profile of the device you are testing with.
4. In the OpenXR tab, verify that the current active runtime is set to the hardware you are testing with.
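Once the steps above are complete, it can be useful to confirm at runtime that XR actually initialised and that the OpenXR loader is the one in use. The sketch below assumes the XR Plug-in Management package is installed; the class name `XrStartupCheck` is an illustrative choice.

```csharp
// A hedged sketch: checking which XR loader is active at startup.
// Assumes the XR Plug-in Management package is installed.
using UnityEngine;
using UnityEngine.XR.Management;

public class XrStartupCheck : MonoBehaviour
{
    void Start()
    {
        var manager = XRGeneralSettings.Instance != null
            ? XRGeneralSettings.Instance.Manager
            : null;

        if (manager == null || manager.activeLoader == null)
        {
            Debug.LogWarning("No XR loader active - check XR Plug-in Management settings.");
            return;
        }

        // With OpenXR enabled in step 2 above, the active loader
        // should be the OpenXR loader.
        Debug.Log($"Active XR loader: {manager.activeLoader.name}");
    }
}
```

Logging this once at startup makes misconfigured builds (for example, a target platform with no loader enabled) much easier to diagnose.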

Unity’s Features system enables third parties to augment Unity’s foundational support for OpenXR. Features integrate the capabilities of OpenXR specification extensions into the Unity ecosystem, although Unity itself is not involved in their development.

Features are collections of Unity assets that can be distributed via the Package Manager, the Asset Store, or other channels. If you are unsure how the OpenXR package will behave with your application, you can use a mock OpenXR environment to test your own features and any low-level code that calls native OpenXR functions.

Alternatively, if you find developing in Unity challenging, consider entrusting your project to our team of professional developers to alleviate your concerns. Contact us and let’s discuss your project.