
XR Experiences Are Becoming Easier to Build and Share



Back in September, I attended Oculus Connect 6 (OC6) with Mike Harris, Sr. Immersive Tech Developer at CrossComm. Mike wrote an article about some of the biggest announcements from the conference, and I wanted to follow it up with some takeaways of my own.

As Mike mentioned in his article, Facebook is clearly looking to expand the audience for virtual reality, as evidenced by its announcements around social and business uses of the technology. But XR (an umbrella term that encompasses both AR and VR) is also becoming easier to create and more accessible to those who want to build their own experiences.

Outside OC6

Creating XR Just Got a Whole Lot Easier

Unity XR Platform Architecture

In one of the sessions we attended at OC6, J.J. Hoesing, a senior XR engineer at Unity, discussed how Unity has reworked its platform to be more flexible for developers. Here are some of the changes.

Unity XR Integration

Previously, all of Unity’s XR platform integrations were compiled into the engine. This was convenient at the time, because everything was available in one place as soon as the engine was downloaded. The problem with this approach, however, was that it wasn’t very flexible: platform-specific code could only change with an engine release, so developers had to wait for a Unity version update to get the latest version of a platform’s software development kit (SDK). Now, Unity has introduced a subsystem-based plugin approach that keeps platform-specific XR code outside of the engine in packages, allowing those packages to be updated freely.
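To give a sense of what this looks like in practice, here is a minimal C# sketch (ours, not from the talk) that enumerates the XR display plugins a project has installed, using Unity's SubsystemManager API:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Lists whatever XR display plugins are installed as packages. Under the
// old model this set was fixed at engine-build time; under the plugin
// model it simply reflects the packages added to the project.
public class XRPluginProbe : MonoBehaviour
{
    void Start()
    {
        // Descriptors describe the plugins available to the project.
        var descriptors = new List<XRDisplaySubsystemDescriptor>();
        SubsystemManager.GetSubsystemDescriptors(descriptors);
        foreach (var descriptor in descriptors)
            Debug.Log($"XR display plugin available: {descriptor.id}");

        // Instances are the subsystems actually created at runtime.
        var displays = new List<XRDisplaySubsystem>();
        SubsystemManager.GetInstances(displays);
        foreach (var display in displays)
            Debug.Log($"XR display subsystem running: {display.running}");
    }
}
```

Because each plugin lives in its own package, the list above changes just by adding or removing packages in the Package Manager, with no engine update required.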

Unity XR Interaction

The Unity XR Interaction framework is another package added through the Package Manager. The framework handles cross-platform input, object interaction (distance grab, throw), haptic feedback, canvas UI interaction, teleportation, and a couple of VR camera rigs. Users can add interaction components to their game objects through the XR Helpers menu, which makes it possible to add interactions to a scene without writing any code (the sketch below shows roughly what that setup looks like in code). The package also comes with a handy interaction debugger that shows which game objects are interactable and lets you view their state.
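As a rough illustration of what the XR Helpers menu wires up behind the scenes, the hypothetical component below makes an object grabbable from code using the toolkit's XRGrabInteractable; the menu performs an equivalent setup in the editor without any of this:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative only: an interactable needs a collider plus an
// XRGrabInteractable, which handles grab, throw-on-release, and the
// hooks for haptic feedback.
public class MakeGrabbable : MonoBehaviour
{
    void Start()
    {
        // A collider is required so interactors can hit the object.
        if (GetComponent<Collider>() == null)
            gameObject.AddComponent<BoxCollider>();

        // XRGrabInteractable requires a Rigidbody, which Unity adds
        // automatically here via RequireComponent.
        var grab = gameObject.AddComponent<XRGrabInteractable>();
        grab.throwOnDetach = true; // keep velocity on release so objects can be thrown
    }
}
```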

Spark AR

Spark AR Studio is a free engine from Facebook for building AR experiences, designed for people with little to no programming experience. Most Spark AR experiences can be made without any programming at all, as long as you have the art assets the experience needs. For more advanced experiences there is an easy-to-learn visual scripting tool, and for the programmers out there, you can get into even greater detail using JavaScript.

Spark AR has various features (like face tracking, plane detection, and background segmentation) that can be activated through a simple drag-and-drop mechanic. The main limitation is that the experience needs to be pretty lightweight (around 20 MB), and a maximum of 50 in-scene objects is recommended for the best performance, so the assets you use need to be optimized.

Spark AR also has a built-in material editor for dialing in the effects you need. Once the experience is complete, you can test it directly on your phone using the camera inside Facebook or Instagram; no additional steps are needed. Once ready, it can be published as a camera filter for either of those apps.

Sharing VR is About to Get a Whole Lot Easier Too

Creating mixed reality videos for the Quest has also become pretty straightforward. Oculus has created plugins for OBS (Open Broadcaster Software) and for game engines to make capturing mixed reality video easy. Connecting OBS to the Quest requires a Mixed Reality Capture (MRC) plugin that lets OBS receive audio and video from the headset. After setting up both the physical camera that records the video in OBS and the MRC plugin, the game or app content running on the Quest shows up as a mixed reality video feed in OBS. For capturing content within an engine like Unity or Unreal, the MRC plugin is already included in the latest Oculus package.
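For the Unity side, here is a hedged sketch of what enabling capture from script might look like. It assumes the Oculus Integration's OVRManager component; the enableMixedReality flag shown comes from older (Rift-era) versions of OVRManager and may be named or configured differently in current packages (on Quest, MRC is often driven externally by the OBS plugin instead), so treat this as illustrative rather than the definitive setup:

```csharp
using UnityEngine;

// A sketch, not a verified recipe: flips on mixed reality capture so an
// external compositor (e.g. OBS with the MRC plugin) can receive the feed.
// enableMixedReality is an assumed field name from older Oculus
// Integration versions; verify against your package before relying on it.
public class MixedRealityCaptureToggle : MonoBehaviour
{
    void Start()
    {
        var manager = FindObjectOfType<OVRManager>();
        if (manager != null)
        {
            manager.enableMixedReality = true;
        }
    }
}
```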

A persistent challenge for MRC is that it still requires a green screen to properly overlay the player onto the virtual world, though we believe it is only a matter of time before the use of machine learning to segment a person out of a scene (like ARKit’s people occlusion capability) becomes more widespread.

Creating XR content is becoming easier every day, with new engines and drag-and-drop packages being introduced. Even users with little to no programming background can now create simple experiences using tools like Spark AR and Unity. XR is now being used in fields ranging from construction to medicine, and we can expect a future where the lines between physical and digital continue to blur. Granted, these entry-level tools and drag-and-drop interfaces are no match for the sophisticated tooling and developer know-how needed to serve those industries well. But with fewer barriers to entry, more innovative minds can begin shaping what the exciting future of XR looks like.