📰 What’s New in Meta XR SDK v85 | Start Mentor Workshop
Hosted by Quentin Valembois (Valem) • March 2026 • Meta Horizon Start
Overview (what v85 focuses on)
Meta Horizon OS v85 includes updates across:
- Building Blocks (including Multiplayer + AI Building Blocks)
- Mixed Reality (new Spatial Test Framework and Scene-less MR support)
- Locomotion updates (input mapping + revised control scheme)
- What’s coming next (FrameSync)
Watch this part: 00:00
Building Blocks (Unity): faster setup, less boilerplate
Building Blocks are modular, drag‑and‑drop capabilities for Unity projects that can automatically configure project settings and required components (e.g., passthrough, hand tracking, MR setup, etc.). The intent is to let you focus on the code that’s unique to your experience.
What to take away:
- Building Blocks = the “fast path” for adding platform features correctly.
- They can handle annoying setup details (settings, manifests, component wiring).
Watch this part: 01:07
Multiplayer Building Blocks: Photon Fusion 2.1 support
What’s new in v85
Multiplayer Building Blocks now support Photon Fusion 2.1, unlocking newer Fusion features for Start devs who choose Fusion as their networking provider.
Fusion 2.1 highlights:
- Forecast Physics
- Object Send Priority
- Large Data RPCs (Fusion previously had a small per-RPC payload limit; Large Data RPCs expand what you can send in a single call)
- Custom Tick Rates
- Faster Host / Master Client Switching
- Configurable AOI (Area of Interest)
- Player Unique ID
When you drop a Multiplayer Building Block (e.g., auto matchmaking), you can choose a networking provider:
- Unity Netcode for GameObjects, or
- Photon Fusion (now v2.1 supported)
Also, voice chat support in the Building Blocks flow is tied to Fusion in the demonstrated setup.
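As a rough illustration of what Large Data RPCs unlock, here is a minimal Fusion-style sketch. This assumes Photon Fusion 2.1's `NetworkBehaviour` and `[Rpc]` attribute; the class name, method name, and payload are made up for the example, so check the Fusion 2.1 docs for exact limits and recommended usage.

```csharp
// Sketch only: assumes Photon Fusion 2.1. Names below are illustrative.
using Fusion;

public class BlobSync : NetworkBehaviour
{
    // With Fusion 2.1's Large Data RPCs, payloads that previously exceeded
    // the per-RPC size limit can be sent as one call instead of being
    // manually chunked by the game code.
    [Rpc(RpcSources.StateAuthority, RpcTargets.All)]
    public void RPC_ReceiveSnapshot(byte[] snapshot)
    {
        // Deserialize and apply the snapshot locally here.
    }
}
```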
Watch this part: 02:19
AI Building Blocks updates: more accurate boxes + image segmentation
What’s new in v85
AI Building Blocks improvements:
- More accurate 3D bounding boxes for object detection overlays
- 2D Image Segmentation option appearing in the tooling
Watch this part: 06:45
“Scene-less Mixed Reality”: why it matters
The problem with relying only on Scene Model/room setup
Scene Model (room scan) is powerful, but can be:
- Overkill for simple MR interactions (e.g., “place one object on a surface”)
- Not runtime-updated (doesn’t naturally account for small moving objects in the moment)
The v85 direction: more seamless MR workflows
The workshop emphasizes using depth-powered environment raycasting to collide against real geometry without requiring a full scene model workflow.
Key concept: Environment Raycast uses depth sensing to “hit test” the real world, enabling placement and interaction without scene model dependency.
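The hit-test idea above can be sketched in Unity roughly as follows. This assumes the Meta XR Core SDK's environment raycast component (`EnvironmentRaycastManager`); the exact type and method names may differ by SDK version, so treat this as a sketch and verify against the v85 API reference.

```csharp
// Sketch only: assumes the Meta XR Core SDK depth-based raycast API.
// Verify EnvironmentRaycastManager / EnvironmentRaycastHit names for your SDK version.
using Meta.XR;
using UnityEngine;

public class SurfacePlacer : MonoBehaviour
{
    [SerializeField] private EnvironmentRaycastManager raycastManager;
    [SerializeField] private Transform objectToPlace;

    void Update()
    {
        // Cast from the headset camera into the real world using depth
        // sensing; no Scene Model / room scan is required for this hit test.
        var cam = Camera.main.transform;
        var ray = new Ray(cam.position, cam.forward);

        if (raycastManager.Raycast(ray, out EnvironmentRaycastHit hit))
        {
            // Snap the virtual object onto the real surface that was hit.
            objectToPlace.position = hit.point;
            objectToPlace.up = hit.normal;
        }
    }
}
```

This is the "place one object on a surface" case from above: a single depth-powered raycast replaces the full room-scan workflow for simple placement.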
Watch this part: 08:27
What’s new for Scene-less MR in v85: Scene-less World Lock
World Lock (and what changed)
World Lock keeps virtual content stable in physical space even when the user recenters. Previously, this was tied to scene model workflows.
What’s new in v85
Scene-less World Lock: world-lock behavior that works without requiring a scene model.
Why it’s a big deal: It moves MR closer to “drop in/runtime MR” without asking users to preconfigure a full room scan for basic anchoring behavior.
Watch this part: 12:11
Debugging MR got a major upgrade: Spatial Test Framework
Before: Immersive Debugger
The Immersive Debugger can help you inspect/tweak MR-related settings in a build and visualize MRUK data (mesh, collisions, navmesh overlays, etc.).
New in v85: Spatial Test Framework (MRUK tests)
The Spatial Test Framework brings automated testing principles (Unity Test Framework) into MR workflows by letting you run tests across multiple room prefabs/scene configurations.
What it enables:
- Automatically validate your MR logic across many room layouts (bedroom, office, living room, etc.)
- Reduce manual “try it in 20 different spaces” testing
- Write your own tests by extending the MRUK test base class (as demonstrated conceptually)
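Conceptually, such a test could look like the sketch below. The base class and setup shown here are placeholders, not the actual Spatial Test Framework API (which was only demonstrated conceptually in the workshop); consult the v85 MRUK documentation for the real test base class to extend.

```csharp
// Conceptual sketch only: the base class is a placeholder, not the real
// Spatial Test Framework API. The framework builds on the Unity Test
// Framework (NUnit-style attributes).
using NUnit.Framework;

public class PlacementLogicTests /* : extend the MRUK test base class here */
{
    [Test]
    public void SpawnedObjectFindsAHorizontalSurface()
    {
        // The framework runs this test once per configured room prefab
        // (bedroom, office, living room, ...), so a single assertion is
        // validated against many layouts automatically, replacing manual
        // "try it in 20 different spaces" passes.
        // Assert your MR logic against the currently loaded room here.
    }
}
```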
Watch this part: 18:50
Locomotion: input mapping + revised control scheme
v85 updates the locomotion documentation and recommended patterns for locomotion input mapping, focused on making multiple locomotion systems work together cleanly (especially avoiding input conflicts such as a UI ray and a teleport ray being active at the same time).
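One generic way to avoid the "UI ray + teleport ray at the same time" conflict is to gate one system behind the other's state. The sketch below is not SDK API, just an illustration of the pattern; component names and the hover flag are placeholders you would wire to your own interaction setup.

```csharp
// Generic illustration (not Meta XR SDK API): suppress teleport input
// while the UI ray is busy, so only one ray-driven system acts at a time.
using UnityEngine;

public class LocomotionInputGate : MonoBehaviour
{
    [SerializeField] private MonoBehaviour teleportInteractor; // placeholder: your teleport component

    // Set by your UI interaction system when the UI ray hovers something.
    public bool UiRayIsHovering { get; set; }

    void Update()
    {
        // UI wins while hovering; teleport is disabled to avoid showing
        // and firing two rays from the same controller input.
        teleportInteractor.enabled = !UiRayIsHovering;
    }
}
```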
What’s new: Revised Control Scheme
Watch this part: 26:24
More to come: FrameSync on Meta Horizon OS
FrameSync is presented as an upcoming, important performance-related development, aiming at:
- More consistent smoothness
- Fewer still frames
- Lower motion-to-photon latency
Watch this part: 29:24
Quick reference
- Photon Fusion 2.1 “What’s new” (from slide): https://doc.photonengine.com/fusion/current/getting-started/preview-2-1/whats-new-2-1
- Locomotion input mapping doc (from slide): https://developers.meta.com/horizon/design/locomotion-input-maps/
- FrameSync blog (from slide): https://developers.meta.com/horizon/blog/framesync-meta-horizon-os
- Environment Raycast example video (from slide): https://youtu.be/r9gedHRY0rc