Meta VR Training Hub

Meta aimed to integrate immersive XR-based learning into its corporate training ecosystem. The project included deep platform integration, enterprise authentication, and experience personalization, all within branding, technical, and platform constraints.

Role

VR UX Designer

Target Hardware

Meta Quest 2/3

Industries

Corporate Training / Technology

Date

Apr 2022 - Jul 2023

Problem

Meta needed a VR-native system to deliver structured learning content, track progress across sessions, and feed data into its existing LMS ecosystem. No unified solution existed, especially not one that worked across both Unity and Unreal apps while handling enterprise authentication, module-level metadata, and real-time xAPI reporting.
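For context, xAPI reporting works by emitting JSON statements in an actor/verb/object shape to a learning record store. The sketch below shows what a "module completed" statement might look like; the learner, activity ID, and module name are hypothetical placeholders, not values from the actual project.

```python
import json

# Illustrative xAPI "completed" statement for a training module.
# The actor, activity ID, and module name are hypothetical.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "New Hire",
        "mbox": "mailto:new.hire@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/xapi/activities/module-safety-101",
        "definition": {"name": {"en-US": "Safety 101"}},
    },
    "result": {"completion": True, "success": True},
}

print(json.dumps(statement, indent=2))
```

Statements like this, sent per module, are what let an LMS reconstruct progress across sessions without the headset app holding long-term state.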

There was a clear opportunity to create a scalable training launcher that felt native to VR, respected ergonomic needs, and worked just as well for a new hire as it did for a seasoned engineer launching technical training.

My Role

I led UX and interface design across multiple development phases, from early user flows and low-fi wireframes through to spatial layout and final implementation. This included:

  • Prototyping and validating layouts in ShapesXR and Unity builds

  • Designing scalable UI logic for nested training structures

  • Providing first-time user guidance through onboarding, tooltips, and intuitive interaction design

  • Presenting and iterating based on feedback from both Meta and Immerse stakeholders

  • Designing a gamified feedback concept that was implemented into the final build

Process

Early Exploration

We kicked off with a series of walkthroughs using ShapesXR to explore the affordances of training selection in VR: how much content could comfortably sit within a user’s peripheral view, and how onboarding flows should behave in a 3D space. After reviewing other training platforms, running collaborative whiteboarding sessions, and validating ideas in early headset tests, I mapped out core training structures (Program > Course > Module) and modeled key interactions like scroll, filter, launch, and logout.
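The Program > Course > Module hierarchy above can be sketched as a simple nested data model. This is an illustration only; the field names and progress calculation are assumptions, not the production schema.

```python
from dataclasses import dataclass, field

@dataclass
class Module:
    title: str
    completed: bool = False

@dataclass
class Course:
    title: str
    modules: list[Module] = field(default_factory=list)

    def progress(self) -> float:
        """Fraction of modules completed; 0.0 for an empty course."""
        if not self.modules:
            return 0.0
        return sum(m.completed for m in self.modules) / len(self.modules)

@dataclass
class Program:
    title: str
    courses: list[Course] = field(default_factory=list)

# Example: one program, one course, two modules (one finished).
program = Program("Onboarding", [
    Course("Safety", [Module("Intro", completed=True), Module("Drills")]),
])
print(program.courses[0].progress())  # 0.5
```

Keeping progress as a derived value, rather than stored per course, is what makes scroll, filter, and launch views cheap to rebuild from the same tree.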

From the start, we knew this couldn’t feel like a flat app ported into VR. But equally, it had to support practical needs: module metadata, progress tracking, and responsive input for both laser and touch-based interaction.


Wireframes & Layout

In Figma, I built modular layouts that allowed us to scale training content while keeping the interface legible and intuitive. The early design direction was tile-based, with each tile showing visual cues like overdue flags and completion ticks, tied to xAPI status in the background. These were continuously tested in VR and refined based on feedback from headset sessions. Tooltips and onboarding guidance were layered on top to support first-time users.
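The mapping from xAPI status to a tile's visual cue can be illustrated with a small decision function; the badge names and date handling here are assumptions for the sketch, not the shipped logic.

```python
from datetime import date

def tile_badge(completed: bool, due: date, today: date) -> str:
    """Map a module's xAPI-derived state to a visual cue on its tile."""
    if completed:
        return "tick"       # completion tick
    if today > due:
        return "overdue"    # overdue flag
    return "none"           # no badge

print(tile_badge(False, due=date(2023, 1, 1), today=date(2023, 2, 1)))
```

Deriving badges from state like this, instead of storing them, keeps tiles consistent whenever the xAPI data refreshes mid-session.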

I paid close attention to UI ergonomics, ensuring panels hovered at a comfortable reading distance and dynamically reoriented to support both seated and standing use.


Feedback Loops

As the Unity and Unreal apps evolved, I collaborated closely with QA and developers to verify visual hierarchy, menu states, and navigation logic. I also authored haptic and audio interaction specs to ensure feedback cues aligned with user input: clicks, transitions, and errors.

Suggestions from Meta’s internal UX research team helped guide refinements, including improving logout discoverability and reducing visual overload for nested content.

Solution

We delivered a spatial launcher UI that handled the complexity of enterprise training content, while keeping the experience clear, structured, and approachable in VR.

A key evolution in the design was the move away from a flat tile-based grid to a more hierarchical 'to-do list' style. This structure helped reduce visual overload, made progress clearer at a glance, and gave us more flexibility to support varied training needs and priorities.

The final experience included:

  • Hierarchical Launcher UI
    Organized around Core, Recommended, and Elective training paths, with expandable sections, full metadata, and progress indicators.

  • Dual-Input Support
    Fully compatible with both laser-pointer and touch-based interaction, driven by user preference.

  • Personalization
    Included welcome messages, dynamic sorting, and priority ordering based on course progression and user preferences.

  • Gamification
    Introduced a trophy-shelf room and reward mechanic that persisted between visits, with visual and audio feedback designed to encourage sustained engagement.

  • Surveys
    Designed a survey system within the app, optimized for VR, supporting Likert, multi-choice, and single-choice inputs. All results were tied into the xAPI pipeline for real-time reporting and LMS integration.
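The priority ordering mentioned under Personalization can be sketched as a composite sort key: overdue items first, then partially completed ones, then by assigned priority. The weighting and entry fields below are assumptions for illustration, not the shipped ranking.

```python
# Hypothetical launcher sort: overdue first, then in-progress, then priority.
def sort_key(entry: dict) -> tuple:
    return (
        not entry["overdue"],            # overdue items sort first
        not 0 < entry["progress"] < 1,   # then partially completed ones
        entry["priority"],               # then by assigned priority rank
    )

entries = [
    {"title": "Elective A", "overdue": False, "progress": 0.0, "priority": 3},
    {"title": "Core 1",     "overdue": True,  "progress": 0.2, "priority": 1},
    {"title": "Core 2",     "overdue": False, "progress": 0.5, "priority": 2},
]
entries.sort(key=sort_key)
print([e["title"] for e in entries])  # ['Core 1', 'Core 2', 'Elective A']
```

A tuple key like this keeps each rule independent, so reordering priorities (or adding a new rule) doesn't require rewriting the comparison logic.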

Outcome

  • Passed Meta UAT with no UI/UX blockers

  • Successfully implemented on both Unity and Unreal builds

  • Demonstrated working integration with Meta’s LMS, using custom xAPI pipelines

  • Showed spatial UX implementation that supported both new and returning users

  • Launcher and survey features contributed to Immerse securing a high-value licensing agreement with Meta

Reflection

What went well
This project was a strong example of aligning immersive design with an enterprise system. We had to juggle evolving brand guidelines, LMS integration, and strict platform requirements, while still creating an experience that felt intuitive, responsive, and immersive. The final result was ergonomic, scalable, and engaging.


Challenges

  • We navigated mid-project shifts in brand identity without compromising usability.

  • We translated complex backend requirements into UI logic that felt intuitive and modular to users.


Takeaways
Designing for VR is always multi-sensory: spatial reasoning, sound, and tactile feedback need to work together to guide users clearly. Owning the experience from early sketches to in-headset implementation helped ensure it stayed coherent and resilient throughout. And even on interface-heavy projects, fast spatial prototyping in tools like ShapesXR proved invaluable for catching comfort and clarity issues early.