Immerse App

Developed as a fork of the Meta VR Training Hub, the Immerse App evolved into a commercial VR LMS product deployed to real-world users and integrated with backend administration controls.

Role

VR UX Designer

Target Hardware

Meta Quest 2/3, Pico Neo3/4

Industries

Education Technology / VR Learning Platforms

Date

Q4 2022 - Present

Problem

  • The project began as a fork of the Meta VR Training Hub, addressing the lack of a centralized and flexible VR training system for clients. Existing solutions weren't built for scale and offered limited support for structured learning and progress tracking.

  • Clients needed the ability to control branding within any LMS used by employees, especially in immersive environments.

  • There was a clear demand for multi-user collaboration in VR, specifically around interacting with and managing spatial layouts in real time.

My Role

  • Front-end Design Ownership: Led UX/UI system design from early wireframes to production-ready VR interfaces.

  • Interaction Systems: Designed logic for VR keyboard overlays, motion constraints for menus, passthrough, input ergonomics, and an inventory system supporting Collaborative Spaces, a scene-editing tool used in shared spaces.

  • Collaboration & UX Strategy: Worked across design, engineering, and product. Facilitated reviews, scoped features, guided Unity integration, and narrated demo walkthroughs.

  • Design System Development: Developed a dynamic theme engine using Figma Variables, which served as a core component of a Unity theming pipeline via JSON output.
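
To make the Figma-to-Unity handoff concrete, here is a minimal sketch of how exported theme tokens might be consumed on the Unity side. The JSON schema, field names, and class names are illustrative assumptions, not the production pipeline.

```csharp
using UnityEngine;

// Hypothetical token payload exported from Figma Variables; the real
// schema, field names, and nesting are defined by the production pipeline.
[System.Serializable]
public class ThemeTokens
{
    public string themeName;
    public string primaryColor;  // hex string, e.g. "#1A73E8"
    public string surfaceColor;  // hex string, e.g. "#202124"
    public string textColor;     // hex string, e.g. "#FFFFFF"
    public float  cornerRadius;
}

public static class ThemeLoader
{
    // Deserialize an exported theme JSON payload into typed tokens
    // that Unity UI components can bind to.
    public static ThemeTokens FromJson(string json) =>
        JsonUtility.FromJson<ThemeTokens>(json);

    // Convert a hex token into a Unity Color, falling back to magenta
    // so missing or malformed tokens are visually obvious during testing.
    public static Color ParseColor(string hex) =>
        ColorUtility.TryParseHtmlString(hex, out var c) ? c : Color.magenta;
}
```

In this setup, the exported JSON lives in the shared repository and is parsed at load time so UI components pick up brand colors and styling without code changes.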

Process

Discovery:

  • Reviewed LMS platforms to inform dashboard layout and module navigation.

  • Investigated automated keyboard layouts for multilingual VR input.


Prototyping & Testing:

  • Used the ShapesXR + Figma integration to prototype VR interactions in-headset.

  • Set up internal usability tests for movement logic, menu bounding, contrast visibility, and guest login flows.


Design Implementation:

  • Created UI states and flows for guest sign-in, authentication, session save/resume, assignment completion, and form interactions.

  • Authored JIRA stories covering design refinements, component behaviors, and system edge cases.


Cross-Team Collaboration:

  • Collaborated with developers on the theme JSON, pushing exports to a central repository.

  • Worked with other designers to align the frontend VR interface with backend web dashboard views.

Solution

Modular UI Panels:

  • Implemented floating UI components that could be adjusted and anchored, designed with contextual tooltips, and built with layered spatial awareness. UI components were positioned at varying depths and heights to maintain comfort and clarity and to prevent visual overload.


Guest & Authenticated Flows:

  • Designed distinct user experiences for guest and authenticated users, including screen transitions and virtual keyboard integration for guest sign-in.


Collaborative Spaces:

  • Designed modular panel layouts for Collaborative Spaces. This included:

    • Defining session logic and content state management for multi-user collaboration features.

    • Managing session creation and filtering.

    • Switching between session and saved states.

    • Viewing participants.

    • Indicating available or unavailable content.


Theming System:

  • Built a live-update theming framework using dynamic tokens, ensuring visual accessibility (contrast, readability) and brand customization.
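
As an illustrative sketch of the adaptive-contrast idea (not the shipped implementation), the snippet below applies the standard WCAG contrast-ratio formula to pick a readable text color for whatever surface color a brand theme supplies; the helper names are assumptions.

```csharp
using UnityEngine;

public static class ContrastHelper
{
    // WCAG relative luminance of an sRGB color.
    static float Luminance(Color c)
    {
        float Channel(float v) =>
            v <= 0.03928f ? v / 12.92f : Mathf.Pow((v + 0.055f) / 1.055f, 2.4f);
        return 0.2126f * Channel(c.r) + 0.7152f * Channel(c.g) + 0.0722f * Channel(c.b);
    }

    // WCAG contrast ratio between two colors (ranges from 1 to 21).
    public static float ContrastRatio(Color a, Color b)
    {
        float la = Luminance(a), lb = Luminance(b);
        float hi = Mathf.Max(la, lb), lo = Mathf.Min(la, lb);
        return (hi + 0.05f) / (lo + 0.05f);
    }

    // Pick black or white text, whichever gives the higher contrast on this surface.
    public static Color AdaptiveTextColor(Color surface)
    {
        return ContrastRatio(surface, Color.white) >= ContrastRatio(surface, Color.black)
            ? Color.white
            : Color.black;
    }
}
```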


Motion Constraints:

  • Defined a menu movement model that reduces disorientation by constraining UI behavior within user-centric bounds.
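
Below is a minimal Unity-style sketch of that bounding idea, assuming a panel that lazily follows the user's head while staying clamped to a comfortable yaw range and distance band; the component name and thresholds are illustrative, not the shipped values.

```csharp
using UnityEngine;

// Sketch of user-centric menu bounding: the panel eases toward the user's
// gaze but is clamped so it never drifts behind them or into their face.
public class BoundedMenuFollower : MonoBehaviour
{
    public Transform head;            // e.g. the camera rig's centre eye
    public float maxYawDegrees = 40f; // horizontal half-angle the panel may occupy
    public float minDistance = 0.6f;  // metres
    public float maxDistance = 1.2f;  // metres
    public float followSpeed = 4f;

    void LateUpdate()
    {
        // Direction from head to panel, flattened so pitch doesn't tilt the clamp.
        Vector3 toPanel = transform.position - head.position;
        Vector3 flatForward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 flatToPanel = Vector3.ProjectOnPlane(toPanel, Vector3.up).normalized;

        // Clamp the yaw offset between the gaze direction and the panel.
        float yaw = Vector3.SignedAngle(flatForward, flatToPanel, Vector3.up);
        float clampedYaw = Mathf.Clamp(yaw, -maxYawDegrees, maxYawDegrees);
        Vector3 clampedDir = Quaternion.AngleAxis(clampedYaw, Vector3.up) * flatForward;

        // Clamp the distance band; keep the panel's current height.
        float distance = Mathf.Clamp(toPanel.magnitude, minDistance, maxDistance);
        Vector3 target = head.position + clampedDir * distance;
        target.y = transform.position.y;

        // Ease toward the constrained pose and keep the panel turned toward the user.
        transform.position = Vector3.Lerp(transform.position, target, followSpeed * Time.deltaTime);
        transform.rotation = Quaternion.LookRotation(transform.position - head.position, Vector3.up);
    }
}
```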

Outcome

Deployed LMS in Live Customer Environments:

  • Enabled onboarding for enterprise users with both authenticated and guest pathways.


Improved UX:

  • A targeted UI rework, including relocating navigation filters, made navigation demonstrably more intuitive.

  • The theming system enabled adaptive text contrast, improving accessibility.

  • Added a passthrough toggle for Mixed Reality (MR).


Design Adoption:

  • The theme system, session tagging, and layout patterns were adopted into production and mirrored in Unity builds.


Validated UX process:

  • Internal demos, narrated videos, and animation briefs effectively communicated UX intent.


Functional MVPs Delivered:

  • Collaborative Spaces, prioritized multilingual input support, and the core VR LMS structure launched successfully.

Reflection

Notable Challenges:

  • The guest user experience required balancing UX simplicity and input handling (such as the virtual keyboard) against ensuring that returning authenticated sessions would still work.

  • Redesigning for readability in passthrough mode, forced by Quest system restrictions.

  • Supporting both controller and hand input in overlapping interaction zones.


Lessons Learned:

  • Real-time VR prototyping (e.g., with ShapesXR) accelerates iteration and catches spatial design flaws early.

  • Token-driven design systems scale best when paired with developer-ready outputs (e.g., JSON).

  • Designing for VR demands tight collaboration across UX, engineering, and product, especially when small interaction details can impact comfort and comprehension.


Forward Thinking:

  • The next iteration could explore real-time multi-user annotation in Collaborative Spaces.

  • Multilingual support could be enhanced with predictive text or voice-input overlays.