Meta Reality Labs Onboarding

Delivered as a single-user Oculus Quest experience using the Immerse SDK, this project bridged technical, narrative, and educational needs. It served as a branded, spatial learning journey for AR/VR teams during their initial ramp-up at Meta.

Role

VR UX Designer / Producer

Target Hardware

Oculus Quest / Meta Quest 2

Industries

Technology / Corporate Onboarding

Date

Nov 2019 - Jul 2020; updates through Jan 2021

Problem

Meta needed a VR-native onboarding experience to connect new hires with the company's mission and future roadmap. The objective was to move beyond passive video consumption and create a moment that felt inspiring and structurally significant for employees joining the Reality Labs division.


The primary challenge was technical. We had to deliver a high-fidelity, brand-aligned narrative on the original Quest 1 hardware, which imposed strict limits on memory and performance. We needed to balance visual polish with a stable framerate while ensuring the experience remained accessible to users completely new to VR.

My Role

Although my official title was Senior Producer, I served as the lead for UX design and creative direction. I bridged the gap between the client’s vision and our technical implementation. My core responsibilities included:

  • Spatial and Interaction Design: Defined the end-to-end user flow, scene pacing, and interaction mechanics, ranging from gaze-based triggers to physical hand manipulation.

  • Product Management: Led Scrum processes, managed sprint velocity, and authored JIRA stories to align engineering and art teams with the design goals.

  • Creative Direction: Directed the storyboarding process, wrote the voiceover scripts, and designed key cinematic moments such as the orb intro and the skyline finale.

  • Audio Strategy: Collaborated with audio engineers to design spatial soundscapes and managed the casting and integration of voiceover assets.

Process

Defining the Narrative Flow
To move quickly from abstract requirements to a tangible vision, I produced an internal animatic video using my own scratch voiceover. This low-fidelity prototype allowed us to validate the pacing, emotional tone, and transition logic with Meta stakeholders before we committed to expensive 3D asset production.


Designing for Constraints
The memory limitations of the Quest 1 required a rigorous approach to spatial layout. I mapped out scene transitions that utilized a 'destruction' shader we developed (fragmenting the environment into glowing tiles) to swap scenes without breaking immersion or overloading the hardware. I led whiteboard sessions to define user positioning, ensuring that all interactive elements remained within reach regardless of whether the user was seated or standing.
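
To make that transition pattern concrete, here is a minimal Unity C# sketch of the underlying idea: drive a dissolve amount on the environment's shader, additively load the next scene, then unload the old one so only one heavy environment is ever resident in memory. The class, property, and field names are hypothetical illustrations, not the production code.

    using System.Collections;
    using UnityEngine;
    using UnityEngine.SceneManagement;

    // Hypothetical sketch: fragment the current environment via a shader
    // property, then swap scenes additively to stay inside the Quest 1
    // memory budget. Names are illustrative.
    public class DestructionTransition : MonoBehaviour
    {
        [SerializeField] private Material destructionMaterial; // shared environment material
        [SerializeField] private float transitionSeconds = 2.5f;

        private static readonly int DissolveAmount = Shader.PropertyToID("_DissolveAmount");

        public IEnumerator TransitionTo(string nextSceneName)
        {
            // Fragment the visible environment into glowing tiles.
            for (float t = 0f; t < transitionSeconds; t += Time.deltaTime)
            {
                destructionMaterial.SetFloat(DissolveAmount, t / transitionSeconds);
                yield return null;
            }

            // Load the next scene while the world is fully dissolved,
            // then unload the old one so both never coexist in memory.
            Scene oldScene = SceneManager.GetActiveScene();
            AsyncOperation load = SceneManager.LoadSceneAsync(nextSceneName, LoadSceneMode.Additive);
            while (!load.isDone) yield return null;

            SceneManager.SetActiveScene(SceneManager.GetSceneByName(nextSceneName));
            yield return SceneManager.UnloadSceneAsync(oldScene);

            destructionMaterial.SetFloat(DissolveAmount, 0f); // reset for the incoming environment
        }
    }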


Iterative Prototyping
We worked in fortnightly sprints, constantly testing builds in-headset. I tracked team velocity closely, which allowed me to identify early that we needed to extend the timeline to maintain quality rather than cut scope. This data-driven approach helped secure the necessary buy-in from leadership to polish the final interactions.

Solution

We delivered a 15-minute interactive narrative that guided new hires through Meta’s history, products, and future vision. The design prioritized 'doing' over 'watching' as much as possible to maintain presence.


Varied Interaction Design
I designed a progression of interactions that starts with a simple touch-based activation of a plasma orb and builds toward more physical actions, such as pulling a virtual tether to disconnect a headset (symbolizing the wireless Quest) and pulling a lever to open the doors onto a space station vista.
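
As an illustration of the mechanic behind the tether and lever moments, the sketch below shows one way such pull interactions can be modelled in Unity C#: measure how far a grabbed point has been dragged and fire the narrative beat once a physical threshold is crossed. All names and values are illustrative assumptions, not the shipped implementation.

    using UnityEngine;
    using UnityEngine.Events;

    // Hypothetical pull-to-threshold interaction: once the user's hand has
    // dragged the grab point far enough, hand off to the scene's scripting.
    public class PullInteraction : MonoBehaviour
    {
        [SerializeField] private Transform handAnchor;          // tracked controller or hand
        [SerializeField] private float releaseDistance = 0.35f; // metres of pull required
        [SerializeField] private UnityEvent onReleased = new UnityEvent(); // e.g. detach tether, open doors

        private Vector3 grabStart;
        private bool isGrabbed;
        private bool released;

        public void BeginGrab() { grabStart = handAnchor.position; isGrabbed = true; }
        public void EndGrab()   { isGrabbed = false; }

        private void Update()
        {
            if (!isGrabbed || released) return;

            float pulled = Vector3.Distance(handAnchor.position, grabStart);
            if (pulled >= releaseDistance)
            {
                released = true;
                onReleased.Invoke(); // trigger the story beat exactly once
            }
        }
    }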


Atmosphere and Audio
The experience used advanced spatial audio and ambisonics to ground the user. I worked with the audio team to synchronize musical swells and sound effects with specific visual cues, ensuring that key moments felt emotionally resonant.
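
For the synchronization itself, one common Unity technique (assumed here as an illustration, not confirmed as our exact implementation) is to schedule cues on the audio DSP clock rather than the frame clock, so a swell lands precisely on its visual beat. A minimal sketch:

    using UnityEngine;

    // Hypothetical cue scheduler: sample-accurate playback via the DSP
    // clock avoids the drift of frame-timed Play() calls on mobile hardware.
    public class CueSync : MonoBehaviour
    {
        [SerializeField] private AudioSource swell;        // pre-loaded musical stinger
        [SerializeField] private double leadInSeconds = 1.0;

        // Called by the sequence that triggers the visual moment.
        public void ScheduleSwell()
        {
            double hitTime = AudioSettings.dspTime + leadInSeconds;
            swell.PlayScheduled(hitTime); // starts exactly at hitTime on the audio thread
        }
    }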


Accessibility First
I designed a startup configuration menu that allowed users to choose between Roomscale or Stationary modes. We implemented a height-calibration system that dynamically adjusted the UI and interactive zones, ensuring the experience was fully accessible to wheelchair users or those playing seated.
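
A minimal Unity C# sketch of how such a height calibration can work, assuming a tracked headset camera parented under the rig and a transform that parents the interactive UI and props; the authored eye height and all field names are illustrative:

    using UnityEngine;

    // Hypothetical startup calibration: sample the headset's eye height once
    // the user confirms their play position, then offset the interactive rig
    // so reach zones match seated, standing, or wheelchair users alike.
    public class HeightCalibration : MonoBehaviour
    {
        [SerializeField] private Transform headCamera;           // tracked HMD camera (rig space)
        [SerializeField] private Transform interactionRig;       // UI + reachable props
        [SerializeField] private float authoredEyeHeight = 1.6f; // height the content was designed for

        public void Calibrate()
        {
            float measured = headCamera.localPosition.y;  // current eye height in rig space
            float offset = measured - authoredEyeHeight;
            interactionRig.localPosition += Vector3.up * offset;
        }
    }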

Outcome

  • We completed Sprint 13 with a fully functional end-to-end experience that met the strict performance targets of the Quest 1 hardware.

  • The 'Go Team' interaction, where users high-five virtual avatars, was highlighted by stakeholders as a 'moment of delight', validating our strategy of prioritizing physical interaction over passive visuals.

  • The team successfully implemented a custom 'uber shader' and bloom effects to create impact while maintaining performance. This specific trade-off and optimization strategy was reviewed by John Carmack during development.

Reflection

Interactivity Drives Immersion
This project reinforced that 'wow' moments in VR often come from simple, physical agency rather than visual spectacle. The most memorable moments were those where the user physically influenced the world, like unplugging a cord or hitting a ball.


Technical Challenge
The team grappled with complex visual optimization challenges on Quest 1. A key constraint was that anti-aliasing and bloom could not run simultaneously, a limitation of the early Unity render pipeline. In the end we agreed to keep bloom, because of the impact it gave all the custom shader effects.
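
In Unity terms, the trade-off reduces to a quality setting; the snippet below illustrates the shape of the decision, not the project's actual configuration code.

    using UnityEngine;

    // Illustrative only: with MSAA and bloom mutually exclusive in our
    // pipeline, the build disabled hardware anti-aliasing and kept the
    // bloom pass that sold the custom shader effects.
    public static class QuestGraphicsSetup
    {
        public static void Apply()
        {
            QualitySettings.antiAliasing = 0; // 0 disables MSAA (valid values: 0, 2, 4, 8)
            // Bloom remains enabled via the project's post-processing configuration.
        }
    }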


The Value of Pre-visualization
Creating the rough animatic early in the process was critical. It bridged the gap between the script and the spatial reality, preventing costly rework and ensuring the entire team was aligned on the narrative pacing from day one.