Meta Reality Labs Onboarding

Delivered as a single-user Oculus Quest experience using the Immerse SDK, this project bridged technical, narrative, and educational needs. It served as a branded, spatial learning journey for AR/VR teams during their initial ramp-up at Meta.

Role

VR UX Designer / Producer

Target Hardware

Oculus Quest / Meta Quest 2

Industries

Technology / Corporate Onboarding

Date

Nov 2019 - Jul 2020; updates through Jan 2021

Problem

Meta wanted to create a VR onboarding experience for new hires entering Reality Labs, one that made their first steps feel connected to the company’s mission and cemented what they were building toward. The goal was to design something both inspiring and informative.

This came with challenges: performance and memory limitations on Quest 1, strict runtime targets, and high expectations around brand alignment and visual polish.

My Role

I led the UX design, creative direction, and production planning across the project. That meant owning:

  • Interaction and spatial design: scene flow, motion pacing, interaction patterns

  • Voiceover scripting and integration: from drafting the script to implementation in build

  • Sprint planning, recruitment, and velocity tracking: including onboarding new technical talent

  • Cross-functional delivery: Unity implementation, asset review, QA, audio, and final debug

  • Stakeholder communication: delivered regular sprint updates and reviews directly with Meta

  • Creative execution: directed storyboarding and co-authored key immersive moments like the orb intro, product walkthrough, and skyline finale

Process

Discovery & Research

  • Drew on prior project experience and early VR research to shape a set of principles that guided design decisions throughout the project.

  • Worked closely with artists to source and interpret visual moodboards for environment and interaction design.


Design & Ideation

  • Co-led early storyboarding sessions (whiteboards, animatics).

  • Defined the scene logic, pacing, and motion-based transitions.

  • Developed dynamic interaction modes (gaze, hand, object manipulation).


Collaboration & Development

  • Ran Scrum with fortnightly sprints and closely tracked sprint velocity.

  • Defined JIRA stories across UX, engineering, and audio pipelines.

  • Partnered with audio engineers for spatial sound and VO integration.

  • Directed shader development and visual transitions for scene changes.

Solution

A fully interactive, voice-narrated VR experience guiding new hires through Meta’s mission, timeline, products, and future roadmap. Built in Unity, the project balanced visual storytelling with hands-on interaction.


Key Interaction Types

  • Object manipulation: Touching a plasma orb, unplugging a PC, inspecting hardware, pulling a lever, and pressing various buttons.

  • Physical engagement: Walking through doors, hitting a ball with a stylized bat, and interacting socially with avatars.


Scene & Audio Design

  • Thematically varied scenes, each with a custom soundtrack that built dynamically with narration (see the sketch after this list).

  • Advanced spatial audio using ambisonics for immersive soundscapes.

  • Animated cinematic moments, such as the rotating Earth intro and a red carpet transition in the timeline, added polish and narrative rhythm.
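
The “soundtrack that builds with narration” behaviour could be driven by something as simple as the Unity C# sketch below. This is illustrative only, assuming pre-authored music stems and hand-placed cue times; the component and field names are hypothetical, not the project’s actual implementation.

```csharp
using UnityEngine;

// Hypothetical sketch: fades additional music stems in as the narration
// AudioSource passes authored cue times, so the soundtrack "builds" with the VO.
public class AdaptiveSoundtrack : MonoBehaviour
{
    [SerializeField] private AudioSource narration;     // spoken VO track
    [SerializeField] private AudioSource[] musicStems;  // layered stems, started muted
    [SerializeField] private float[] cueTimes;          // VO time (s) at which each stem enters
    [SerializeField] private float fadeSeconds = 2f;

    private void Start()
    {
        // Start every stem silently so they stay time-aligned with each other.
        foreach (var stem in musicStems)
        {
            stem.volume = 0f;
            stem.Play();
        }
    }

    private void Update()
    {
        if (!narration.isPlaying) return;

        for (int i = 0; i < musicStems.Length && i < cueTimes.Length; i++)
        {
            // Once the narration reaches a cue, ramp that stem up over fadeSeconds.
            if (narration.time >= cueTimes[i] && musicStems[i].volume < 1f)
            {
                musicStems[i].volume = Mathf.MoveTowards(
                    musicStems[i].volume, 1f, Time.deltaTime / fadeSeconds);
            }
        }
    }
}
```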


VR Accessibility

  • Scaled assets for seated and standing play, and supported roomscale and stationary modes.

  • Optional subtitles synced with VO, using a custom SDK tool and Amazon Transcribe.
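
A minimal sketch of how that subtitle sync might work at runtime, assuming the custom SDK tool bakes Amazon Transcribe’s word-level timestamps into per-line cues offline. The type and component names here are hypothetical; the project’s actual tooling and data format are not shown.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical cue format: start/end times derived from Amazon Transcribe
// word timestamps, grouped into readable lines by the offline tooling.
[System.Serializable]
public struct SubtitleCue
{
    public float start;   // seconds into the VO clip
    public float end;
    [TextArea] public string text;
}

// Hypothetical sketch: shows the cue that matches the voiceover's current time.
public class SubtitleDisplay : MonoBehaviour
{
    [SerializeField] private AudioSource voiceover;
    [SerializeField] private Text subtitleLabel;            // world-space UI text in the headset
    [SerializeField] private SubtitleCue[] cues;
    [SerializeField] private bool subtitlesEnabled = true;  // optional, per-user setting

    private void Update()
    {
        if (!subtitlesEnabled || !voiceover.isPlaying)
        {
            subtitleLabel.text = string.Empty;
            return;
        }

        float t = voiceover.time;
        subtitleLabel.text = string.Empty;
        foreach (var cue in cues)
        {
            if (t >= cue.start && t <= cue.end)
            {
                subtitleLabel.text = cue.text;
                break;
            }
        }
    }
}
```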


Technical Architecture

  • Built in Unity.

  • Developed using a custom shader framework (uber shader) supporting layered visual effects and seamless transitions.
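
As a rough illustration of how scene transitions might be driven against such an uber shader, the sketch below animates a single shader property across a scene’s renderers. The _TransitionProgress property name and the component itself are assumptions made for this example, not the framework’s real interface.

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical sketch: drives a dissolve-style transition by animating one
// float exposed by the uber shader (0 = fully visible, 1 = dissolved out).
public class SceneTransitionDriver : MonoBehaviour
{
    [SerializeField] private Renderer[] sceneRenderers;  // everything using the uber shader
    [SerializeField] private float duration = 1.5f;

    private static readonly int TransitionProgress = Shader.PropertyToID("_TransitionProgress");

    public IEnumerator FadeOut()
    {
        float elapsed = 0f;
        while (elapsed < duration)
        {
            elapsed += Time.deltaTime;
            float progress = Mathf.Clamp01(elapsed / duration);
            foreach (var rend in sceneRenderers)
            {
                // .material (not .sharedMaterial) instantiates per-renderer copies,
                // so individual objects could be dissolved independently if needed.
                rend.material.SetFloat(TransitionProgress, progress);
            }
            yield return null;
        }
    }
}
```

In practice a sequencing script would start this coroutine (for example, StartCoroutine(FadeOut())) just before the next scene’s content loads.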

Outcome

Delivery Success

  • Sprint 13 completed with full functionality and content flow.

  • Early user testing and debug tools implemented.


Impact

  • Optimized visual fidelity and performance to meet the limits of first-generation Oculus Quest hardware.

  • Influenced subsequent internal initiatives focused on interactive learning in VR.

Reflection

Key Learnings

  • True engagement in VR comes from interactivity, not spectacle.

  • Design for constrained hardware forces creative simplicity and clarity.

  • Early alignment on user experience goals prevents costly late-stage rework.


Challenges

  • Managed scope under COVID constraints through creative reuse and modular design.

  • The team grappled with complex visual optimization challenges on Quest 1. One key constraint: anti-aliasing and bloom couldn’t run simultaneously, due to limitations in the early Unity render pipeline. John Carmack reviewed both the challenge and our approach, and we ultimately kept bloom because of how much it contributed to the custom shader effects.


Personal Growth

  • Deepened skills in spatial UX design, audio integration, and cross-functional leadership.

  • Reinforced the value of detailed previsualization and VO/audio-visual prototyping.