
Meta Reality Labs Onboarding
Delivered as a single-user Oculus Quest experience using the Immerse SDK, this project bridged technical, narrative, and educational needs. It served as a branded, spatial learning journey for AR/VR teams during their initial ramp-up at Meta.
Role
VR UX Designer / Producer
Target Hardware
Oculus Quest / Meta Quest 2
Industries
Technology / Corporate Onboarding
Date
Nov 2019 - Jul 2020; updates through Jan 2021
Problem
Existing onboarding lacked immersive or memorable elements for AR/VR-focused hires.
Project Aims
Introduce complex technical and organizational topics in an accessible, engaging way.
Balance interactivity with cognitive clarity in VR.
Optimize for the memory and rendering limitations of Quest 1.
Stakeholder expectations required visual excellence, brand alignment, and measurable user engagement.
My Role
Producer (Led all aspects of UX, PM, and Creative Direction)
Communicated directly with Meta stakeholders to deliver sprint updates and reviews.
Led spatial UX strategy, scene transitions, and narrative pacing.
Created VO scripts, directed storyboarding and interaction design.
Directed implementation across Unity developers, artists, QA, and audio teams.
Recruited new technical talent and drove sprint planning and velocity tracking.
Contributed core interaction ideas and personally shaped several key immersive moments, from the orb intro to the skyline elevator finale.
Process
Discovery & Research
Drew on prior project experience and early VR research to shape a set of principles that guided design decisions throughout the project.
Worked closely with artists to source and interpret visual moodboards for environment and interaction design.
Design & Ideation
Co-led early storyboarding sessions (whiteboards, animatics).
Defined the scene logic, pacing, and motion-based transitions.
Developed dynamic interaction modes (gaze, hand, object manipulation).
Collaboration & Development
Ran Scrum with fortnightly sprints and closely tracked sprint velocity.
Defined JIRA stories across UX, engineering, and audio pipelines.
Partnered with audio engineers for spatial sound and VO integration.
Directed shader development and visual transitions for scene changes.
Solution
A fully interactive, voice-narrated VR experience guiding new hires through Meta’s mission, timeline, products, and future roadmap. Built in Unity, the project balanced visual storytelling with hands-on interaction.
Key Interaction Types
Object manipulation: Touching a plasma orb, unplugging a PC, inspecting hardware, pulling a lever and pressing various buttons.
Physical engagement: Walking through doors, hitting a ball with a stylized bat, and social interactions with avatars.
Scene & Audio Design
Thematically varied scenes, each with a custom soundtrack that built dynamically with narration.
Advanced spatial audio using ambisonics for immersive soundscapes.
Animated cinematic moments, such as the rotating Earth intro and a red-carpet transition in the timeline, added polish and narrative rhythm.
VR Accessibility
Scaled assets for seated and standing play, and supported roomscale and stationary modes.
Optional subtitles synced with VO, using a custom SDK tool and Amazon Transcribe.
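The subtitle tooling itself isn't documented here, but a minimal sketch can illustrate the approach: Amazon Transcribe returns word-level timings in its job output JSON, which can be grouped into timed subtitle cues for playback alongside the VO. This is an illustrative reconstruction, not the project's actual tool; the `max_chars` limit and cue-grouping rule are assumptions.

```python
def transcribe_items_to_cues(items, max_chars=42):
    """Group word-level Amazon Transcribe items into subtitle cues.

    `items` follows the structure of results["items"] in a Transcribe
    job output: "pronunciation" items carry start_time/end_time strings,
    "punctuation" items carry only content.
    """
    cues = []  # each cue: {"start": float, "end": float, "text": str}
    current = None
    for item in items:
        word = item["alternatives"][0]["content"]
        if item["type"] == "punctuation":
            # Attach punctuation to the current cue without a space.
            if current:
                current["text"] += word
            continue
        start, end = float(item["start_time"]), float(item["end_time"])
        if current and len(current["text"]) + 1 + len(word) <= max_chars:
            # Word still fits on this line: extend the cue.
            current["text"] += " " + word
            current["end"] = end
        else:
            # Start a new cue when the line-length budget is exceeded.
            current = {"start": start, "end": end, "text": word}
            cues.append(current)
    return cues
```

In-engine, each cue would then be displayed between its `start` and `end` timestamps as the VO clip plays, keeping text and narration in sync.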
Technical Architecture
Built in Unity.
Developed using a custom shader framework (uber shader) supporting layered visual effects and seamless transitions.
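The shader framework's internals aren't reproduced here, but the core idea of an uber shader is that one shader source exposes many optional effect layers behind feature keywords, and each material resolves to a compiled variant for exactly the layers it enables (mirroring Unity's shader-keyword model). A hypothetical sketch of that variant-selection logic, with made-up feature names:

```python
# Hypothetical effect layers; the project's real keyword set is not documented.
FEATURES = ("BLOOM", "DISSOLVE", "FRESNEL_GLOW", "SCENE_FADE")

def variant_key(enabled):
    """Return a canonical variant key for a set of enabled effect layers."""
    unknown = set(enabled) - set(FEATURES)
    if unknown:
        raise ValueError(f"unknown shader features: {sorted(unknown)}")
    # Canonical ordering keeps the key stable regardless of enable order,
    # so materials with the same layers share one compiled variant.
    return "_".join(f for f in FEATURES if f in set(enabled)) or "BASE"
```

The payoff of this pattern is that seamless scene transitions become a matter of cross-fading layer parameters within one shader family rather than swapping materials mid-scene.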
Outcome
Delivery Success
Sprint 13 completed with full functionality and content flow.
Early user testing conducted and debug tools implemented.
Impact
Optimized visual fidelity and performance to meet the limitations of the first-generation Oculus Quest hardware.
Influenced subsequent internal initiatives focused on interactive learning in VR.
Reflection
Key Learnings
True engagement in VR comes from interactivity, not spectacle.
Designing for constrained hardware forces creative simplicity and clarity.
Early alignment on user experience goals prevents costly late-stage rework.
Challenges
Managed scope during COVID-era constraints through creative reuse and modular design.
The team grappled with complex visual-optimization challenges on Quest 1. One key constraint: anti-aliasing and bloom couldn't run simultaneously, due to limitations of Unity's early render pipeline. Our approach to this trade-off was reviewed by John Carmack, and we ultimately kept bloom because of the impact it brought to all the custom shader effects.
Personal Growth
Deepened skills in spatial UX design, audio integration, and cross-functional leadership.
Reinforced the value of detailed previsualization and VO/audio-visual prototyping.