AstraZeneca VR Training Modules

Developed the core design principles and approach for three training modules. Each module translated highly procedural pharmaceutical tasks into interactive, hands-on virtual environments, with both single-player and multiplayer scenarios.

Role

VR UX Designer

Target Hardware

Meta Quest 2/3, WebGL

Industries

Pharmaceutical Training

Date

Jan 2021 - May 2022

Note: All visuals shown are from an indicative pharmaceutical demo I designed to reflect the AstraZeneca projects while maintaining confidentiality.

Problem

  • AstraZeneca needed widely adoptable, accurate training tools for complex drug manufacturing processes.

  • Traditional training was time-intensive, offered limited real-time feedback, and made it hard to ensure consistency across operators.

  • Some scenarios required coordinated tasks and workflows to be represented in both single-player and multiplayer modes.

My Role

  • Lead UX Designer responsible for designing interaction flows, environmental fidelity, and learning structures in VR.

  • Undertook user research, system mapping, interface logic, and VR-specific UI/UX design.

  • Collaborated closely with developers, project management, QA, and stakeholders across projects.

  • Authored voice-over (VO) scripts, error logic, and guidance flows for both guided and unguided modes.

  • Provided implementation guidance and asset naming conventions for consistent handoff.

Process

Research & Discovery:

  • Conducted on-site studies at AstraZeneca labs, capturing over 400 photos/videos of tools, spaces, and workflows.

  • Mapped spatial layouts and tool zones using measurements and ergonomic posture references.

  • Interpreted standard operating procedures (SOPs) and training manuals to define detailed interaction sequences.


Design & Prototyping:

  • Defined state-machine logic in spreadsheets, with every user action mapped to a system state (see the sketch after this list).

  • Created spatially accurate room diagrams, tool zones, teleport paths, and controller mapping diagrams.

  • Carried forward documented design insights from previous projects to guide system fidelity and instructional tone.

  • Iterated early builds through live UX reviews with stakeholders and QA.
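
To give a feel for how that spreadsheet logic reads in practice, here is a minimal state-machine sketch in TypeScript. The step names, prompts, and actions are hypothetical placeholders for illustration, not the actual AstraZeneca procedure or the shipped implementation.

```typescript
// Minimal finite-state-machine sketch of the spreadsheet logic: each
// step defines the one action that advances it, the VO prompt played
// on entry, and the corrective VO for wrong actions.
// All step names, prompts, and actions are illustrative placeholders.

type StepId = "sanitizeHands" | "donGloves" | "openVial" | "complete";

interface Step {
  voPrompt: string;        // voice-over line played when the step begins
  expectedAction: string;  // the single action that advances the state
  errorVo: string;         // corrective line for any other action
  next: StepId | null;     // null marks the terminal step
}

const steps: Record<StepId, Step> = {
  sanitizeHands: {
    voPrompt: "Begin by sanitizing your hands at the station.",
    expectedAction: "useSanitizer",
    errorVo: "Please sanitize your hands before continuing.",
    next: "donGloves",
  },
  donGloves: {
    voPrompt: "Now put on a fresh pair of gloves.",
    expectedAction: "donGloves",
    errorVo: "Gloves must be worn before handling equipment.",
    next: "openVial",
  },
  openVial: {
    voPrompt: "Carefully open the vial over the tray.",
    expectedAction: "openVial",
    errorVo: "Keep the vial over the tray while opening it.",
    next: "complete",
  },
  complete: {
    voPrompt: "Procedure complete. Well done.",
    expectedAction: "",
    errorVo: "",
    next: null,
  },
};

class TrainingStateMachine {
  private current: StepId = "sanitizeHands";

  // Every tracked user action is checked against the current step;
  // a wrong action plays corrective VO instead of advancing the state.
  handleAction(action: string): void {
    const step = steps[this.current];
    if (step.next === null) return; // procedure already finished
    if (action === step.expectedAction) {
      this.current = step.next;
      console.log(`VO: ${steps[this.current].voPrompt}`);
    } else {
      console.log(`VO: ${step.errorVo}`);
    }
  }
}
```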


Collaboration & Development:

  • Delivered annotated interaction scripts and VO timings aligned with system prompts.

  • Worked closely with engineers to align implementation with UX logic and interaction affordances.

  • Designed single- and multi-user coordination systems, including ghosted hands for role simulation.
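
As one illustration of the role-simulation concept, the sketch below replays a pre-recorded partner hand-pose track as translucent "ghost" hands when a two-person task is practised solo. The pose format and playback loop are assumptions for illustration only.

```typescript
// Sketch: replaying a recorded partner's hand poses as translucent
// "ghost" hands when a two-person task is practised solo.
// The pose format and playback scheme are illustrative assumptions.

interface HandPose {
  time: number;                               // seconds from task start
  position: [number, number, number];         // metres, room space
  rotation: [number, number, number, number]; // quaternion (x, y, z, w)
}

class GhostHandPlayback {
  private elapsed = 0;

  constructor(private readonly recording: HandPose[]) {}

  // Called once per frame with the frame delta; returns the pose the
  // ghost hand should display (the last recorded pose at or before now).
  update(dt: number): HandPose {
    this.elapsed += dt;
    let pose = this.recording[0];
    for (const p of this.recording) {
      if (p.time <= this.elapsed) pose = p;
      else break;
    }
    return pose;
  }
}
```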

Solution

Modules 3 & 4:

  • Solo-mode VR experiences focused on precise interaction sequences, visual/audio cues, and detailed feedback structures, each with beginner, learning, and practice modes.


Module 5:

  • A collaborative VR simulation, playable solo or with two players, that supported real-time coordination. Features included synchronized object passing, dual VO flows, and error-trigger resets.
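
A simplified sketch of the synchronized-passing rule: exactly one player owns the object at any moment, and a failed pass triggers the error reset for both players. The names, transport, and reset hook are hypothetical stand-ins, not the project's networking layer.

```typescript
// Sketch: ownership handoff for a synchronized object pass.
// Exactly one trainee "owns" the object at any moment; a drop clears
// ownership and triggers the shared error reset. Names are illustrative.

type PlayerId = "operatorA" | "operatorB";

interface PassableObject {
  name: string;
  owner: PlayerId | null; // null while mid-pass or dropped
}

// Only the current owner may hand the object to the other player.
function requestHandoff(obj: PassableObject, from: PlayerId, to: PlayerId): boolean {
  if (obj.owner !== from) return false;
  obj.owner = to;
  return true;
}

// A dropped item is an error trigger: both players return to step start.
function onObjectDropped(obj: PassableObject, resetStepForBoth: () => void): void {
  obj.owner = null;
  resetStepForBoth();
}
```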


Interface Features:

  • Snap-to guides, contextual highlights, and hand-tracking constraints

  • VO-driven progression gates and error correction

  • Tracked user actions (e.g., dropped items, speed errors) and delivered corrective guidance through haptics, voice-over, and visual cues to ensure accuracy.
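
The sketch below shows one way such a gate can be structured: interaction stays locked until the instruction VO finishes, and repeated errors escalate from a visual highlight to haptics to corrective VO. The escalation order and hook names are my assumptions for illustration, not the shipped logic.

```typescript
// Sketch: a VO-driven progression gate with escalating correction.
// Interaction stays locked until the instruction audio finishes, and
// repeated errors escalate the feedback channel (highlight, then a
// haptic pulse, then corrective VO). Hooks are engine-call placeholders.

type TrackedError = "droppedItem" | "tooFast" | "wrongTool";

class ProgressionGate {
  private voFinished = false;
  private errorCount = 0;

  onVoComplete(): void {
    this.voFinished = true; // unlock interaction for this step
  }

  canInteract(): boolean {
    return this.voFinished;
  }

  onError(kind: TrackedError): void {
    this.errorCount += 1;
    if (this.errorCount === 1) highlightCorrectTarget();
    else if (this.errorCount === 2) pulseControllerHaptics();
    else playCorrectiveVo(kind);
  }
}

// Placeholder hooks; a real build would call into the engine/audio layer.
function highlightCorrectTarget(): void { console.log("highlight target"); }
function pulseControllerHaptics(): void { console.log("haptic pulse"); }
function playCorrectiveVo(kind: TrackedError): void { console.log(`VO: ${kind}`); }
```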


Training Architecture:

  • Shared support for both WebGL and VR headsets (see the sketch after this list).

  • Role simulation and observer/operator logic.

  • Contextual onboarding and interactions that precisely mirrored real-world procedures.
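
For the shared WebGL/headset support, one build can branch on the standard WebXR capability check at startup; the sketch below shows the idea. The mode names and fallback behavior are placeholders, and the project's actual stack isn't shown here.

```typescript
// Sketch: selecting the control scheme at startup so one build can
// serve both an immersive headset session and a desktop WebGL fallback.
// navigator.xr / isSessionSupported is the standard WebXR feature check;
// the returned mode names are placeholders.

async function selectInputMode(): Promise<"vr" | "desktop"> {
  const xr = (navigator as any).xr; // WebXR entry point, where available
  if (xr && (await xr.isSessionSupported("immersive-vr"))) {
    return "vr";     // 6DoF controllers, teleport locomotion
  }
  return "desktop";  // mouse/keyboard, click-to-interact fallback
}
```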

Outcome

Measured Results:

  • 50% reduction in training duration

  • 52% faster achievement of operator competency

  • Deployed across AstraZeneca’s internal VR training programs with ongoing influence on subsequent modules


Delivery Quality:

  • Final builds reflected all authored UX logic, interaction design, and VO scripting.

  • Successful QA pass for both solo and multiplayer modes.

  • Positive internal feedback and client recognition.

Reflection

What Worked Well:

  • Close integration of field research with design documentation ensured real-world fidelity.

  • Cross-functional collaboration was key to accurate implementation of complex state logic.

  • Branching flows and feedback systems enhanced adaptability for different trainee levels.


Challenges:

  • Hand-tracking precision and VO cue alignment required repeated iteration.

  • Multiplayer syncing in Module 5 introduced distinct technical and UX hurdles.

  • Iterative QA revealed nuanced interaction bugs tied to controller behavior and spatial occlusion.


What I Learned:

  • Structured documentation is critical for delivering robust, complex training in VR.

  • Research-grounded VR design translates to measurable real-world impact when accuracy and immersion are prioritized.

  • Voice, spatial, and tactile feedback must be tightly coordinated for high-fidelity immersive training.