Yale & 2U Cardiac Exam VR Pilot

Commissioned by 2U and Yale School of Medicine, the pilot addressed the need for flexible, accessible, and high-fidelity medical training tools that reinforce classroom and hospital-based instruction.

Role

VR UX Designer / Producer

Target Hardware

Oculus Quest 2

Industries

Medical Education

Date

Jul 2022 - Oct 2023

Problem

  • Medical students had limited practical opportunities to practice comprehensive cardiac exams before interacting with real patients.

  • Existing simulations lacked realism and interactive fidelity, or were non-immersive altogether (e.g., 2D videos or mannequins).

  • The client required high clinical fidelity, voiceover narration, scoring systems, and multiplayer support, all within limited timelines.

  • Midway through the project, unforeseen leadership disruption and unclear sprint priorities destabilized development and introduced delivery risk.

My Role

UX Designer Responsibilities:

  • Led design in collaboration with engineers, QA, and Yale’s Program Director. Facilitated stakeholder feedback loops to align clinical realism and educational goals.

  • Defined user flows, hand interactions, audio/haptic feedback systems, and UI logic using ShapesXR.

  • Authored and prototyped immersive interactions (e.g., palpation, auscultation, patient dialogue).

  • Developed original features for enhanced spatial learning, including nuanced proximity and snap-target interactions for palpation and auscultation (sketched below).
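
The proximity and snap-target behavior described above can be illustrated with a minimal Unity C# sketch. This is a hypothetical reconstruction, not the project's code: the class name, radii, and tool reference are all assumptions, and it assumes a simple distance check with hysteresis between the "snap" and "release" thresholds.

```csharp
using UnityEngine;

// Hypothetical sketch of a proximity/snap-target interaction: as the
// stethoscope chest piece (or fingertip) nears an anatomical landmark,
// the target registers placement once the tool is close enough, and
// releases only after it moves well outside the proximity radius.
public class SnapTarget : MonoBehaviour
{
    [SerializeField] float proximityRadius = 0.10f; // release beyond 10 cm
    [SerializeField] float snapRadius = 0.03f;      // snap within 3 cm
    [SerializeField] Transform tool;                // chest piece or fingertip

    public bool IsSnapped { get; private set; }

    void Update()
    {
        float dist = Vector3.Distance(tool.position, transform.position);

        if (!IsSnapped && dist < snapRadius)
        {
            IsSnapped = true;
            // Audio/haptic confirmation hooks would fire here.
        }
        else if (IsSnapped && dist > proximityRadius)
        {
            IsSnapped = false; // hysteresis prevents flicker at the boundary
        }
    }
}
```

The two-radius hysteresis is the key design detail: snapping and releasing at the same distance would make feedback flicker whenever the hand hovers near the threshold.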


Interim Producer Responsibilities:

  • Assumed project-management duties during a period of project instability.

  • Led standups, audited the backlog, scheduled sprints, and renegotiated scope with 2U and Yale.

  • Stabilized the team, hit critical milestones, and maintained client trust through transparent communication.

Process

Research & Discovery:

  • Consulted directly with Yale’s program director and subject matter experts to ensure procedural accuracy.

  • Leveraged leading clinical guides and established examination standards to design the exam flow and interaction logic.


Design:

  • Prototyped all interaction flows in ShapesXR and Unity (e.g., stethoscope mechanics, patient states, menu systems).

  • Designed immersive UI elements and game flow (tablet menus, floating prompts, gesture logic).

  • Developed multimodal interfaces incorporating voiceover, audio cues, and haptics (see the sketch below).
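
As one illustration of that multimodal pairing, here is a minimal sketch of coupling an audio cue with a short haptic pulse on Quest 2 controllers. It assumes Meta's Oculus Integration SDK (OVRInput); the MultimodalCue class, its parameters, and the pulse values are hypothetical, not taken from the project.

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical sketch: play an audio cue (e.g., a confirmation tone or
// voiceover line) and fire a brief haptic pulse on the acting controller.
public class MultimodalCue : MonoBehaviour
{
    [SerializeField] AudioSource cueAudio;

    public void Play(OVRInput.Controller hand)
    {
        cueAudio.Play();
        StartCoroutine(Pulse(hand, 0.1f));
    }

    IEnumerator Pulse(OVRInput.Controller hand, float duration)
    {
        OVRInput.SetControllerVibration(0.5f, 0.8f, hand); // frequency, amplitude
        yield return new WaitForSeconds(duration);
        OVRInput.SetControllerVibration(0f, 0f, hand);     // stop the motor
    }
}
```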


Iteration:

  • Conducted internal headset tests and SME feedback cycles to refine usability and realism.

  • Logged and tracked user issues via JIRA; authored detailed UX tickets tied to logic, scoring, and flow adjustments.


Project Management:

  • Delivered sprint-level planning and velocity tracking.

  • Negotiated staggered delivery phases to preserve quality within deadlines.

Solution

Modes Implemented:

  • Learn Mode: Step-by-step guided training with instructional voiceovers and anatomical overlays.

  • Practice Mode: Semi-guided exploration with scoring hints and corrective feedback.

  • Assessment Mode: Unguided diagnostic challenge with live scoring and patient condition variability.


Key Features:

  • Multiple patient avatars with variable pathologies (e.g., myocardial infarction, COPD).

  • Interaction fidelity: controller gestures mapped to palpation zones, snap/feedback logic, and realistic tool handling.

  • Clinical fidelity: dynamic auscultation with spatialized heart sounds and context-sensitive audio cues (see the sketch below).
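
A minimal sketch of how zone-based, pathology-aware auscultation audio might be wired up in Unity; the AuscultationZone class and its clip fields are hypothetical stand-ins for illustration, not the project's implementation.

```csharp
using UnityEngine;

// Hypothetical sketch: each chest landmark carries a fully spatialized
// AudioSource, and the clip swaps with the patient's current condition,
// so the same zone can present normal or pathological sounds.
public class AuscultationZone : MonoBehaviour
{
    [SerializeField] AudioSource source;
    [SerializeField] AudioClip normalHeart;
    [SerializeField] AudioClip murmur; // pathology-specific variant

    public void OnStethoscopePlaced(bool patientHasMurmur)
    {
        source.spatialBlend = 1f; // fully 3D, so the sound tracks the zone
        source.clip = patientHasMurmur ? murmur : normalHeart;
        source.loop = true;
        source.Play();
    }

    public void OnStethoscopeRemoved() => source.Stop();
}
```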

Outcome

Impact:

  • Delivered a functioning VR training simulation with high clinical realism and minimal post-production revision.

  • Successfully implemented all core features (scoring, multiple patients, guided modes, multiplayer logic).


Recognition:

  • Developer feedback cited UX mapping as “highly efficient and implementation-ready.”

  • Credited by a senior developer with “salvaging” the project.

  • Successfully aligned with Yale School of Medicine stakeholders on the design vision, ensuring clear interface flow and patient logic.


Project Wins:

  • Multimodal integration (gesture, voice, haptics) approved by subject matter experts.

  • Scoring logic supported multiple exam pathways, preserving realism and user autonomy (see the sketch below).
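
A minimal sketch of what pathway-agnostic scoring can look like: required steps live in a set, so students earn credit in any clinically valid order rather than following a single scripted sequence. The step IDs and class are hypothetical, not the project's scoring model.

```csharp
using System.Collections.Generic;

// Hypothetical sketch of order-independent exam scoring.
public class ExamScore
{
    readonly HashSet<string> required = new HashSet<string>
    {
        "inspect", "palpate_pmi", "auscultate_aortic", "auscultate_mitral"
    };
    readonly HashSet<string> completed = new HashSet<string>();

    // Record a step in any order; duplicates are ignored automatically.
    public void RecordStep(string stepId)
    {
        if (required.Contains(stepId))
            completed.Add(stepId);
    }

    // Fraction of required steps completed, regardless of sequence.
    public float Score => (float)completed.Count / required.Count;
}
```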

Reflection

What Went Well:

  • Spatial prototyping in ShapesXR minimized rework; nearly all elements went to final without major redesign.

  • The flexibility to move between roles (Designer → PM) kept the project on track.

  • The project demonstrated VR’s potential for clinical education by making abstract clinical skills physically embodied.


Challenges Faced:

  • Mid-project team disarray demanded diplomacy, leadership, and rapid triage of incomplete tasks and blockers.

  • Designing tactile, voice-responsive interactions in a dynamic VR space required nuanced iteration.


Takeaways:

  • VR UX design requires not only user-centric thinking but also spatial, embodied logic and cross-sensory integration.

  • Prior project-management experience proved invaluable during mid-project uncertainty, making it possible to sustain both design execution and project stability.

  • Documentation and prototyping tools like ShapesXR are indispensable for communicating complex interaction systems.