Ford VR Authoring

The project originated within the HSSMI research partnership with Ford and evolved from early CAD import tests into a full authoring proof-of-concept built with the Immerse SDK.

Role

VR UX Designer / Producer

Target Hardware

Oculus Rift

Industries

Automotive Manufacturing / Industrial Design

Date

Jun 2018 - Mar 2020 (intermittent development)

Problem

Operational Challenge

  • Accelerate production setup and improve efficiency by enabling operator training and layout feedback before workstations are physically built.


Technical Limitations

  • Early Unity tests were disrupted by inconsistent CAD hierarchies and mismatched unit scales (see the import-normalization sketch after this list).

  • No existing tools allowed real-time interaction sequencing or spatial validation in VR.
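
A minimal Unity editor sketch of the kind of import normalization these inconsistencies call for; the class name and the millimetre assumption are illustrative, not the project's actual pipeline.

```csharp
using UnityEditor;

// Illustrative editor script: forces every imported model to a known metric
// scale rather than trusting the CAD file's embedded (and inconsistent) units.
public class CadImportNormalizer : AssetPostprocessor
{
    void OnPreprocessModel()
    {
        var importer = (ModelImporter)assetImporter;
        importer.useFileScale = false;  // ignore the file's own unit metadata
        importer.globalScale = 0.001f;  // assumed: source geometry authored in millimetres
    }
}
```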


Usability Gap

  • Steep learning curve for non-developers wanting to create training scenarios.

  • Lack of tactile or feedback-driven user interfaces in traditional CAD or 2D-based tools.

My Role

  • Design Ownership: Led immersive UX design across spatial interaction, flow architecture, prototyping, and iterative testing

  • Cross-Functional Collaboration: Worked with Ford stakeholders, Immerse leadership, developers, and artists to align user flows and authoring logic

  • Documentation & Delivery: Produced user flows, state-driven interaction specs, interface logic, and co-created presentation storyboards

  • Project Management: Scoped and managed JIRA tickets, QA sessions, and design-to-dev handoffs

Process

Research & Discovery

  • Participated in CAD-to-Unity feasibility tests; identified model-hierarchy and scaling inconsistencies.

  • Ran frequent in-VR tests with internal staff to drive iterative UX refinement.


System Design

  • Designed full platform flow from authoring to playback and data reporting.

  • Scoped an MVP for early deployment, which later informed the Immerse SDK's Guides functionality.


Prototyping & Testing

  • Scripted, captured, and narrated the concept video demonstrating the full UX.

  • Personally tested and iterated on early VR interface designs within Unity, refining interaction mechanics after each round of internal feedback.

  • Proposed haptic feedback, snap-to-path mechanics, timeline editing, and contextual UI.

Solution

Interaction System

  • In-VR contextual UI for selecting and manipulating pre-defined model parts.

  • Supported both linear and freehand path recording, with immediate preview and editing.

  • Playback system included snap-based path guidance with visual and haptic confirmation (a minimal sketch of the snapping logic follows this list).
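
A minimal sketch of the snap-to-path guidance, assuming the recorded path is stored as a polyline of waypoints; the class, field names, and thresholds are hypothetical.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal sketch of snap-based playback guidance: while the operator moves a
// part, find the closest point on the recorded path and snap when in range.
public class SnapToPathGuide : MonoBehaviour
{
    public List<Vector3> waypoints = new List<Vector3>();
    public float snapRadius = 0.05f; // metres
    public Transform heldPart;       // part the operator is currently moving

    void Update()
    {
        if (heldPart == null || waypoints.Count < 2) return;

        Vector3 nearest = ClosestPointOnPath(heldPart.position);
        if ((heldPart.position - nearest).magnitude <= snapRadius)
        {
            heldPart.position = nearest; // visual snap confirmation
            // A haptic pulse would fire here, e.g. via the Oculus
            // Integration's controller vibration API.
        }
    }

    // Projects the point onto every segment of the polyline and returns
    // the closest projection found.
    Vector3 ClosestPointOnPath(Vector3 point)
    {
        Vector3 best = waypoints[0];
        float bestSqr = float.MaxValue;
        for (int i = 0; i < waypoints.Count - 1; i++)
        {
            Vector3 a = waypoints[i];
            Vector3 ab = waypoints[i + 1] - a;
            float t = ab.sqrMagnitude > 0f
                ? Mathf.Clamp01(Vector3.Dot(point - a, ab) / ab.sqrMagnitude)
                : 0f;
            Vector3 candidate = a + ab * t;
            float sqr = (point - candidate).sqrMagnitude;
            if (sqr < bestSqr) { bestSqr = sqr; best = candidate; }
        }
        return best;
    }
}
```

Keeping the projection purely geometric means the same routine could serve both immediate preview during authoring and snap guidance during playback.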


Mocked-Up Future Capabilities (Demonstrated in Build)

  • Customizable haptic zones (rumble, click) along movement paths

  • Timeline-based sequencing (manual or path-driven)

  • Step-editable VO guidance with scrubbable timeline control (one possible data model for these capabilities is sketched after this list)
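
For illustration, one plausible data model covering these three capabilities; every name below is an assumption rather than the shipped schema.

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

// Illustrative data model only: one way an authored step could be serialized,
// tying together path, timeline position, VO, and haptic zones.
[Serializable]
public class AuthoredStep
{
    public string partId;                // pre-defined model part this step moves
    public List<Vector3> path;           // recorded linear or freehand path
    public float timelineStart;          // seconds; manual or path-driven timing
    public string voClipName;            // step-editable voice-over guidance
    public List<HapticZone> hapticZones; // rumble/click regions along the path
}

[Serializable]
public class HapticZone
{
    [Range(0f, 1f)] public float start;  // normalized position along the path
    [Range(0f, 1f)] public float end;
    public float amplitude;              // rumble strength
    public bool click;                   // single click vs. sustained rumble
}
```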

Outcome

Functional Deliverables

  • Working proof of concept built in Unity.

  • Concept video used for internal alignment and executive presentation.

  • Full UX flow diagrams, context menu specifications, and storyboard assets.


Key Results

  • Earned executive endorsement and contributed to a three-year Ford roadmap (paused due to COVID-19).


Engineering Wins

  • Multiple features implemented from the UX specs: path definition, previews, animated snap targets, complex haptics, and playback loops.

  • The authored UX directly influenced the creation of the Guides system in the Immerse SDK (Lines, Points, Planes for spatial alignment; the underlying snapping math is sketched below).
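
As a rough illustration of what Lines, Points, and Planes mean as alignment targets, the snapping reduces to vector projection; this is plain geometry, not the Immerse SDK's actual API surface.

```csharp
using UnityEngine;

// Sketch of the snapping math behind the three guide primitives.
public static class GuideSnap
{
    // Point guide: the target position is the point itself.
    public static Vector3 ToPoint(Vector3 p, Vector3 point) => point;

    // Line guide: project p onto the infinite line through origin along dir.
    public static Vector3 ToLine(Vector3 p, Vector3 origin, Vector3 dir)
    {
        Vector3 d = dir.normalized;
        return origin + d * Vector3.Dot(p - origin, d);
    }

    // Plane guide: remove the component of p along the plane normal.
    public static Vector3 ToPlane(Vector3 p, Vector3 origin, Vector3 normal)
    {
        Vector3 n = normal.normalized;
        return p - n * Vector3.Dot(p - origin, n);
    }
}
```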

Reflection

Challenges Faced

  • Inconsistent CAD model hierarchies and scaling required early pivots.

  • Creating intuitive spatial UI for complex 3D authoring workflows.

  • Designing for non-expert creators within a technical platform.


Highlights

  • Led immersive UX design from concept to prototype.

  • Applied system-level thinking across spatial interactions.

  • Influenced platform tooling through project-driven insight.