Lumeto

Designing a scalable XR training platform from concept to enterprise deployment.

0 → 1 enterprise XR training platform deployed to 15,000+ users.

Role
UX/UI Designer
Tenure
Jan 2021 – Nov 2021
Scope
Involve XR — Multiplayer training platform for healthcare and public safety
Environment
Enterprise XR · Research-led design · Unity collaboration
Technologies
Figma · Unity (collaborative) · XR interaction systems

Context

Traditional actor-led crisis simulation training is expensive, difficult to scale, and logistically complex. As a result, trainees often receive limited practice before facing real-world, high-stakes scenarios.

Lumeto set out to build a scalable, data-informed XR training ecosystem that could support both synchronous multiplayer simulations and asynchronous practice environments.

Jay Street XR training environment used for scenario simulations.
Environment design supported spatial navigation and scenario immersion.

The Challenge

Design immersive training experiences that:

  • Preserve realism
  • Support measurable learning outcomes
  • Scale across institutions
  • Operate within XR performance and interaction constraints

All while transitioning from concept to a deployable enterprise platform.

What I Did

  • Led UX across end-to-end VR and web-based training flows
  • Translated SME-driven curriculum into structured interaction systems
  • Conducted stakeholder interviews with training facilitators and domain experts
  • Synthesized research into journey maps and scenario logic
  • Defined core design principles: Immersion, Interactivity, Standardization
  • Delivered a proof of concept featuring 4 distinct scenarios within a 3-month timeline
  • Designed information architecture across both VR and web interfaces
  • Collaborated with Unity engineers to refine interaction behavior and performance constraints
  • Contributed to narrative design, environment planning, and avatar direction

Controller interaction model designed for intuitive VR navigation and object interaction.
Mappings were tested in simulation scenarios to ensure actions such as teleportation, view rotation, and object manipulation remained discoverable under stress.

System Architecture

The platform functioned as:

  • A synchronous multi-user training hub
  • An asynchronous practice and assessment environment
  • A cross-surface experience (VR + Web)

Design decisions accounted for:

  • Spatial interaction logic
  • Motion capture integration
  • NPC behavioral design
  • Data-informed evaluation potential

Scenario architecture linking planning tools with immersive VR training environments.
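To make the scenario architecture concrete, the structure described above — roles, steps, sync/async modes, and cross-surface delivery — can be sketched as an explicit data model. Everything below (type names, fields, and the `requiredRoles` helper) is a hypothetical illustration, not the platform's actual schema:

```typescript
// Hypothetical sketch of an explicitly modeled training scenario.
// Field names are illustrative assumptions, not Lumeto's real data model.
type Role = "facilitator" | "trainee" | "observer";

interface ScenarioStep {
  id: string;
  prompt: string;      // instruction surfaced to the participant
  requiredRole: Role;  // who must perform this step
  npcBehavior?: string; // optional NPC response keyed to this step
}

interface Scenario {
  name: string;
  mode: "synchronous" | "asynchronous";
  surfaces: ("vr" | "web")[]; // cross-surface delivery (VR + Web)
  steps: ScenarioStep[];
}

// Roles a session must fill before this scenario can begin.
function requiredRoles(scenario: Scenario): Role[] {
  return [...new Set(scenario.steps.map((s) => s.requiredRole))];
}

const drill: Scenario = {
  name: "Jay Street triage drill",
  mode: "synchronous",
  surfaces: ["vr", "web"],
  steps: [
    { id: "assess", prompt: "Assess the patient", requiredRole: "trainee" },
    { id: "debrief", prompt: "Lead the debrief", requiredRole: "facilitator" },
  ],
};

console.log(requiredRoles(drill)); // ["trainee", "facilitator"]
```

Modeling scenarios this way is one way to keep planning tools and the immersive runtime in sync: both surfaces read from the same structured definition rather than duplicating scenario logic.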

Impact

15,000+

users across Ontario

Adopted within healthcare and public safety training contexts.

User scale reflects real-world deployment following platform launch.

Reflection

Designing for XR amplifies ambiguity. Interaction friction and unclear information architecture become significantly more disruptive in spatial environments.

What this project reinforced for me is that immersive systems still depend on clear structural logic. Scenarios, roles, and interactions must be modeled explicitly before they can become believable experiences.

This work strengthened my ability to translate complex learning objectives into structured systems that balance immersion, usability, and technical constraints.