Mixed Reality mode with precision world anchoring and hand tracking improves contextual understanding of the instructions.
Develop an Augmented Reality (AR) capability that aids a maintainer with replacing the brake pads, brake discs, fuel filter, and water separator on an Ultra Light Tactical Vehicle (ULTV). The AR capability shall provide a reach-back function in which the video feed of the maintainer's perspective can be wirelessly live-streamed from one Electronics Maintenance Support System (EMSS) to another EMSS. The EMSS shall be the host system for the AR capability. The Prototype Device shall be designed with a platform-agnostic architecture, so the capability can be expanded to maintenance actions on multiple Marine Corps platforms.
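At its core, the reach-back function amounts to pushing the maintainer's camera feed over the network to a second EMSS. The Python sketch below is a minimal illustration only: it length-prefixes JPEG-encoded frames over a TCP socket. The receiver hostname, port, codec, and transport are assumptions, and a fielded system would likely use a hardened streaming stack with encryption rather than this simple loop.

```python
import socket
import struct

import cv2  # pip install opencv-python

# Hypothetical address of the receiving EMSS; the actual reach-back link,
# transport, and security controls are not specified in this description.
RECEIVER_ADDR = ("emss-receiver.local", 5600)


def stream_maintainer_view(camera_index: int = 0) -> None:
    """Capture the maintainer's camera feed and push JPEG frames to a peer EMSS."""
    cap = cv2.VideoCapture(camera_index)
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect(RECEIVER_ADDR)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # Encode each frame as JPEG and length-prefix it so the receiving
            # EMSS can re-frame the byte stream into individual images.
            ok, jpeg = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 80])
            if not ok:
                continue
            payload = jpeg.tobytes()
            sock.sendall(struct.pack("!I", len(payload)) + payload)
    finally:
        cap.release()
        sock.close()


if __name__ == "__main__":
    stream_maintainer_view()
```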
The problem the prototype device is intended to solve is a lack of ground vehicle maintenance training and overall maintenance experience. Vehicle maintainers in the Marine Corps have widely varying levels of experience, with some younger Marines having little to no background in vehicle maintenance. To close this knowledge gap, the Marine Corps is exploring AR as a real-time visual aid during vehicle maintenance. The AR capability prototype device is intended to automatically detect vehicle components that are integral to the maintenance task at hand. Upon component detection, the corresponding authoritative technical manuals and relevant schematics are presented to the maintainer automatically.
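To make the detect-then-present behavior concrete, the Python sketch below shows one way detections from a vision model could be mapped to authoritative manual references for the four ULTV maintenance tasks. The component labels, confidence threshold, and work-package titles are placeholders for illustration, not the actual publication identifiers or detector output format.

```python
from dataclasses import dataclass

# Hypothetical mapping from detected ULTV components to technical-manual
# references; real identifiers would come from the authoritative publications.
MANUAL_REFERENCES = {
    "brake_pad": "TM ULTV Brakes, Work Package 12: Brake Pad Replacement",
    "brake_disc": "TM ULTV Brakes, Work Package 13: Brake Disc Replacement",
    "fuel_filter": "TM ULTV Fuel System, Work Package 7: Fuel Filter Service",
    "water_separator": "TM ULTV Fuel System, Work Package 8: Water Separator Service",
}


@dataclass
class Detection:
    label: str         # component class predicted by the vision model
    confidence: float  # detector confidence in [0, 1]


def manuals_for_detections(detections, min_confidence: float = 0.6):
    """Return manual references to surface for sufficiently confident detections."""
    hits = []
    for det in detections:
        ref = MANUAL_REFERENCES.get(det.label)
        if ref and det.confidence >= min_confidence:
            hits.append((det.label, ref))
    return hits


if __name__ == "__main__":
    # Stand-in output from a hypothetical component detector on the headset feed.
    frame_detections = [Detection("brake_disc", 0.91), Detection("fuel_filter", 0.48)]
    for label, ref in manuals_for_detections(frame_detections):
        print(f"Detected {label}: open {ref}")
```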
We partnered with local Polaris dealers and ULTV owners to capture the end-to-end brake replacement process in a working mechanic's shop environment.
To create an immersive and comprehensive training experience, we developed a multi-layered solution leveraging advanced technology. A VR headset with precise head and hand tracking captured the expert mechanic's movements in real time while they worked on the brake system in mixed reality. This motion data was coupled with audio recordings, meticulously mapped to the step-by-step brake replacement instructions. We also made 3D Gaussian splat spatial recordings throughout the process, capturing detailed spatial information at each stage. This allowed us to generate before-and-after 3D Gaussian splats for each step, along with contextual splats of the complete vehicle that record the precise location of the headset relative to the side-by-side Ultra Light Tactical Vehicle, providing unparalleled detail for analysis and learning.
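The sketch below shows one way a capture session might be organized per instruction step, assuming a simple per-step record that ties the expert's tracked head and hand poses, the narration audio, and the before-and-after Gaussian splat files together. The field names, pose layout, and file formats are illustrative assumptions rather than the project's actual data schema.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Assumed six-DoF pose layout: (x, y, z, qx, qy, qz, qw).
Pose = Tuple[float, float, float, float, float, float, float]


@dataclass
class PoseSample:
    timestamp_s: float        # seconds from session start
    head_pose: Pose
    left_hand_pose: Pose
    right_hand_pose: Pose


@dataclass
class StepCapture:
    step_id: int
    instruction: str          # step text from the brake-replacement procedure
    audio_clip: str           # narration recorded while the expert worked
    motion_track: List[PoseSample] = field(default_factory=list)
    splat_before: str = ""    # 3D Gaussian splat captured before the step
    splat_after: str = ""     # 3D Gaussian splat captured after the step


def poses_during(step: StepCapture, start_s: float, end_s: float) -> List[PoseSample]:
    """Slice the expert's tracked motion to a time window within the step."""
    return [p for p in step.motion_track if start_s <= p.timestamp_s <= end_s]


if __name__ == "__main__":
    step = StepCapture(
        step_id=3,
        instruction="Remove the caliper bolts and slide the caliper off the brake disc.",
        audio_clip="step03_narration.wav",
        splat_before="step03_before.ply",
        splat_after="step03_after.ply",
    )
    origin: Pose = (0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0)
    step.motion_track.append(PoseSample(0.0, origin, origin, origin))
    print(len(poses_during(step, 0.0, 5.0)), "pose samples in the first five seconds")
```

Linking each pose sample and splat pair to a step identifier keeps the expert's demonstration queryable per instruction, which is what lets the training playback jump directly to the motion and spatial context for any given step.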