Gesture Ideation & Testing
Based on prior experience with OpenCV, I worked closely with the engineer to develop a gesture library. Our goals were gestures that were reliably recognized by the sensor, intuitive, and easy to execute.
We assumed the user was a novice, likely with zero prior experience with Leap Motion.
At the time, Leap Motion was fairly new and its gesture library was not well defined. Through tinkering and trying out simple demos, I gained familiarity with the capabilities (gesture triggers) and limits (sensitivity to lighting) of the technology.
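The gesture-trigger experiments above can be sketched roughly as follows. This is a hypothetical illustration, not the actual Leap Motion SDK; the frame data, thresholds, and names are invented for the example.

```python
# Hypothetical sketch of a gesture trigger: classify a sequence of
# per-frame palm x-positions (in mm) as a swipe. Thresholds filter out
# idle jitter, similar in spirit to a sensor's built-in triggers.

def detect_swipe(palm_xs, min_distance=80.0, min_speed=40.0):
    """Return 'swipe-right', 'swipe-left', or None.

    A swipe must travel at least `min_distance` mm overall and average
    `min_speed` mm per frame; anything slower or shorter is ignored.
    """
    if len(palm_xs) < 2:
        return None
    distance = palm_xs[-1] - palm_xs[0]
    speed = abs(distance) / (len(palm_xs) - 1)
    if abs(distance) < min_distance or speed < min_speed:
        return None
    return "swipe-right" if distance > 0 else "swipe-left"
```

In practice, tuning thresholds like these against real sensor noise (and lighting conditions) was a large part of learning what the hardware could reliably recognize.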
Prior to any prototyping or designs, I asked co-workers to imitate specific actions (e.g., zoom, pan) using their hands to understand whether there were conceptually common gestures for specific actions. Given the user of this product, the learning curve had to be minimal for high user engagement.
Easy to Execute
Finally, based on the tech capabilities and which gestures were intuitive, we created some low-fidelity prototypes and had users interact with them in an exploratory manner. Although some gestures were intuitive, executing them over a prolonged period proved difficult.
Using the knowledge gained from Gesture Ideation & Testing, we began working closely with visual designers to marry the results with the creative direction.
Throughout this process, there were multiple iterations at the micro (individual components) and macro (entire schematic) levels as we explored options and uncovered problems not previously encountered.
One of the biggest problems encountered was users getting 'lost' in the environment. We had tried to give users as much freedom as possible to explore the product however they wanted; however, too much freedom resulted in a poor UX. Our solution was to make the macro-environment linear (step by step) while maintaining user freedom in the micro-environment (select areas of the product).
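The linear-macro, free-micro model above can be sketched as a small navigation structure. This is a minimal illustration under my own assumptions; the step and area names are invented, not from the actual product.

```python
# Sketch of the navigation model: the macro environment advances
# strictly in order, while each step permits free exploration of its
# own micro areas. Step/area names are illustrative only.

class GuidedExperience:
    def __init__(self, steps):
        # steps: ordered list of (step_name, list_of_micro_areas)
        self.steps = steps
        self.index = 0

    @property
    def current_step(self):
        return self.steps[self.index][0]

    def explore(self, area):
        # Micro freedom: any area within the current step is allowed.
        name, areas = self.steps[self.index]
        if area not in areas:
            raise ValueError(f"'{area}' is not part of step '{name}'")
        return f"exploring {area} in {name}"

    def advance(self):
        # Macro linearity: the only way forward is the next step,
        # which keeps users from getting 'lost' in the environment.
        if self.index < len(self.steps) - 1:
            self.index += 1
        return self.current_step
```

Constraining the `advance` path while leaving `explore` open is the whole trick: users always know where they are in the story, but still feel free within each scene.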
Throughout the process, one of the challenges was acting as a liaison between the visual designers and the engineer. Because we were using a glasses-free 3D Alioscopic display, it was important to make sure the visual designs translated well to the 3D environment.
Often, the engineer had to make changes based on an existing design, and there were many working sessions to carry the branding and vision into a 3D world that served both the product objectives and the technology being used.
Once a high-fidelity prototype was available, I conducted some usability testing to validate previous research and ensure product and business goals were met.
Given the time crunch and our novice, low-tech persona, I recruited several co-workers who were unfamiliar with the project and described themselves as not 'tech savvy'.
Results validated our previous research on gestures and our story flow, with users able to successfully complete the experience with only minor issues.
One area of frustration was users defaulting to a simple 'cursor' gesture to navigate. To encourage the more dynamic gestures, we built in animations and visual easter eggs triggered by those gestures, increasing engagement and providing that 'wow' factor.
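The easter-egg idea above amounts to a small dispatch rule: cursor movement always works, but richer gestures unlock extra feedback. A minimal sketch, with gesture and animation names invented for illustration:

```python
# Hypothetical gesture-to-feedback routing: the default 'cursor'
# gesture navigates plainly, while dynamic gestures also trigger a
# bonus animation. All names here are illustrative, not the product's.

EASTER_EGGS = {
    "circle": "spin-highlight",
    "grab": "explode-view",
    "two-hand-zoom": "depth-flythrough",
}

def handle_gesture(gesture):
    """Return (action, animation) for a recognized gesture.

    Cursor input gets plain navigation with no animation; dynamic
    gestures additionally unlock an easter-egg animation, nudging
    users away from cursor-only interaction.
    """
    if gesture == "cursor":
        return ("navigate", None)
    return ("navigate", EASTER_EGGS.get(gesture))
```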