Monday, April 10, 2017

"Design and Real-World Evaluation of Eyes-Free Yoga: An Exergame for Blind and Low-Vision Exercise" by CS Assistant Professor and Public Digital Arts Cluster member Kyle Rector (et al) will appear in Volume 9 Issue 4, April 2017 of ACM Transactions on Accessible Computing (TACCESS).


Abstract: People who are blind or low vision may have a harder time participating in exercise due to inaccessibility or lack of encouragement. To address this, we developed Eyes-Free Yoga, a Microsoft Kinect-based exergame that acts as a yoga instructor and provides personalized auditory feedback based on skeletal tracking. We conducted two studies on two versions of Eyes-Free Yoga: (1) a controlled study with 16 people who are blind or low vision to evaluate the feasibility of a proof-of-concept and (2) an 8-week in-home deployment study with 4 people who are blind or low vision, using a fully functioning exergame containing four full workouts and motivational techniques. We found that participants preferred the personalized feedback for yoga postures during the laboratory study. Therefore, the personalized feedback was used to build the core components of the system used in the deployment study and was included in both study conditions. From the deployment study, we found that the participants practiced yoga consistently throughout the 8-week period (average hours = 17; average days of practice = 24), almost reaching the American Heart Association's recommended exercise guidelines. On average, the motivational techniques improved participants' user experience and increased their exercise frequency and time. The findings of this work have implications for eyes-free exergame design, including engaging domain experts, piloting with inexperienced users, using musical metaphors, and designing for in-home use cases.
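The core loop the abstract describes, skeletal tracking feeding personalized auditory corrections, can be sketched roughly as follows. This is an illustrative sketch only, not the paper's implementation: the function names, the elbow-angle example, and the 10-degree tolerance are assumptions, and a real system would read joint positions from the Kinect SDK rather than hard-coded points.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b formed by 3D points a-b-c."""
    ab = [a[i] - b[i] for i in range(3)]
    cb = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(ab, cb))
    mag = math.sqrt(sum(x * x for x in ab)) * math.sqrt(sum(x * x for x in cb))
    # Clamp to guard against floating-point drift outside acos's domain.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

def posture_feedback(angle, target, tolerance=10.0):
    """Return a spoken correction based on deviation from the target angle."""
    if abs(angle - target) <= tolerance:
        return "Good, hold that position."
    if angle < target:
        return "Straighten your arm a little more."
    return "Bend your arm a little more."

# Hypothetical skeleton joints (shoulder, elbow, wrist) in meters;
# a straight arm should give an elbow angle near 180 degrees.
shoulder, elbow, wrist = (0.0, 1.4, 0.0), (0.3, 1.4, 0.0), (0.6, 1.4, 0.0)
angle = joint_angle(shoulder, elbow, wrist)
print(posture_feedback(angle, target=180.0))
```

In the deployed system such verbal cues were delivered as audio, which is what makes the game playable without vision.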


The e-publication is available here. Additional information on "Eyes-Free Yoga" may be found here.