- Title:
- Sensing Foot Gestures from the Pocket
- Reference Information:
- Jeremy Scott, David Dearman, Koji Yatani, and Khai N. Truong. 2010. Sensing foot gestures from the pocket. In <em>Proceedings of the 23rd annual ACM symposium on User interface software and technology</em> (UIST '10). ACM, New York, NY, USA, 199-208. DOI=10.1145/1866029.1866063 http://doi.acm.org/10.1145/1866029.1866063
- UIST 2010 New York, New York.
- Author Bios:
- Jeremy Scott is a graduate student at the Massachusetts Institute of Technology; this research paper grew out of his undergraduate thesis.
- David Dearman is a professor at Dalhousie University. Over the last six years he has published 21 research papers through the ACM.
- Koji Yatani is finishing up his Ph.D. this summer at the University of Toronto and will be working at Microsoft Research beginning this fall. His interests include mobile devices and hardware for sensing technologies.
- Khai N. Truong is an Associate Professor at the University of Toronto. Truong's research is in improving the usability of mobile computing devices.
- Summary
- Hypothesis:
- The researchers hypothesize that foot gestures are both a plausible input method (they can be accurately recognized) and a socially acceptable one. This paper focuses primarily on the first of these two claims, with a future study planned to investigate the second.
- Methods
- The researchers conducted two small studies for this paper. The first measured the range of foot motion over which users could make accurate selections. The researchers also used feedback from this study to determine which movements (rotations of the foot) were the most comfortable to perform. The second study tested whether a cell phone with a three-axis accelerometer could recognize these selection gestures while in the user's pocket or mounted on their waist.
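The paper's actual recognition pipeline isn't reproduced here, but the core idea of recognizing foot gestures from a pocketed accelerometer can be sketched with a toy example: extract simple per-axis features from a window of (x, y, z) samples, then assign the window to the nearest gesture prototype. Everything below (the gesture names, the synthetic data, the nearest-centroid classifier) is an illustrative assumption, not the authors' method:

```python
# Hypothetical sketch: classifying foot-gesture accelerometer windows
# with per-axis mean/variance features and a nearest-centroid classifier.
# Synthetic data for illustration only; not the paper's actual pipeline.
import math


def features(window):
    """Per-axis mean and variance of a list of (x, y, z) samples."""
    n = len(window)
    feats = []
    for axis in range(3):
        vals = [s[axis] for s in window]
        mean = sum(vals) / n
        var = sum((v - mean) ** 2 for v in vals) / n
        feats.extend([mean, var])
    return feats


def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]


def train(labeled_windows):
    """labeled_windows: dict mapping gesture name -> list of sample windows."""
    return {g: centroid([features(w) for w in ws])
            for g, ws in labeled_windows.items()}


def classify(model, window):
    """Return the gesture whose centroid is closest to the window's features."""
    f = features(window)
    return min(model, key=lambda g: math.dist(f, model[g]))
```

A real system would of course need features robust to how the phone shifts inside a pocket, which is one reason recognition there is harder than with a fixed side mount.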
- Results
- The first study primarily showed the researchers the range of motion that potential users could comfortably reach. Interviews after this study also revealed that rotations about the heel were the most comfortable movement to perform. The second study showed that a mobile device mounted on the user's side was the most effective at recognizing gestures, with the front pocket the next most accurate position. The researchers hypothesize that the pocket placement is less accurate than the side mount because the phone has room to shift around inside the pocket.
- Contents
- The research paper presents an alternative interaction method for mobile devices, one intended to be socially acceptable and to require no visual feedback. It would allow users to perform tasks on their phone, such as changing songs, without actually having to pull the device out. Other visual-feedback-free methods are already being investigated (see my blog post about Imaginary Interfaces, titled "Paper Reading #1") or already in use (such as voice commands). The goal of this investigation was to find a new method that is accurate while avoiding social awkwardness.
- Discussion
- Immediately upon reading this article I recalled the Imaginary Interfaces paper. Both papers study input methods that don't require visual feedback, and both face a question of accuracy, since a system with a low recognition rate would essentially defeat its own purpose. This paper is especially exciting because it requires nothing more than what a large percentage of people already have: a smartphone carried in a pocket. The early accuracy of this system is encouraging; the researchers have certainly shown what they set out to prove. The biggest disappointment is that they haven't yet evaluated the system in daily life. I am eager to learn more from their follow-up paper.
Picture Source: "Sensing Foot Gestures from the Pocket"