- Title:
- Pen + Touch = New Tools
- Reference Information:
- Ken Hinckley, Koji Yatani, Michel Pahud, Nicole Coddington, Jenny Rodenhouse, Andy Wilson, Hrvoje Benko, and Bill Buxton. 2010. Pen + touch = new tools. In <em>Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology</em> (UIST '10). ACM, New York, NY, USA, 27-36. DOI: 10.1145/1866029.1866036 http://doi.acm.org/10.1145/1866029.1866036
- UIST 2010 New York, New York.
- Author Bios:
- Ken Hinckley has had 51 papers published by the ACM in the last 20 years. He is affiliated with both the University of Virginia and Carnegie Mellon University, and has also worked with Microsoft Research.
- Koji Yatani is finishing up his Ph.D. this summer at the University of Toronto and will be working at Microsoft Research beginning this fall. His interests include mobile devices and hardware for sensing technologies.
- Michel Pahud is a Senior RSDE at Microsoft Research, which he joined in 2000. He received his Ph.D. in parallel computing from the Swiss Federal Institute of Technology and won the LOGITECH prize for an industrially oriented, innovative hardware/software senior project.
- Nicole Coddington has worked with Microsoft Research. She has two ACM publications on bimanual input, several of them with the same co-authors as this paper.
- Jenny Rodenhouse currently holds a position as an Experience Designer II in the Interactive Entertainment Division at Microsoft. She received her Bachelor of Industrial Design from Syracuse University in 2008.
- Andrew (Andy) Wilson is a senior researcher at Microsoft Research. Wilson received his Ph.D. at the MIT Media Laboratory and researches new gesture-related input techniques.
- Hrvoje Benko received his Ph.D. in Computer Science from Columbia University in 2007 and has more than 25 conference papers published. Benko researches novel interactive computing technologies.
- William (Bill) Buxton became the third recipient of the Canadian Human-Computer Communications Society Award in 1995, among many other honors. He completed his Master of Science in Computer Science at the University of Toronto. He is currently a Principal Researcher at Microsoft Research.
- Summary
- Hypothesis:
- Embracing a wide range of both unimodal and bimodal input modes will declutter the user interface by requiring fewer persistent on-screen items.
- Trying multiple ideas helps expose both flaws in those ideas and insights into their effectiveness.
- Methods
- Prior to creating a prototype, the researchers asked participants to make a scrapbook from physical items placed in front of them. The researchers observed the participants' actions and categorized them to inform the software design. The prototype's hardware combined a Microsoft Surface with an infrared pen. The software, developed from the patterns that emerged during the physical scrapbook construction, supported operations such as stapling and cutting pictures, plus a bezel menu for creating new objects (such as sticky notes).
- Results
- This study revealed an approach fluid enough that input never got stuck in a specific (e.g., tool) mode. While the range of options for a given action was generally large, the default action of drawing with the pen could always be restored by lifting the non-dominant hand off the screen. A couple of users specifically commented on the ease of using the system, saying the gestures felt natural and matched what they would expect to perform physically. Feedback provided by the system about the current action may aid some users, particularly novices.
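The mode rule described above can be sketched as a small state machine: the pen always inks by default, and it acts as a tool only while the non-dominant hand's touch is held. This is a minimal illustrative sketch, not the paper's actual implementation; all names here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class InputState:
    # True while the non-dominant hand is touching (e.g., holding a photo)
    touch_down: bool = False

def pen_action(state: InputState) -> str:
    """Return what a pen stroke does in the current input state."""
    # Touch held -> pen becomes a context-specific tool (e.g., cut, staple);
    # no touch -> pen falls back to its default inking behavior.
    return "tool" if state.touch_down else "ink"

state = InputState()
assert pen_action(state) == "ink"   # pen alone: default drawing
state.touch_down = True
assert pen_action(state) == "tool"  # touch + pen: tool mode
state.touch_down = False            # lifting the hand...
assert pen_action(state) == "ink"   # ...always restores drawing
```

The key design property this captures is that the system carries no persistent mode: releasing the touch is always sufficient to return to drawing.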
- Contents
- Touch-based input systems are continually growing, as is their user base. Limitations in current systems make them feel unnatural and thus difficult to interact with and use. The research in this paper tests new techniques that use both a bare hand and a digital pen as input devices. One complaint the researchers received was that some of their input gestures conflicted with gestures users had become accustomed to (through products such as smartphones). However, just because a user is unaccustomed to a gesture does not mean the gesture is unnatural. This research is simply meant to expose possible new ideas and suggest further combinations to be investigated in the future.
- Discussion
- The researchers had a modest goal, simply to try out a variety of bimodal input gestures, which they accomplished. Their approach of studying actions in a physical setting, classifying them, and then using that information to develop virtual gestures would appear to be how more input systems should be developed. The more natural a system is to use, relative to habits humans already have, the less awkward interaction will be and the more beneficial and efficient the entire experience becomes. I particularly like the bimodal aspect of input, as it would seemingly improve the speed at which tasks can be accomplished. Observing the participants build the physical scrapbook showed, not surprisingly, that they used both hands, as they surely have for most tasks their entire lives. Even watching professional graphic designers today reveals how quickly work can get done when an effective bimodal input system is utilized.
Picture Source: "Pen + Touch = New Tools"