March 13 - 16, 2017
Limassol, Cyprus
Shumin Zhai is a Senior Staff Research Scientist at Google. He works on fundamental and practical aspects of human-computer interaction, particularly user interface design and development informed by scientific and technological insight. From 1996 to 2010 he was a Research Staff Member at the IBM Almaden Research Center, where he led product innovations and foundational user interface research. He originated and led the SHARK/ShapeWriter project and a start-up company that pioneered the touchscreen word-gesture keyboard paradigm: filing the first patents of the paradigm, publishing the first generation of scientific papers and dissertations in the area (by his former Ph.D. student Per Ola Kristensson), publicly releasing the first word-gesture keyboard in 2004 through IBM AlphaWorks, and releasing a top-ranked (6th) iPhone app, ShapeWriter WritingPad, in 2008. With his team and colleagues at Google, he continues to advance the state of the art of input and UI research and development.
His publications have won the ACM UIST Lasting Impact Award and an IEEE Computer Society Best Paper Award. He regularly serves on academic boards and committees and served as the 4th Editor-in-Chief of ACM Transactions on Computer-Human Interaction. He received his Ph.D. degree from the University of Toronto in 1995. In 2006, he was named one of ACM's inaugural class of Distinguished Scientists. In 2010 he was named a Member of the CHI Academy and an ACM Fellow.
Personal Website: www.shuminzhai.com
Word-Gesture Keyboard
Human-computer/information interaction is rapidly moving from a desktop model to a multi-device and information-cloud model. Text is an indispensable form of information. Against this background, the ShapeWriter project (with Per Ola Kristensson and ShapeWriter Inc.) pioneered gesture typing on smart keyboards as a new paradigm of information input on touch screens.
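To make the paradigm concrete, here is a minimal sketch of the core shape-matching idea behind gesture typing, not ShapeWriter's actual recognizer: the drawn trace and each candidate word's ideal path over the key centers are resampled to the same number of points and compared by mean point-to-point distance. The key coordinates and candidate lexicon below are hypothetical placeholders.

```python
# Minimal sketch of word-gesture (shape) matching: resample the drawn trace
# and each word's ideal key-center path to a fixed number of points, then
# rank candidates by mean point-to-point distance.
import math

KEY_CENTERS = {  # hypothetical normalized key centers (x, y); demo keys only
    't': (4.0, 0.0), 'i': (7.0, 0.0), 'o': (8.0, 0.0),
    'h': (5.5, 1.0), 'e': (2.0, 0.0),
}

def resample(points, n=32):
    """Resample a polyline to n equidistant points along its arc length."""
    if len(points) == 1:
        return points * n
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1e-9
    out, j = [], 0
    for i in range(n):
        target = total * i / (n - 1)
        while j < len(dists) - 2 and dists[j + 1] < target:
            j += 1
        seg = dists[j + 1] - dists[j] or 1e-9
        t = (target - dists[j]) / seg
        (x0, y0), (x1, y1) = points[j], points[j + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def shape_distance(trace, word, n=32):
    """Mean Euclidean distance between the trace and the word's key path."""
    template = resample([KEY_CENTERS[c] for c in word], n)
    sampled = resample(trace, n)
    return sum(math.hypot(ax - bx, ay - by)
               for (ax, ay), (bx, by) in zip(sampled, template)) / n

def recognize(trace, lexicon):
    return min(lexicon, key=lambda w: shape_distance(trace, w))

# A sloppy trace roughly over t-h-e should rank "the" above "tie" and "toe".
print(recognize([(4.1, 0.2), (5.4, 1.1), (2.2, 0.1)], ["the", "tie", "toe"]))
```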
FonePal
FonePal is a multi-channel, multi-modal, and multi-device solution to the "Touchtone Hell" problem of interactive voice response (IVR) systems.
Eye-tracking augmented user interfaces
The eye, being simultaneously the mind's window to the world and the world's window to the mind, has many potential applications in human-computer interaction. Two applications I have been involved in are MAGIC Pointing (with Ihde and Morimoto) and iTourist (with P. Qvarfordt). I have argued that, whenever possible, eye gaze should be used as a contextual and implicit, rather than a direct and explicit, modality of computer input.
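As an illustration of gaze as an implicit input channel, below is a minimal sketch in the spirit of MAGIC Pointing's "liberal" warping approach: the cursor is warped toward a new gaze fixation only when the fixation lies far from the current cursor position, and manual input still performs the final, precise selection. The threshold and offset values are illustrative assumptions, not the published parameters.

```python
# Minimal sketch of the MAGIC pointing idea ("liberal" variant): warp the
# cursor near a new gaze fixation that is far from the current cursor, and
# leave the final, precise selection to manual (mouse) movement.
import math

WARP_THRESHOLD_PX = 120    # only warp for large gaze-cursor separations (assumption)
GAZE_NOISE_MARGIN_PX = 20  # land short of the fixation to avoid overshoot (assumption)

def magic_cursor(cursor, gaze_fixation):
    """Return the (possibly warped) cursor position for a new gaze fixation."""
    cx, cy = cursor
    gx, gy = gaze_fixation
    dx, dy = gx - cx, gy - cy
    dist = math.hypot(dx, dy)
    if dist < WARP_THRESHOLD_PX:
        return cursor                          # small offsets: pure manual pointing
    scale = (dist - GAZE_NOISE_MARGIN_PX) / dist
    return (cx + dx * scale, cy + dy * scale)  # warp into the fixation's vicinity

# Manual input still closes the last few (noisy) pixels:
print(magic_cursor((100, 100), (600, 420)))   # warped near the gaze target
print(magic_cursor((100, 100), (140, 120)))   # unchanged; below threshold
```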
Laws of action
In any research field, establishing robust laws and regularities is fundamental to the field's development. This is considerably harder in HCI given the complexity of human performance, behavior, and experience. Fortunately, some perceptual-motor actions in HCI can be modeled by "Laws of Action". Contributions I have helped to make in this area include:
1. Clarifying a fundamental logical error in using a compound throughput (TP) metric to characterize computer input with Fitts' law (MT = a + b ID), and suggesting that a and b separately represent the non-informational and informational aspects of pointing (see the sketch following this list).
2. (With Kong and Ren) Understanding actual and nominal pointing precision in Fitts' law tasks.
3. (With J. Accot) A more rigorous understanding and formulation of 2D Fitts' law, or more precisely, models of pointing with simultaneous amplitude and directional constraints.
4. (With J. Accot) The regularities in crossing actions ("more than dotting the i's").
5. (With J. Accot) The steering law, its connection to Fitts' law, and its robustness against movement scaling.
6. (With X. Cao) The CLC model of gesture stroke complexity.
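As a numerical illustration of point 1, the sketch below evaluates Fitts' law with the intercept a and slope b kept separate rather than folded into a single throughput figure, using the commonly used Shannon formulation of the index of difficulty, ID = log2(A/W + 1). The constants in the code are hypothetical, not fitted values.

```python
# Minimal sketch of Fitts' law as referenced above (MT = a + b * ID), with
# the Shannon formulation ID = log2(A/W + 1). The intercept a and slope b
# are illustrative values, not fitted constants.
import math

def index_of_difficulty(amplitude, width):
    """Fitts' index of difficulty in bits (Shannon formulation)."""
    return math.log2(amplitude / width + 1)

def movement_time(amplitude, width, a=0.1, b=0.15):
    """Predicted movement time in seconds; a and b are hypothetical constants."""
    return a + b * index_of_difficulty(amplitude, width)

# e.g. a 256 px reach to a 16 px target: ID is about 4.09 bits
print(index_of_difficulty(256, 16), movement_time(256, 16))
```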
ScrollPoint™ Mouse
A common challenge in research is the lack of opportunity to make a broad and direct impact on technologies used by real users. I was fortunate to work with (and lead) a team of engineers from IBM and IBM vendors (and their vendors) to bring the ScrollPoint mouse from research to market; it received a CES award and reached millions of users. In addition to the central function, every "peripheral" factor can also make or break a product, including cost, design (both deep and surface), packaging and installation, and retrofitting its software to operating systems that were not designed to support new user interfaces. HCI researchers and practitioners with broad skills are uniquely suited to drive a user-experience-centered system engineering process in product development. For example, in order to balance cost and quality, a deep understanding of user experience and human performance can be applied to decisions on sensor quality, processor speed, A/D conversion resolution, and the shape and form appeal of a mouse. I hope one day to find the time to write "The Tale of a Mouse".
6 DOF input control
Researchers sometimes wonder whether academic research is really useful to the larger world. This project on multiple-degrees-of-freedom input control was rather academic and abstract when it was done (advised by P. Milgram and B. Buxton at the University of Toronto), but it has found a surprising number of practical (and academic) applications. Researchers, engineers, designers, and industry executives have personally told me how they use the concepts, understanding, and methods developed in this research project in their design, development, and research of new 6DOF controllers. Key contributions of this research include the taxonomy of controller resistance (isometric, elastic, and isotonic), the relationship between transfer function and controller property (e.g., rate control's compatibility with self-centering devices), the effect of different muscle groups in 6DOF manipulation, effective evaluation tasks (6DOF docking and 6DOF tracking), and methods of quantifying coordination in multiple-degrees-of-freedom input control. The work was driven by the need to design telerobotics and virtual reality interfaces, but the recent wild success of the Nintendo Wii has liberated multi-DOF controllers from these specialized fields and brought them into ordinary households.
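As an illustration of the transfer-function/controller-property pairing mentioned above, here is a minimal sketch of a rate-control mapping suited to a self-centering (isometric or elastic) controller: deflection maps to output velocity, so releasing the device, which springs back to neutral, stops the motion. The dead zone, gain, and exponent are illustrative assumptions, not values from the original studies.

```python
# Minimal sketch of a rate-control transfer function for a self-centering
# (isometric or elastic) controller: displacement/force maps to output
# *velocity*, so a device that returns to zero when released halts the motion.
def rate_control(deflection, dead_zone=0.05, gain=2.0, power=1.6):
    """Map a normalized deflection in [-1, 1] to an output velocity."""
    magnitude = abs(deflection)
    if magnitude < dead_zone:
        return 0.0                       # ignore sensor noise near the neutral point
    normalized = (magnitude - dead_zone) / (1.0 - dead_zone)
    return gain * (normalized ** power) * (1 if deflection > 0 else -1)

def rate_control_6dof(deflections):
    """Apply the same transfer function to all six axes (tx, ty, tz, rx, ry, rz)."""
    return [rate_control(d) for d in deflections]

# Releasing the self-centering device returns deflection to ~0, halting output:
print(rate_control_6dof([0.0, 0.3, -0.8, 0.02, 1.0, -0.1]))
```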