Integrating Gestures Into the User Interface Management System


Book Description

Abstract: "This paper describes a gesture as an alternative for users to convey their commands to the user interface, and discusses the representations of a gesture and the recognition algorithm. The system integrates gesture commands into direct manipulation, so that issuing a command and manipulating an object can be done in one stroke. The idea of high-level representations of gestures and the algorithm for detecting a circle are our contributions to gesture recognition. The advantage of our representations is that the number of the representations required to represent a gesture is small (2 to 4 for the 26 small letters and the 10 digits), so we can save storage space as well as time for the look-up process. The advantage of our detecting alogrithm is that it is good at handling circular gestures. Since many letters and digits contain a circle-like component when hand written, successful recognition of a circle or near-circle increases the success rate of gestural recognition. The system can be trained by each end user to accept variations for any gesture. The number of samples required is small."




Integrating gesture and snapping into a user interface toolkit


Book Description

Abstract: "This paper describes Artkit -- the Arizona Retargetable Toolkit -- an extensible object-oriented user interface toolkit. Artkit provides an extensible input model which is designed to support a wider range of interaction techniques than conventional user interface toolkits. In particular the system supports the implementation of interaction objects using dragging, snapping (or gravity fields), and gesture (or handwriting) inputs. Because these techniques are supported directly by the toolkit it is also possible to create interactions that mix these techniques within a single interface or even a single interactor object."




Brave NUI World


Book Description

Brave NUI World is the first practical guide to designing touch- and gesture-based user interfaces. Written by the team from Microsoft that developed the multi-touch, multi-user Surface® tabletop product, it introduces the reader to natural user interfaces (NUI) and gives readers the tools and information to integrate touch and gesture practices into daily work, presenting scenarios, problem solving, metaphors, and techniques for avoiding common mistakes. The book considers diverse user needs and contexts, real-world successes and failures, and the future of NUI, and presents thirty scenarios that give practitioners a multitude of considerations for making informed design decisions and help ensure that missteps are not repeated. It will be of value to game designers as well as practitioners, researchers, and students interested in user experience design, user interface design, interaction design, software design, human-computer interaction, human factors, information design, and information architecture.




Human Interface and the Management of Information. Information and Knowledge in Applications and Services


Book Description

The two-volume set LNCS 8521 and 8522 constitutes the refereed proceedings of the Human Interface and the Management of Information thematic track of the 16th International Conference on Human-Computer Interaction, HCII 2014, held in Heraklion, Greece, in June 2014, jointly with 13 other thematically similar conferences. The total of 1476 papers and 220 posters presented at the HCII 2014 conferences were carefully reviewed and selected from 4766 submissions. These papers address the latest research and development efforts and highlight the human aspects of the design and use of computing systems. The papers accepted for presentation thoroughly cover the entire field of human-computer interaction, addressing major advances in knowledge and the effective use of computers in a variety of application areas. This volume contains papers addressing the following major topics: e-learning and e-education; decision support; information and interaction in aviation and transport; safety, security and reliability; communication, expression and emotions; art, culture and creativity; information and knowledge in business and society.




Human-Computer Interaction


Book Description

The theme of the 1997 INTERACT conference, 'Discovering New Worlds of HCI', signals major changes that are taking place with the expansion of new technologies into fresh areas of work and leisure throughout the world and new pervasive, powerful systems based on multimedia and the internet. HCI has a vital role to play in these new worlds, to ensure that people using the new technologies are empowered rather than subjugated to the technology that they increasingly have to use. In addition, outcomes from HCI research studies over the past 20 years are now finding their way into many organisations and helping to improve work practices. These factors strongly influenced the INTERACT'97 Committee when creating the conference programme, with the result that, besides the more traditional HCI research and education focus found in previous INTERACT conferences, one strand of the 1997 conference has been devoted to industry and another to multimedia. The growth in the IFIP TC13 committee itself reflects the expansion of HCI into new worlds. Membership of IFIP TC13 has risen to include representatives of 24 IFIP member country societies from many parts of the world. In 1997, IFIP TC13 breaks new ground by holding its sixth INTERACT conference in the Asia-Pacific region. This is a significant departure from previous INTERACT conferences, which were all held in Europe, and is especially important for the Asia-Pacific region, as HCI expands beyond its traditional base.




Readings in Intelligent User Interfaces


Book Description

This compilation of classic readings in intelligent user interfaces focuses on intelligent, knowledge-based interfaces that combine spoken language, natural language processing, and multimedia and multimodal processing.




The Universal Access Handbook


Book Description

In recent years, the field of Universal Access has made significant progress in consolidating theoretical approaches, scientific methods and technologies, as well as in exploring new application domains. Increasingly, professionals in this rapidly maturing area require a comprehensive and multidisciplinary resource that addresses current principles




Integrating Gesture Recognition and Direct Manipulation


Book Description

Abstract: "A gesture-based interface is one in which the user specifies commands by simple drawings, typically made with a mouse or stylus. A single intuitive gesture can simultaneously specify objects, an operation, and additional parameters, making gestures more powerful than the 'clicks' and 'drags' of traditional direct-manipulation interfaces. However, a problem with most gesture-based systems is that an entire gesture must be entered and the interaction completed before the system responds. Such a system makes it awkward to use gestures for operations that require continuous feedback. GRANDMA, a tool for building gesture-based applications, overcomes this shortcoming through two methods of integrating gesturing and direct manipulation. First, GRANDMA allows views that respond to gesture and views that respond to clicks and drags (e.g. widgets) to coexist in the same interface. More interestingly, GRANDMA supports a new two-phase interaction technique, in which a gesture collection phase is immediately followed by a manipulation phase. In its simplest form, the phase transition is indicated by keeping the mouse still while continuing to hold the button down. Alternatively, the phase transition can occur once enough of the gesture has been seen to recognize it unambiguously. The latter method, called eager recognition, results in a smooth and natural interaction. In addition to describing how GRANDMA supports the integration of gesture and direct manipulation, this paper presents an algorithm for creating eager recognizers from example gestures."




Designing for Gesture and Tangible Interaction


Book Description

Interactive technology is increasingly integrated with physical objects that do not have a traditional keyboard and mouse style of interaction, and many do not even have a display. These objects require new approaches to interaction design, referred to as post-WIMP (Windows, Icons, Menus, and Pointer) or as embodied interaction design. This book provides an overview of the design opportunities and issues associated with two embodied interaction modalities that allow us to leave the traditional keyboard behind: tangible and gesture interaction. We explore the issues in designing for this new age of interaction by highlighting the significance and contexts for these modalities. We explore the design of tangible interaction with a reconceptualization of the traditional keyboard as a Tangible Keyboard, and the design of interactive three-dimensional (3D) models as Tangible Models. We explore the design of gesture interaction through the design of gesture-based commands for a walk-up-and-use information display, and through the design of a gesture-based dialogue for the willful marionette. We conclude with design principles for tangible and gesture interaction and a call for research on the cognitive effects of these modalities.