Special Issue of Personal and Ubiquitous Computing


MULTIMODAL INTERACTION WITH MOBILE AND WEARABLE DEVICES


Editors: Stephen Brewster, University of Glasgow, UK and Matt Jones, University of Waikato, New Zealand


Mobile devices have been one of the major success stories of computing in recent years. Large numbers of people now carry a sophisticated computing device with them at all times, in the form of a mobile telephone or a personal digital assistant. Even more powerful wearable computers are becoming available and will in the future be embedded into the clothes and accessories a user wears; these devices offer greater flexibility for use in different situations, with access to a wider range of media and services.

 

One of the key problems for interaction with mobile and wearable devices is their impoverished user interface when compared to desktop computers. Screens are often small or non-existent, and head-mounted displays are not always an option, so output is limited. Small keyboards and touch screens can be hard to use for input when on the move. To make mobile and wearable devices more effective and acceptable to users, and to allow a whole new range of services to be delivered to people on the move, these challenges of input and output must be addressed. Multimodal interaction is one way to enhance usability.

 

The combination of vision, hearing and speech, and touch and gesture offers great possibilities for increasing the bandwidth of communication between user and device (taste and smell are potential future contributors too). Flexible multimodal interfaces may allow users to interact in a more natural manner. This is particularly important for mobile and wearable devices, which are used by a wide range of people in a wide range of situations. Multimodal interfaces also hold great promise for users with disabilities, as they provide alternatives to the standard GUI model, which can be problematic.

 

Key issues remain: how the different senses should be used, what each sense is good for, how they should be combined, and how to assess their performance in a mobile setting. For this special issue we are keen to gather papers on the state of the art in multimodal interface design for mobile and wearable devices, to answer some of these questions and to look towards the future of multimodal interface design.


Themes

We are soliciting papers that discuss novel multimodal techniques, methods, models and tools to overcome the impoverished interfaces of the current generation of mobile devices. We would like papers that bring together the following sorts of issues in the mobile and wearable context:

 

·        Auditory interfaces using speech and non-speech sounds

·        Role and efficacy of speech recognition

·        Gestural, graspable, tactile, haptic and tangible interfaces

·        Vision-based interaction (e.g. navigation through glancing)

·        Physiological input/output

 

·        Effective combination of multiple modalities (both in theory and practice)

·        Evaluation of multimodal systems

·        Multimodal interfaces for disabled users

·        Novel sensors and output devices to facilitate multimodal interaction

·        Novel mobile services using multimodal interfaces

 

This list is not exhaustive; we are keen to receive papers on any novel combination of modalities.

 

Submission format and schedule

Submissions should be e-mailed as a PDF file (including images) directly to the editors of the special issue. Information regarding journal submissions and formats is available at: http://www.personal-ubicomp.com/

 

The deadline for receiving submissions is 26th January, 2004. All contributions will be peer reviewed to the journal’s usual standard. We encourage potential authors to contact the editors well before the final deadline.

 

For further information or to discuss a possible contribution, please contact the special issue editors, Steve Brewster (stephen@dcs.gla.ac.uk) and Matt Jones (always@acm.org).


Deadlines

Paper Submission: 26th January, 2004

Notification of acceptance: 9th April, 2004

Final Corrections to papers: 9th July, 2004

Publication of special issue: September 2004