ReMap: Lowering the Barrier to Help-Seeking with Multimodal Search

ACM Symposium on User Interface Software and Technology (UIST)

Published October 20, 2020

Ailie Fraser, Julia M. Markel, N. James Basa, Mira Dontcheva, Scott Klemmer

People often seek help online while using complex software. Currently, information search takes users' attention away from the task at hand by creating a separate search task. This paper investigates how multimodal interaction can make in-task help-seeking easier and faster. We introduce ReMap, a multimodal search interface that helps users find video assistance while using desktop and web applications. Users can speak search queries, add application-specific terms deictically (e.g., "how to erase this"), and navigate search results via speech, all without taking their hands (or mouse) off their current task. Thirteen participants who used ReMap in the lab found that it helped them stay focused on their task while simultaneously searching for and using learning videos. Users' experiences with ReMap also surfaced several important challenges in implementing system-wide, context-aware multimodal assistance.
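The deictic queries described above can be illustrated with a minimal sketch: a spoken query such as "how to erase this" is combined with application context (for example, the element currently in focus) to produce a concrete, application-scoped video-search query. All function and field names below are hypothetical illustrations of the idea, not ReMap's actual implementation.

```python
def resolve_deictic_query(spoken_query: str, app_context: dict) -> str:
    """Replace deictic terms ("this", "that") in a spoken query with an
    application-specific term, then scope the query to the active app.

    `app_context` is a hypothetical dictionary of contextual information,
    e.g. {"app_name": "Photoshop", "focused_element": "background layer"}.
    """
    deictic_terms = {"this", "that"}
    resolved = []
    for word in spoken_query.split():
        if word.lower().strip("?.,") in deictic_terms:
            # Substitute the context term for the deictic word, if known.
            resolved.append(app_context.get("focused_element", word))
        else:
            resolved.append(word)
    # Prefix the active application's name so results match that software.
    return f"{app_context['app_name']} {' '.join(resolved)}"


query = resolve_deictic_query(
    "how to erase this",
    {"app_name": "Photoshop", "focused_element": "background layer"},
)
print(query)  # "Photoshop how to erase background layer"
```

The key design point this sketch reflects is that the user never types: the query arrives via speech, and the deictic reference is filled in from system context, so the user's hands and mouse stay on the primary task.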

Research Area: Human-Computer Interaction