Development of Gesture-based Commands for Natural User Interfaces

Author:

Shannon K. T. Bailey1, Daphne E. Whitmer1, Bradford L. Schroeder1, Valerie K. Sims1

Affiliation:

1. University of Central Florida, Orlando, Florida

Abstract

Human-computer interfaces are changing to meet the evolving needs of users and overcome limitations of previous generations of computer systems. The current state of computers consists largely of graphical user interfaces (GUI) that incorporate windows, icons, menus, and pointers (WIMPs) as visual representations of computer interactions controlled via user input on a mouse and keyboard. Although this model of interface has dominated human-computer interaction for decades, WIMPs require an extra step between the user's intent and the computer action, imposing limitations on the interaction and introducing cognitive demands (van Dam, 1997). Alternatively, natural user interfaces (NUI) employ input methods such as speech, touch, and gesture commands. With NUIs, users can interact directly with the computer without using an intermediary device (e.g., mouse, keyboard). Using the body as an input device may be more "natural" because it allows the user to apply existing knowledge of how to interact with the world (Roupé, Bosch-Sijtsema, & Johansson, 2014). To utilize the potential of natural interfaces, research must first determine what interactions can be considered natural. For the purposes of this paper, we focus on the naturalness of gesture-based interfaces. The purpose of this study was to determine how people perform natural gesture-based computer actions. To answer this question, we first narrowed down potential gestures that would be considered natural for an action. In a previous study, participants (n = 17) were asked how they would gesture to interact with a computer to complete a series of actions. After narrowing down the potential natural gestures by identifying the most frequently performed gesture for each action, we asked participants (n = 188) to rate the naturalness of the gestures in the current study.
Participants each watched 26 videos of gestures (3-5 seconds each) and were asked how natural or arbitrary they interpreted each gesture to be for the series of computer commands (e.g., move object left, shrink object, select object). The gestures in these videos included the 17 gestures most often performed in the previous study, in which participants were asked what gesture they would naturally use to complete the computer actions. Nine arbitrarily created gestures were also included as a comparison to the natural gestures. By analyzing the ratings on a continuum from "Completely Arbitrary" to "Completely Natural," we found that the natural gestures people produced in the first study were also interpreted as the intended action by this separate sample of participants. All of the gestures rated as either "Mostly Natural" or "Completely Natural" by participants corresponded to how the object manipulation would be performed physically. For example, the gesture video depicting a fist closing was rated as "natural" by participants for the action of "selecting an object." All of the arbitrarily created gestures were interpreted as "arbitrary" when they did not correspond to the physical action. Determining how people naturally gesture computer commands and how people interpret those gestures is useful because it can inform the development of NUIs and contributes to the literature on what makes gestures seem "natural."
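To illustrate how findings like these could inform a NUI, the sketch below shows a minimal gesture-to-command lookup. The fist-close/select-object pairing comes from the abstract's example; the other gesture labels and all identifier names are hypothetical assumptions, not the study's actual gesture set.

```python
# Minimal, hypothetical sketch of a gesture-to-command mapping for a NUI.
# Only the fist-close -> select-object pairing is taken from the abstract;
# the remaining gesture labels are illustrative assumptions.
GESTURE_COMMANDS = {
    "fist_close": "select_object",     # rated "natural" for selecting an object
    "swipe_left": "move_object_left",  # assumed gesture label for this action
    "pinch_in": "shrink_object",       # assumed gesture label for this action
}

def interpret(gesture: str) -> str:
    """Map a recognized gesture to a computer command, or flag it as unknown."""
    return GESTURE_COMMANDS.get(gesture, "unrecognized_gesture")
```

A real system would sit downstream of a gesture recognizer; the point here is only that a NUI's vocabulary is a designed mapping, which studies like this one can ground in gestures users already find natural.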

Publisher

SAGE Publications

Subject

General Medicine, General Chemistry

Cited by 1 article.
