Handwriting Velcro

Authors:

Fang Fengyi¹, Zhang Hongwei², Zhan Lishuang², Guo Shihui², Zhang Minying³, Lin Juncong², Qin Yipeng⁴, Fu Hongbo⁵

Affiliations:

1. Xiamen University, Xiamen, China and Tsinghua University, Shenzhen, China

2. Xiamen University, Xiamen, China

3. Alibaba Group, Hangzhou, China

4. Cardiff University, Cardiff, United Kingdom

5. City University of Hong Kong, Hong Kong, China

Abstract

Text input is a desired feature for AR glasses. While various input modalities already exist (e.g., voice, mid-air gesture), the diverse demands of different input scenarios can hardly be met by the small number of fixed input postures offered by existing solutions. In this paper, we present Handwriting Velcro, a novel text input solution for AR glasses based on flexible touch sensors. The distinct advantage of our system is that it can easily stick to different body parts, thus endowing AR glasses with posture-adaptive handwriting input. We explored the design space of on-body device positions and identified the best interaction positions for various user postures. To flatten users' learning curves, we adapt our device to the established writing habits of different users by training a 36-character (i.e., A-Z, 0-9) recognition neural network in a human-in-the-loop manner. This personalization ultimately achieves a low average error rate of 0.005 for users with different writing styles. Empirically, we conducted a heuristic study to explore and identify the best interaction position-posture correlations. Experimental results show that Handwriting Velcro outperforms similar work [6] and a commercial product in both practicality (12.3 WPM) and user-friendliness in different contexts. Subjective feedback further indicates good system practicability and social acceptance.
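
To make the human-in-the-loop personalization concrete, the sketch below fine-tunes a small 36-class character recognizer on a handful of one user's samples. This is a minimal illustrative sketch, not the authors' published code: it assumes strokes captured by the flexible touch sensor are rasterized to 28x28 grids, and all names (CharNet, personalize) are hypothetical.

# Hypothetical sketch of per-user fine-tuning for a 36-class (A-Z, 0-9)
# character recognizer; input format and architecture are assumptions,
# not the paper's actual model.
import torch
import torch.nn as nn

NUM_CLASSES = 36  # A-Z and 0-9, as in the paper

class CharNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 7 * 7, NUM_CLASSES)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def personalize(model, user_samples, user_labels, epochs=5, lr=1e-3):
    """Fine-tune a pretrained recognizer on a few of one user's characters."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(user_samples), user_labels)
        loss.backward()
        opt.step()
    return model

if __name__ == "__main__":
    model = CharNet()  # in practice, load weights pretrained on many writers
    x = torch.randn(8, 1, 28, 28)            # stand-in for 8 rasterized strokes
    y = torch.randint(0, NUM_CLASSES, (8,))  # stand-in labels
    personalize(model, x, y)
    print(model(x).argmax(dim=1))  # predicted character indices

In such a loop, the user would write a few characters, correct any misrecognitions, and those corrected samples would become the fine-tuning set, which is one plausible way to adapt the recognizer to individual writing styles.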

Funders:

National Natural Science Foundation of China

The Royal Society

The Fundamental Research Funds for the Central Universities

City University of Hong Kong

Publisher

Association for Computing Machinery (ACM)

Subject

Computer Networks and Communications, Hardware and Architecture, Human-Computer Interaction

References (86 articles):

1. Text Entry in Virtual Environments using Speech and a Midair Keyboard

2. Jiban Adhikary and Keith Vertanen. 2021. Typing on Midair Virtual Keyboards: Exploring Visual Designs and Interaction Styles. In IFIP Conference on Human-Computer Interaction. Springer, 132--151.

3. Gaze-Assisted Typing for Smart Glasses

4. Selection and Manipulation Methods for a Menu Widget on the Human Forearm

5. Huffman Base-4 Text Entry Glove (H4 TEG)

Cited by 9 articles:

1. Touch-n-Go: Designing and Fabricating Touch Fastening Structures by FDM 3D Printing;Proceedings of the CHI Conference on Human Factors in Computing Systems;2024-05-11

2. Eye-Hand Typing: Eye Gaze Assisted Finger Typing via Bayesian Processes in AR;IEEE Transactions on Visualization and Computer Graphics;2024-05

3. Handwriting for Text Input and the Impact of XR Displays, Surface Alignments, and Sentence Complexities;IEEE Transactions on Visualization and Computer Graphics;2024-05

4. Skin-triggered electrochemical touch sensation for self-powered human-machine interfacing;Sensors and Actuators B: Chemical;2024-05

5. WristSketcher: Creating 2D Dynamic Sketches in AR With a Sensing Wristband;International Journal of Human–Computer Interaction;2024-01-22
