Abstract
Environment sensing and recognition can allow humans and robots to dynamically adapt to different walking terrains. However, fast and accurate visual perception is challenging, especially on embedded devices with limited computational resources. The purpose of this study was to develop a novel pair of AI-powered smart glasses for onboard sensing and recognition of human-robot walking environments with high accuracy and low latency. We used a Raspberry Pi Pico microcontroller and an ArduCam HM0360 low-power camera, both of which interface with the eyeglass frames using custom-designed 3D-printed mounts. We trained and optimized a lightweight and efficient convolutional neural network with a MobileNetV1 backbone to classify the walking terrain as indoor surfaces, outdoor surfaces (grass and dirt), or outdoor surfaces (paved), using over 62,500 egocentric images that we adapted and manually labelled from the Meta Ego4D dataset. We then compiled and deployed our deep learning model using TensorFlow Lite Micro and post-training quantization to create a minimized byte-array model of size 0.31 MB. Our system accurately predicted complex walking environments with 93.6% classification accuracy and achieved an embedded inference speed of 1.5 seconds during online experiments using the integrated camera and microcontroller. Our AI-powered smart glasses open new opportunities for visual perception of human-robot walking environments where embedded inference and a low form factor are required. Future research will focus on improving the onboard inference speed and miniaturizing the mechatronic components.
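The post-training quantization step mentioned above maps the network's float32 weights to 8-bit integers, which is what shrinks the model to a byte array small enough for a microcontroller. As a minimal sketch of the underlying arithmetic (illustrative helper names, not the TensorFlow Lite API; actual conversion is performed with `tf.lite.TFLiteConverter`), affine quantization derives a scale and zero-point from each tensor's value range:

```python
# Minimal sketch of affine (asymmetric) int8 quantization, the scheme used by
# TensorFlow Lite post-training quantization. Function names are illustrative.

def quantize_params(w_min, w_max, qmin=-128, qmax=127):
    """Derive (scale, zero_point) mapping the float range [w_min, w_max]
    onto the integer range [qmin, qmax]."""
    w_min = min(w_min, 0.0)  # the representable range must include 0.0
    w_max = max(w_max, 0.0)
    scale = (w_max - w_min) / (qmax - qmin)
    zero_point = int(round(qmin - w_min / scale))
    return scale, zero_point

def quantize(x, scale, zero_point, qmin=-128, qmax=127):
    """Map a float value to its nearest int8 representation, with clamping."""
    q = int(round(x / scale)) + zero_point
    return max(qmin, min(qmax, q))

def dequantize(q, scale, zero_point):
    """Recover the approximate float value from an int8 code."""
    return (q - zero_point) * scale
```

For example, with weights in [-1.0, 1.0] the scale is 2/255, and round-tripping any value through `quantize`/`dequantize` introduces at most half a scale step of error; this bounded precision loss is why the quantized model retains its classification accuracy while dropping to roughly a quarter of the float32 size.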
Publisher
Cold Spring Harbor Laboratory
Cited by
4 articles