Abstract
Robotic prosthetic legs and exoskeletons require real-time, accurate estimation of the walking environment for smooth transitions between different locomotion mode controllers. However, previous studies have mainly been limited to static image classification, ignoring the temporal dynamics of human-robot locomotion. Motivated by these limitations, here we developed several state-of-the-art temporal convolutional neural networks (CNNs) to compare the performance of static vs. sequential image classification of real-world walking environments (i.e., level-ground terrain, incline stairs, and transitions to and from stairs). Using our large-scale image dataset, we trained a number of encoder networks such as VGG, MobileNetV2, ViT, and MobileViT, each coupled with a temporal long short-term memory (LSTM) backbone. We also trained MoViNet, a new video classification model designed for mobile and embedded devices, to further compare the performance of 2D and 3D temporal deep learning models. Our 3D network outperformed all the hybrid 2D encoders with LSTM backbones and the 2D CNN baseline model in terms of classification accuracy, suggesting that network architecture can play an important role in performance. However, although our 3D neural network achieved the highest classification accuracy, it had disproportionately higher computational and memory storage requirements, which can be disadvantageous for real-time control of robotic leg prostheses and exoskeletons with limited onboard resources.
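The hybrid "2D encoder + LSTM backbone" design described above can be sketched as follows. This is a minimal illustrative NumPy implementation, not the authors' code: the per-frame encoder is a stand-in (global average pooling) for a real CNN such as MobileNetV2, the LSTM cell is written out by hand, and all shapes, weights, and names are assumptions for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_frame(frame):
    """Stand-in per-frame encoder: collapse an H x W x C image to a
    C-dim feature vector (a real system would use e.g. MobileNetV2)."""
    return frame.mean(axis=(0, 1))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Minimal LSTM cell: all four gates computed from [h, x]."""
    def __init__(self, in_dim, hid_dim):
        self.hid = hid_dim
        scale = 1.0 / np.sqrt(in_dim + hid_dim)
        self.W = rng.normal(0.0, scale, (4 * hid_dim, in_dim + hid_dim))
        self.b = np.zeros(4 * hid_dim)

    def step(self, x, h, c):
        z = self.W @ np.concatenate([h, x]) + self.b
        i, f, o, g = np.split(z, 4)  # input, forget, output gates; candidate
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
        return h, c

def classify_sequence(frames, cell, W_out):
    """Encode each frame, roll the LSTM over time, classify the final state."""
    h = np.zeros(cell.hid)
    c = np.zeros(cell.hid)
    for frame in frames:
        h, c = cell.step(encode_frame(frame), h, c)
    return int(np.argmax(W_out @ h))

# Toy sequence: 8 frames of 32x32 RGB "images", 3 terrain classes
# (e.g., level ground, incline stairs, stair transition).
frames = rng.random((8, 32, 32, 3))
cell = LSTMCell(in_dim=3, hid_dim=16)
W_out = rng.normal(0.0, 0.1, (3, 16))
pred = classify_sequence(frames, cell, W_out)
```

A 3D model such as MoViNet instead convolves jointly over space and time, which is why it can capture temporal dynamics more directly, at the cost of the higher compute and memory footprint noted in the abstract.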
Publisher
Cold Spring Harbor Laboratory
References: 40 articles.
Cited by: 4 articles.