Author:
Ye Yaozu, Wang Kaiwei, Hu Weijian, Li Huabing, Yang Kailun, Sun Lei, Chen Zuobing
Abstract
Very few people have the ability to “see” their surroundings from echoes, an ability known as echolocation. Studying the brain mechanisms of echolocation can not only help improve assistive devices for blind people but also provides a window into research on brain plasticity. In this paper, we develop a wearable system, inspired by echolocation, that transforms spatial information captured by a camera into a voice description and feeds it back to blind users. After our online virtual-scene training, users can easily discriminate the location of objects in the camera’s view, the motion of objects, and even the shape of objects. Compared with natural echolocation, the system is easier to learn and to apply in daily life; in addition, the device achieves high spatial resolution. In this study, two trained blind subjects and two untrained sighted subjects were examined using functional Magnetic Resonance Imaging (fMRI). We obtained fMRI images of the subjects’ brain activity while they were listening to the sound of the wearable prototype. Intriguingly, we find that after training with the blind-assistance system, the visual areas of the blind subjects’ brains are activated when they process the acoustic feedback from the device.
Subject
General Physics and Astronomy
Cited by
2 articles.