Author:
Kuroda Yoji, Suzuki Masataka, Saitoh Teppei, Terada Eisuke
Abstract
In this paper, we propose a long-range road estimation method for autonomous mobile robots in unstructured urban environments. Near-range road surface conditions are estimated using remission values (the reflectivity measured by a laser scanner). A graph-cut algorithm is applied to estimate the road region robustly, even in complicated environments. Moreover, we propose a novel image segmentation method to estimate the long-range road surface: a compact texture/color feature is integrated with a level-set method to robustly estimate precise road boundaries. Our proposed image segmentation approach outperforms a standard classification approach. Finally, we run our autonomous mobile robot in the "Tsukuba Challenge 2009" and on our university campus, and experimental results show a marked increase in road estimation accuracy over standard methods.
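The level-set segmentation mentioned in the abstract evolves a boundary so that the pixels inside and outside it best match two region models. The paper's own formulation uses a compact texture/color feature; as a rough illustration only, the sketch below evolves a level-set function on a grayscale image with a simple Chan–Vese-style region force (the function name and all parameters are hypothetical, not from the paper):

```python
import numpy as np

def level_set_segment(img, iters=200, dt=0.5):
    """Minimal Chan-Vese-style level-set sketch (illustrative only).

    Evolves a signed function phi so that the region {phi < 0} covers
    pixels closer to the inside mean intensity than to the outside mean.
    No curvature/smoothness term is included, unlike a full method.
    """
    h, w = img.shape
    # Initialize phi as the signed distance to a centered circle.
    y, x = np.mgrid[:h, :w]
    phi = np.sqrt((x - w / 2) ** 2 + (y - h / 2) ** 2) - min(h, w) / 4

    for _ in range(iters):
        inside = phi < 0
        c_in = img[inside].mean() if inside.any() else 0.0
        c_out = img[~inside].mean() if (~inside).any() else 0.0
        # Region force: positive where a pixel matches the inside mean
        # better than the outside mean, which pushes phi negative there.
        force = (img - c_out) ** 2 - (img - c_in) ** 2
        phi -= dt * force / (np.abs(force).max() + 1e-9)

    return phi < 0  # boolean mask of the segmented region
```

On a clean synthetic image with a bright region on a dark background, the recovered mask converges to the bright region; a real road segmenter would replace raw intensity with the texture/color feature and add a boundary smoothness term.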
Publisher
Fuji Technology Press Ltd.
Subject
Electrical and Electronic Engineering,General Computer Science
Cited by: 3 articles.