Novel lightweight odometric learning method for humanoid robot localization

Saeed Saeedvand, Hadi S. Aghdasi*, Jacky Baltes

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

11 Citations (Scopus)

Abstract

Odometry, as an inevitable part of robot behavior control, plays an essential role in the localization of humanoid robots. Odometry on humanoid robots is usually calculated from one or a combination of vision, laser scanner, magnetic, and pressure sensors. Vision- or laser-scanner-based approaches require high computational power to analyze visual information, so they are not suitable for all kinds of small humanoid robots. Magnetic sensors, on the other hand, are known to suffer from instability in different environments. Furthermore, calculating accurate dead reckoning (pure odometry) is very difficult because of the complex mechanical system of humanoid robots, the many sources of uncertainty, and inaccuracy in motion execution such as foot slippage. Therefore, this paper presents a robust learning method for localizing a humanoid robot, named the Lightweight Humanoid robot Odometric Learning method (LHOL). The method does not employ any vision, magnetic, or additional pressure sensors, nor laser scanners, and therefore eliminates dependency on these sensors for the first time. Its core learning is based on an artificial neural network (ANN) that uses kinematic computations, IMU data (roll and pitch), and the robot's actuators' internal present-load readings as inputs. The proposed LHOL method demonstrates high accuracy on a novel fully 3D-printed kid-sized humanoid robot platform (ARC) with both open-loop and closed-loop walk engines on floors with different coverings.
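The abstract describes the core idea as a learned mapping from kinematic step estimates, IMU roll/pitch, and per-joint present-load readings to a corrected odometry increment. The following is a minimal sketch of that kind of ANN, assuming a small feed-forward network trained with squared error; the joint count, layer sizes, feature layout, and training loop are illustrative assumptions and not the authors' implementation.

```python
import numpy as np

# Hypothetical sketch: a small feed-forward ANN mapping kinematic estimates,
# IMU roll/pitch, and joint present-load readings to a corrected odometry
# increment (dx, dy, dtheta). All dimensions below are assumptions.

N_JOINTS = 20                    # assumed actuator count for a kid-sized humanoid
N_IN = 3 + 2 + N_JOINTS          # kinematic (dx, dy, dtheta) + roll/pitch + joint loads
N_HIDDEN = 32
N_OUT = 3                        # corrected (dx, dy, dtheta) per step

rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.1, (N_IN, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0, 0.1, (N_HIDDEN, N_OUT))
b2 = np.zeros(N_OUT)


def predict(x):
    """Forward pass: tanh hidden layer, linear output."""
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2


def train_step(x, y, lr=1e-3):
    """One gradient-descent step on 0.5 * squared error (manual backprop)."""
    global W1, b1, W2, b2
    h = np.tanh(x @ W1 + b1)
    err = (h @ W2 + b2) - y                 # output error, shape (N_OUT,)
    dW2 = np.outer(h, err)
    db2 = err
    dh = (W2 @ err) * (1.0 - h ** 2)        # back-propagate through tanh
    dW1 = np.outer(x, dh)
    db1 = dh
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    return 0.5 * float(err @ err)


# Synthetic stand-in for logged walk-engine steps; in practice the ground-truth
# increments would come from an external reference during data collection.
for _ in range(1000):
    x = rng.normal(0, 1, N_IN)
    y = x[:3] * 0.9                         # toy target: slippage shrinks the kinematic estimate
    train_step(x, y)

print(predict(rng.normal(0, 1, N_IN)))      # corrected (dx, dy, dtheta) estimate
```

At inference time such a network would be evaluated once per walking step and the predicted increment integrated into the robot's pose estimate, which keeps the computational cost far below vision- or laser-based localization.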

Original language: English
Pages (from-to): 38-53
Number of pages: 16
Journal: Mechatronics
Volume: 55
DOIs
Publication status: Published - Nov 2018

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Mechanical Engineering
  • Computer Science Applications
  • Electrical and Electronic Engineering
