LU Jiawei, XU Zhe. Visual inertial positioning method based on tight coupling[J]. GNSS World of China, 2021, 46(1): 36-42. DOI: 10.12265/j.gnss.2020082801

Visual inertial positioning method based on tight coupling

  • The inertial measurement unit (IMU) is disturbed by temperature, bias, vibration, and other factors, so its pose estimate tends to diverge under integration, while monocular visual positioning accuracy degrades when the robot moves rapidly. This paper therefore studies a tightly coupled visual-inertial simultaneous localization and mapping (SLAM) method. First, the localization problem of visual odometry (VO) is studied: to reduce feature-point mismatches, feature extraction based on Oriented FAST and Rotated BRIEF (ORB) is adopted. Then the mathematical model of the IMU is constructed, and the discrete integral of the motion model is obtained with the median (mid-point) method. Finally, the monocular visual pose is aligned with the IMU trajectory, and the optimal state estimate of the robot motion is obtained by nonlinear optimization over a sliding window. The method was verified in two experiments in a constructed simulation scene, with the monocular ORB-SLAM algorithm as the baseline. The results show that the proposed method outperforms visual odometry alone: positioning accuracy is held to about 0.4 m, roughly 30% better than the traditional tracking model.
