Showcase

R3LIVE Real-time Robust Tightly Coupled System by MaRS Laboratory, HKU

2022-5-17

R3LIVE (Robust, Real-time, RGB-colored) is a real-time, robust, tightly coupled system based on Livox LiDAR-inertial-visual fusion.

R3LIVE Introduction

 

R3LIVE is a novel LiDAR-inertial-visual sensor fusion framework developed by the MaRS laboratory of the University of Hong Kong, which is led by Prof. Fu Zhang. It takes advantage of the measurements of LiDAR, inertial, and visual sensors to achieve robust and accurate state estimation.

 

R3LIVE is built upon the team's previous work R2LIVE and contains two subsystems: a LiDAR-inertial odometry (LIO) and a visual-inertial odometry (VIO). The LIO subsystem (FAST-LIO) uses the measurements from the LiDAR and inertial sensors to build the geometric structure of the global map (i.e., the positions of its 3D points). The VIO subsystem utilizes the visual-inertial sensor data to render the map's texture (i.e., the color of those 3D points).
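The division of labor between the two subsystems can be illustrated with a minimal sketch: the LIO estimates where the map points are, and the VIO colors them by projecting each point into the current camera image. The function below is a hypothetical illustration (not R3LIVE's actual rendering code) assuming an ideal pinhole camera with no lens distortion:

```python
import numpy as np

def colorize_points(points_w, T_cw, K, image):
    """Assign an RGB color to each 3D map point by projecting it into
    the current camera image (pinhole model, no distortion).

    points_w : (N, 3) points in the world frame (from the LIO subsystem)
    T_cw     : (4, 4) world-to-camera transform (from the VIO state)
    K        : (3, 3) camera intrinsic matrix
    image    : (H, W, 3) RGB image
    """
    H, W, _ = image.shape
    # Transform points into the camera frame.
    pts_h = np.hstack([points_w, np.ones((len(points_w), 1))])
    pts_c = (T_cw @ pts_h.T).T[:, :3]
    colors = np.full((len(points_w), 3), -1, dtype=int)  # -1 = not visible
    in_front = pts_c[:, 2] > 0
    # Perspective projection onto the image plane.
    uv = (K @ pts_c[in_front].T).T
    uv = (uv[:, :2] / uv[:, 2:3]).round().astype(int)
    valid = (uv[:, 0] >= 0) & (uv[:, 0] < W) & (uv[:, 1] >= 0) & (uv[:, 1] < H)
    idx = np.flatnonzero(in_front)[valid]
    colors[idx] = image[uv[valid, 1], uv[valid, 0]]
    return colors
```

In the real system the VIO additionally tracks these projections across frames to refine the camera pose and fuses repeated color observations per point, rather than taking a single sample as above.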

 

 

R3LIVE Hardware System

 

In this project, the author, Dr. Lin Jiarong, built a handheld 3D scanning system with a Livox Avia LiDAR, an industrial camera, and a DJI RoboMaster Manifold 2-C onboard computer.

R3LIVE system architecture diagram

 

Application of Livox Avia in R3LIVE system

 

In the R3LIVE experiments, the author used the Livox Avia as a depth sensor to collect data on the campuses of the University of Hong Kong and the Hong Kong University of Science and Technology to verify the robustness and accuracy of the algorithm. By making full use of the Avia's non-repetitive scanning pattern, the reconstructed map attains a denser point cloud, which significantly improves the robustness of map feature matching. In addition, the author used the Avia's built-in IMU, relying on its synchronized, high-frequency (200 Hz) six-degree-of-freedom motion output to effectively improve the positioning accuracy of the tightly coupled algorithm.
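The value of a high-rate IMU can be seen in a toy dead-reckoning propagation step like the one below, run once per 1/200 s sample between LiDAR frames. This is a deliberately simplified illustration that ignores sensor biases and noise; R3LIVE itself propagates the state inside an error-state iterated Kalman filter:

```python
import numpy as np

def propagate(p, v, R, accel, gyro, dt, g=np.array([0.0, 0.0, -9.81])):
    """One step of naive IMU dead reckoning (toy model).

    p, v  : position and velocity in the world frame
    R     : (3, 3) body-to-world rotation matrix
    accel : body-frame specific force measurement
    gyro  : body-frame angular rate measurement
    dt    : sample period, e.g. 1/200 s for the Avia's built-in IMU
    """
    a_w = R @ accel + g                     # specific force -> world acceleration
    p = p + v * dt + 0.5 * a_w * dt**2      # constant-acceleration position update
    v = v + a_w * dt
    # Rotation update via the Rodrigues formula.
    theta = np.linalg.norm(gyro) * dt
    if theta > 1e-12:
        k = gyro / np.linalg.norm(gyro)
        Kx = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
        dR = np.eye(3) + np.sin(theta) * Kx + (1 - np.cos(theta)) * Kx @ Kx
        R = R @ dR
    return p, v, R
```

At 200 Hz each step covers only 5 ms of motion, so the integration error between consecutive LiDAR scans stays small, which is what makes the high-rate IMU such an effective motion prior for the tightly coupled estimator.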

 

R3LIVE public experimental dataset

 

The authors of R3LIVE have published a total of 9 datasets that they collected. Users can download them from the following URL to reproduce and evaluate R3LIVE's experimental results: https://github.com/ziv-lin/r3live_dataset

 

 

R3LIVE project summary

 

In this work, the author introduced R3LIVE and effectively solved the following problems:

  • A high-precision, high-efficiency colored point cloud reconstruction system was introduced, which reconstructs a dense color point cloud of the surrounding environment in real time;
  • By fusing camera information, the problem that LiDAR-only odometry cannot localize reliably in degenerate scenes was effectively solved;
  • To promote research and applications in LiDAR-related industries with a cost-effective solution, the author open-sourced a complete set of software and hardware solutions based on the Livox Avia LiDAR.

Results of several of our experiments

 

R3LIVE is a highly extensible system. In addition to serving as a SLAM system for real-time robotic applications, it can also reconstruct dense, accurate RGB-colored 3D maps for applications such as surveying and mapping. Moreover, the developers of R3LIVE provide a series of utilities for reconstructing and rendering polygon mesh maps, so that maps reconstructed by R3LIVE can be imported into various 3D applications, including games and simulators, more conveniently and efficiently, further improving the extensibility of R3LIVE.
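One simple way to move a colored point cloud into other 3D tools is the ASCII PLY format, which MeshLab, Blender, CloudCompare, and most game engines can import. The function below is a minimal hand-rolled exporter for illustration, not R3LIVE's own map-saving utility:

```python
def save_ply(path, points, colors):
    """Write an RGB point cloud to an ASCII PLY file.

    points : iterable of (x, y, z) floats
    colors : iterable of (r, g, b) ints in 0..255, one per point
    """
    with open(path, "w") as f:
        # PLY header: declares vertex count and per-vertex properties.
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("property uchar red\nproperty uchar green\nproperty uchar blue\n")
        f.write("end_header\n")
        # One line per vertex: position followed by color.
        for (x, y, z), (r, g, b) in zip(points, colors):
            f.write(f"{x} {y} {z} {r} {g} {b}\n")
```

For the dense maps R3LIVE produces, a binary PLY or a dedicated library would be preferable for file size and speed, but the ASCII form makes the format easy to inspect.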


Livox welcomes university laboratories and research teams to reach out for more in-depth discussions on Livox LiDAR applications. We believe that our cost-effective LiDAR solutions will empower the research, application, and progress of related fields!

 

 


 

MaRS Introduction

The Mechatronics and Robotic Systems (MaRS) laboratory of the University of Hong Kong, led by Prof. Fu Zhang, focuses on mechatronic systems and robotic applications. The team has achieved great success in various fields such as aerial robot design, planning and control, and SLAM applications.

 

Papers

  • R3LIVE: A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package

Link: https://github.com/hku-mars/r3live

  • R2LIVE: A Robust, Real-time, LiDAR-Inertial-Visual tightly-coupled state Estimator and mapping

Link: https://github.com/hku-mars/r2live

  • Fast-LIO: A computationally efficient and robust LiDAR-inertial odometry package.

Link: https://github.com/hku-mars/FAST_LIO

  • ikd-tree: A state-of-the-art dynamic KD-tree for 3D kNN search.

Link: https://github.com/hku-mars/ikd-Tree

  • Loam-livox: A robust LiDAR Odometry and Mapping (LOAM) package for Livox-LiDAR

Link: https://github.com/hku-mars/loam_livox

  • livox_camera_calib: A robust, high-accuracy extrinsic calibration tool for high-resolution LiDAR (e.g. Livox) and camera in targetless environments.

Link: https://github.com/hku-mars/livox_camera_calib

  • mlcc: A fast and accurate extrinsic calibration tool for multiple LiDARs and cameras.

Link: https://github.com/hku-mars/mlcc