R3LIVE (Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping): a real-time, robust, tightly coupled system based on Livox LiDAAR-inertial-visual fusion.
R3LIVE is a novel LiDAR-inertial-visual sensor fusion framework developed by the MaRS Lab at the University of Hong Kong, led by Prof. Fu Zhang. It fuses the measurements of LiDAR, inertial, and visual sensors to achieve robust and accurate state estimation.
R3LIVE is built upon the team's previous work R2LIVE and consists of two subsystems: a LiDAR-inertial odometry (LIO) and a visual-inertial odometry (VIO). The LIO subsystem (based on FAST-LIO) uses the measurements from the LiDAR and inertial sensors to reconstruct the geometric structure of the global map (i.e., the positions of its 3D points). The VIO subsystem uses the visual-inertial data to render the map's texture (i.e., the color of each 3D point).
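The texture-rendering idea in the VIO subsystem can be sketched with a standard pinhole-camera projection: each 3D map point is transformed into the camera frame and colored by the pixel it lands on. This is a minimal illustration under textbook assumptions (ideal pinhole model, no distortion, nearest-pixel sampling); the function name and signature are hypothetical, not R3LIVE's actual API.

```python
import numpy as np

def colorize_point(p_world, R_wc, t_wc, K, image):
    """Project a 3D map point into the camera and sample its RGB color.

    p_world : (3,) point in the world frame
    R_wc, t_wc : camera pose (world-to-camera rotation and translation)
    K : (3, 3) pinhole intrinsic matrix
    image : (H, W, 3) RGB image
    Returns the sampled (r, g, b) tuple, or None if the point is not visible.
    """
    p_cam = R_wc @ p_world + t_wc        # transform into the camera frame
    if p_cam[2] <= 0:                    # point is behind the camera
        return None
    uv = K @ (p_cam / p_cam[2])          # perspective projection
    u, v = int(round(uv[0])), int(round(uv[1]))
    h, w = image.shape[:2]
    if not (0 <= u < w and 0 <= v < h):  # projects outside the image
        return None
    return tuple(image[v, u])            # nearest-pixel color sample
```

In the real system, each map point accumulates color observations from many frames and the estimate is refined over time; the sketch above shows only a single observation.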
In this project, the author Dr. Jiarong Lin built a handheld 3D scanning system consisting of a Livox Avia LiDAR, an industrial camera, and a RoboMaster Manifold 2-C onboard computer.
R3LIVE system architecture diagram
Application of Livox Avia in R3LIVE system
In the R3LIVE experiments, the author used the Livox Avia as the depth sensor and collected data on the campuses of the University of Hong Kong and the Hong Kong University of Science and Technology to verify the robustness and accuracy of the algorithm. By making full use of the Avia's non-repetitive scanning pattern, the reconstruction obtains a denser point cloud map, which significantly improves the robustness of map feature matching. In addition, the author used the Avia's built-in IMU, relying on its synchronized, high-frequency (200 Hz) six-degree-of-freedom motion measurements to improve the positioning accuracy of the tightly coupled algorithm.
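The role of the 200 Hz IMU stream can be illustrated with a single step of standard IMU dead-reckoning propagation, which predicts the pose between LiDAR/camera updates. This is a generic textbook sketch, not R3LIVE's actual error-state Kalman filter equations; the function and its gravity convention are assumptions for illustration only.

```python
import numpy as np

def propagate_imu(p, v, R, acc, gyro, dt, g=np.array([0.0, 0.0, -9.81])):
    """One forward-propagation step of an IMU-driven state estimate.

    p, v : position and velocity in the world frame
    R    : (3, 3) body-to-world rotation matrix
    acc, gyro : accelerometer and gyroscope readings in the body frame
    dt   : sample period (1/200 s for a 200 Hz IMU)
    """
    a_world = R @ acc + g                     # specific force -> world acceleration
    p_new = p + v * dt + 0.5 * a_world * dt**2
    v_new = v + a_world * dt
    # first-order rotation update from the integrated angular velocity
    wx, wy, wz = gyro * dt
    dR = np.array([[1.0, -wz,  wy],
                   [ wz, 1.0, -wx],
                   [-wy,  wx, 1.0]])
    R_new = R @ dR
    return p_new, v_new, R_new
```

Because these steps run at 200 Hz, the filter has a fresh motion prediction every 5 ms, which is what makes the tight coupling with the lower-rate LiDAR and camera measurements effective.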
The authors of R3LIVE have released a total of 9 self-collected datasets. Users can download them from the following URL to reproduce and evaluate R3LIVE's experimental results: https://github.com/ziv-lin/r3live_dataset
Results of several of the authors' experiments
R3LIVE is a highly extensible system. In addition to serving as a SLAM system for real-time robotic applications, it can be used to reconstruct dense, accurate, RGB-colored 3D maps for applications such as surveying and mapping. Moreover, the developers of R3LIVE provide a set of utilities for reconstructing and rendering polygon mesh maps, so that maps reconstructed by R3LIVE can be imported more conveniently and efficiently into various 3D applications, including games and simulators, further improving R3LIVE's extensibility.
Livox welcomes university laboratories and research teams to reach out for more in-depth discussions on Livox LiDAR applications. We believe that our cost-effective LiDAR solutions will empower the research, application, and progress of related fields!
The Mechatronics and Robotic Systems (MaRS) Laboratory of the University of Hong Kong, led by Professor Fu Zhang, focuses on mechatronic systems and robotic applications. The team has achieved notable results in fields such as aerial robot design, planning and control, and SLAM.