8 ORB_SLAM2 basics

Official website: http://webdiis.unizar.es/~raulmur/orbslam/

ASL Dataset: https://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets

Monocular dataset (TUM RGB-D): https://vision.in.tum.de/data/datasets/rgbd-dataset/download

Stereo dataset (EuRoC MH_01_easy): http://robotics.ethz.ch/~asl-datasets/ijrr_euroc_mav_dataset/machine_hall/MH_01_easy/

orb_slam2_ros: http://wiki.ros.org/orb_slam2_ros

ORB-SLAM: https://github.com/raulmur/ORB_SLAM

ORB-SLAM2: https://github.com/raulmur/ORB_SLAM2

ORB-SLAM3: https://github.com/UZ-SLAMLab/ORB_SLAM3

8.1 Introduction

ORB-SLAM supports monocular SLAM only.

ORB-SLAM2 adds stereo and RGB-D interfaces alongside the monocular one.

ORB-SLAM3 adds visual-inertial (IMU) coupling and supports fisheye cameras.

All stages of ORB-SLAM use the same ORB features of the image. ORB is a very fast feature extraction method that is rotation invariant and uses an image pyramid to achieve scale invariance. Using a single feature type keeps the SLAM pipeline consistent across feature extraction and tracking, keyframe selection, 3D reconstruction, and loop-closure detection. The system is also robust to aggressive motion, supports wide-baseline loop detection and relocalization, and includes fully automatic initialization. Since ORB-SLAM is a feature-based SLAM system, it can compute the camera trajectory in real time and produce a sparse 3D reconstruction of the scene.
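The rotation invariance mentioned above comes from assigning each FAST corner an orientation via the intensity centroid of its surrounding patch. A minimal pure-Python sketch of that moment computation (the tiny synthetic patch and helper name are illustrative, not part of the ORB_SLAM2 code base):

```python
import math

def patch_orientation(patch):
    """Orientation of a square image patch via the intensity centroid,
    as used by ORB for rotation invariance: theta = atan2(m01, m10)."""
    h, w = len(patch), len(patch[0])
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    m10 = m01 = 0.0
    for y, row in enumerate(patch):
        for x, v in enumerate(row):
            m10 += (x - cx) * v  # first-order moment in x
            m01 += (y - cy) * v  # first-order moment in y
    return math.atan2(m01, m10)

# A patch whose intensity mass sits to the right of centre
# yields an orientation of 0 rad (pointing along +x).
patch = [[0, 0, 10],
         [0, 0, 10],
         [0, 0, 10]]
print(patch_orientation(patch))  # 0.0
```

Rotating the patch rotates the centroid, so the descriptor can be sampled in a rotated frame and stay stable under camera roll.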

Compared with ORB-SLAM, ORB-SLAM2 makes the following contributions:

  1. The first open-source SLAM system for monocular, stereo, and RGB-D cameras, including loop closing, relocalization, and map reuse.
  2. The RGB-D results show that bundle adjustment (BA) achieves higher accuracy than ICP or methods that minimize photometric and depth error.
  3. By using both close and far stereo points, as well as monocular observations, the stereo results are more accurate than those of direct stereo SLAM algorithms.
  4. A lightweight localization mode can effectively reuse the map.
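Point 3 relies on splitting stereo matches into close and far points by their triangulated depth; the ORB-SLAM2 paper treats a point as close when its depth is below roughly 40 times the stereo baseline. A small sketch of that depth computation and classification (the intrinsic values below are placeholders):

```python
def stereo_depth(fx, baseline, disparity):
    """Depth of a rectified stereo match: z = fx * b / d."""
    return fx * baseline / disparity

def is_close_point(depth, baseline, ratio=40.0):
    """ORB-SLAM2 treats a stereo point as 'close' (reliable for
    estimating translation) when depth < ~40x the baseline."""
    return depth < ratio * baseline

fx, b = 435.2, 0.11        # placeholder focal length (px) and baseline (m)
z_near = stereo_depth(fx, b, disparity=16.0)
z_far = stereo_depth(fx, b, disparity=4.0)
print(round(z_near, 3), is_close_point(z_near, b))  # 2.992 True
print(round(z_far, 3), is_close_point(z_far, b))    # 11.968 False
```

Far points are still useful for estimating rotation, which is why keeping both classes beats discarding low-disparity matches outright.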

ORB-SLAM2 contains the modules common to all SLAM systems: tracking, mapping, relocalization, and loop closing. The following figure shows the flow of ORB-SLAM2.

orbslam_all

8.2 Official case

Open a terminal and change into the ORB_SLAM2 directory.

8.2.1 Monocular test

image-20220225145646590

Blue frames are keyframes, the green frame is the current camera pose, black points are map points already saved, and red points are the map points currently observed by the camera.

After the test, the keyframe trajectory is saved to the KeyFrameTrajectory.txt file in the current directory.
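KeyFrameTrajectory.txt is written in the TUM trajectory format, one keyframe per line: `timestamp tx ty tz qx qy qz qw` (translation plus a unit quaternion). A small sketch of reading such a file (the sample line here is made up, not real output):

```python
def parse_tum_trajectory(lines):
    """Parse TUM-format trajectory lines:
    'timestamp tx ty tz qx qy qz qw'."""
    poses = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith('#'):
            continue  # skip blanks and comments
        vals = [float(v) for v in line.split()]
        poses.append({'t': vals[0],
                      'pos': tuple(vals[1:4]),    # tx, ty, tz
                      'quat': tuple(vals[4:8])})  # qx, qy, qz, qw
    return poses

sample = ["# illustrative keyframe line",
          "1403636580.838 0.1 0.0 -0.2 0.0 0.0 0.0 1.0"]
poses = parse_tum_trajectory(sample)
print(poses[0]['pos'])  # (0.1, 0.0, -0.2)
```

This is the same format the TUM RGB-D benchmark evaluation tools consume, so the saved trajectory can be compared against ground truth directly.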

8.2.2 Stereo test

image-20220225150328232

Blue frames are keyframes, the green frame is the current camera pose, black points are map points already saved, and red points are the map points currently observed by the camera.

After the test, the camera trajectory is saved to the CameraTrajectory.txt file in the current directory.

8.2.3 RGBD test

Pair the depth images with the color images into RGB-D frames and save the pairings to an associations.txt file.
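This pairing is what the TUM `associate.py` script does: match each RGB timestamp to the nearest depth timestamp within a tolerance, since the two streams are not captured at exactly the same instants. A self-contained sketch of that matching logic (the timestamps below are made up):

```python
def associate(rgb_stamps, depth_stamps, max_diff=0.02):
    """Greedily pair each RGB timestamp with the closest unused depth
    timestamp within max_diff seconds (same idea as TUM associate.py)."""
    pairs, used = [], set()
    for t_rgb in rgb_stamps:
        best = min((d for d in depth_stamps if d not in used),
                   key=lambda d: abs(d - t_rgb), default=None)
        if best is not None and abs(best - t_rgb) <= max_diff:
            pairs.append((t_rgb, best))
            used.add(best)
    return pairs

rgb = [1.000, 1.033, 1.066]
depth = [1.001, 1.034, 1.100]
# The third RGB frame has no depth frame within 20 ms, so it is dropped.
print(associate(rgb, depth))  # [(1.0, 1.001), (1.033, 1.034)]
```

Each resulting pair becomes one line of associations.txt, which the RGB-D example reads to load matching color/depth image files.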

Test command:

image-20220225151139117

8.3 ORB_SLAM2_ROS camera test

The camera's intrinsic parameters were already configured before the product left the factory. To learn how, see section [8.3.1, Intrinsic parameter modification]. The camera can be hand-held, or a robot can be used as the mobile carrier for the test.

If you hold the camera by hand, you do not need to execute the next command; otherwise, execute it. (Robot side)

Start the camera ORB_SLAM2 test (robot or virtual machine)

8.3.1 Intrinsic parameter modification

ORB-SLAM needs the camera's intrinsic parameters before it can run, so the camera must be calibrated first. For the specific method, see lesson [02, Astra Camera Calibration].

Start the monocular camera

Start the calibration node

After calibration, move the [calibrationdata.tar.gz] file to the [home] directory.

After decompressing it, open [ost.yaml] in the extracted folder and find the camera intrinsic matrix, which looks like the following.

The camera's intrinsic matrix

Copy the values from the camera matrix in [ost.yaml] into the corresponding [data] fields of [astra.yaml] and [astra1.0.yaml] in the [param] folder of the [yahboomcar_slam] package.
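The camera matrix in ost.yaml has the pinhole form K = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]. As a quick check of what those numbers mean, here is how they project a 3D point in the camera frame to pixel coordinates (the intrinsic values below are placeholders, not your calibration result):

```python
def project(K, point):
    """Pinhole projection: u = fx*X/Z + cx, v = fy*Y/Z + cy."""
    fx, cx = K[0][0], K[0][2]
    fy, cy = K[1][1], K[1][2]
    X, Y, Z = point
    return (fx * X / Z + cx, fy * Y / Z + cy)

# Placeholder intrinsics laid out like the row-major 3x3 'data'
# list under 'camera_matrix' in ost.yaml.
K = [[570.0, 0.0, 320.0],
     [0.0, 570.0, 240.0],
     [0.0, 0.0, 1.0]]
print(project(K, (0.1, -0.05, 1.0)))  # (377.0, 211.5)
```

If these values are wrong, every reprojection in tracking and mapping is biased, which is why calibration must be done before running ORB-SLAM.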

8.3.2 Monocular

image-20220225155006242

When the command is executed, the [ORB_SLAM2: Map Viewer] window shows only a green frame, and the [ORB_SLAM2: Current Frame] window shows that it is trying to initialize. Slowly move the camera up, down, left, and right so it finds feature points in the image and initializes SLAM.

image-20220225155453650

As shown in the figure above, the system has now entered [SLAM MODE]. When running monocular SLAM, the camera must continuously receive image frames to track its pose. If you select the localization-only mode via [Localization Mode] in the upper-left panel, no new keyframes are created; if tracking is then lost, the camera cannot recover its position and you must start over to build keyframes again.

8.3.3 Monocular AR

image-20220225161118946

When the command is executed, there is only one window, displaying [slam not initialized]. Check the box to the left of [Draw Points] in the left panel to display the feature points. Then slowly move the camera up, down, left, and right so it finds feature points in the image and initializes SLAM.

image-20220225161856504

As shown in the figure above, the system has now entered [SLAM ON] mode. Click [Insert Cube] to insert an AR cube at a location detected as a plane. The AR cube stays at a fixed position in the scene, not at a fixed position on the camera image. Click [Clear All] to remove the cubes.

image-20220225161819953

8.3.4 RGBD

image-20220225163133305

Unlike monocular operation, RGB-D does not have to continuously acquire every frame to keep tracking. If you select the localization-only mode via [Localization Mode] in the upper-left panel, the system can localize against the keyframes it has just acquired.