5. ORB_SLAM2 basics

 

1. Brief introduction

ORB-SLAM was originally a monocular SLAM system; ORB-SLAM2 adds support for three sensor interfaces: monocular, stereo, and RGB-D; ORB-SLAM3 further adds IMU fusion and supports fisheye cameras. All stages of ORB-SLAM uniformly use ORB features extracted from the image. ORB is a very fast feature extraction method that is rotation invariant, and scale invariance is obtained by extracting features over an image pyramid. Using the same ORB features throughout keeps feature extraction and tracking, keyframe selection, 3D reconstruction, loop closure detection, and the other stages consistent. The system is also robust to aggressive motion and supports wide-baseline loop detection and relocalization, including fully automatic initialization. Because ORB-SLAM is based on feature points, it can compute the camera trajectory in real time and produce a sparse 3D reconstruction of the scene. On top of ORB-SLAM, ORB-SLAM2 contributes the following:

  1. The first open-source SLAM system for monocular, stereo, and RGB-D cameras, including loop closing and relocalization, as well as map reuse.
  2. The RGB-D results show that bundle adjustment (BA) achieves more accurate results than ICP or minimization of photometric and depth error.
  3. By using both close and far stereo points, as well as monocular observations, the stereo results are more accurate than those of direct stereo SLAM algorithms.
  4. A lightweight localization mode that can effectively reuse the map.

ORB-SLAM2 includes the modules common to all SLAM systems: tracking, mapping, relocalization, and loop closing. The following diagram shows the ORB-SLAM2 pipeline.

12.5.1
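The scale-invariance point above can be made concrete. ORB-SLAM extracts ORB features on every level of an image pyramid; assuming the default extractor settings found in the example configuration files (scale factor 1.2, 8 levels), the per-level scale factors work out as follows:

```shell
# ORB-SLAM gains scale invariance by extracting ORB features on an image
# pyramid. The values below assume the defaults from the example yaml files
# (ORBextractor.scaleFactor: 1.2, ORBextractor.nLevels: 8) -- adjust if
# your configuration differs.
awk 'BEGIN { s = 1.0; for (i = 0; i < 8; i++) { printf "level %d: scale %.3f\n", i, s; s *= 1.2 } }'
```

A feature matched at level 7 is therefore seen at roughly 3.6x coarser resolution than at level 0, which is what lets the same landmark be recognized as the camera moves closer or farther away.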

2. Case

2.1 Dataset location

Here we downloaded the EuRoC MH01 dataset and the TUM rgbd_dataset_freiburg1_xyz dataset.

image-20230417111610512

If you need other datasets, you can download them at the following address:
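In case the addresses are not reproduced here, the two sequences used in this chapter are commonly fetched from the ETH ASL dataset server (EuRoC) and the TUM computer vision group site (TUM RGB-D). The URLs below are believed correct at the time of writing but mirrors do change, so verify them before use:

```shell
# EuRoC MH01 (Machine Hall 01, "easy") -- URL assumed from the ETH ASL
# dataset server; verify availability before downloading.
wget http://robotics.ethz.ch/~asl-datasets/ijrr_euroc_mav_dataset/machine_hall/MH_01_easy/MH_01_easy.zip
unzip MH_01_easy.zip -d MH01

# TUM RGB-D freiburg1_xyz sequence -- URL assumed from the TUM CVG site.
wget https://vision.in.tum.de/rgbd/dataset/freiburg1/rgbd_dataset_freiburg1_xyz.tgz
tar -xzf rgbd_dataset_freiburg1_xyz.tgz
```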

2.2 Monocular testing

Taking the EuRoC dataset as an example, enter the ORB_SLAM2 directory and run the following command:
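The original command screenshot is not reproduced here. A typical monocular EuRoC invocation, based on the examples shipped with the upstream ORB-SLAM2 repository, looks like the following; the dataset path `~/datasets/MH01` is an assumption and should be replaced with wherever MH_01_easy was extracted:

```shell
# Monocular EuRoC example from the upstream ORB-SLAM2 repository.
# ~/datasets/MH01 is an assumed extraction path -- adjust to your setup.
./Examples/Monocular/mono_euroc \
    Vocabulary/ORBvoc.txt \
    Examples/Monocular/EuRoC.yaml \
    ~/datasets/MH01/mav0/cam0/data \
    Examples/Monocular/EuRoC_TimeStamps/MH01.txt
```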

After successful operation, the interface is shown in the following figure.

image-20231030153639673

The blue box represents the keyframes, the green box represents the camera orientation, the black dots represent the saved points, and the red dots represent the points currently seen by the camera.

Click on 【Show Image】 to view the image.

image-20231030153713905

After the test is completed, the keyframe trajectory is saved to the KeyFrameTrajectory.txt file in the current directory.
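KeyFrameTrajectory.txt is written in the TUM trajectory format: one keyframe per line, containing a timestamp, a translation (tx ty tz), and an orientation quaternion (qx qy qz qw). A quick sanity check of the file layout, shown here on an illustrative sample line rather than real output:

```shell
# Each line of a TUM-format trajectory has 8 whitespace-separated fields:
# timestamp tx ty tz qx qy qz qw. The line below is illustrative only,
# not actual ORB-SLAM2 output.
printf '1403636580.838555 0.0 0.0 0.0 0.0 0.0 0.0 1.0\n' > sample_trajectory.txt

# Print the field count of every line; a well-formed file prints 8 for each.
awk '{ print NF }' sample_trajectory.txt
```

The same check can be pointed at the real KeyFrameTrajectory.txt after a run; tools that consume this format (e.g. trajectory evaluation scripts) expect exactly these eight fields.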

2.3 Stereo testing

Taking the EuRoC dataset as an example, run the following command in the ORB_SLAM2 directory:
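The command screenshot is not reproduced here. A typical stereo EuRoC invocation, based on the upstream ORB-SLAM2 examples, is shown below; the dataset path `~/datasets/MH01` is an assumption:

```shell
# Stereo EuRoC example from the upstream ORB-SLAM2 repository.
# ~/datasets/MH01 is an assumed extraction path -- adjust to your setup.
./Examples/Stereo/stereo_euroc \
    Vocabulary/ORBvoc.txt \
    Examples/Stereo/EuRoC.yaml \
    ~/datasets/MH01/mav0/cam0/data \
    ~/datasets/MH01/mav0/cam1/data \
    Examples/Stereo/EuRoC_TimeStamps/MH01.txt
```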

After successful operation, the interface is shown in the following figure.

image-20231030154344371

The blue box represents the keyframe, the green box represents the camera orientation, the black point represents the saved point, and the red point represents the point currently seen by the camera.

Click on 【Show Image】 to view the image.

image-20231030154418875

After the test is completed, the camera trajectory is saved to the CameraTrajectory.txt file in the current directory.

2.4 RGBD testing

Enter the ORB_SLAM2 directory and run the following command:
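The command screenshot is not reproduced here. A typical RGB-D TUM invocation, based on the upstream ORB-SLAM2 examples, is shown below. The dataset path is an assumption, and the associations file must first be generated with the associate.py script from the TUM RGB-D benchmark tools, which pairs RGB and depth images by timestamp:

```shell
# RGB-D TUM example from the upstream ORB-SLAM2 repository.
# ~/datasets/rgbd_dataset_freiburg1_xyz is an assumed path; associations.txt
# is assumed to have been generated beforehand with the TUM associate.py tool.
./Examples/RGB-D/rgbd_tum \
    Vocabulary/ORBvoc.txt \
    Examples/RGB-D/TUM1.yaml \
    ~/datasets/rgbd_dataset_freiburg1_xyz \
    ~/datasets/rgbd_dataset_freiburg1_xyz/associations.txt
```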

After successful operation, the interface is shown in the following figure.

image-20231030154811722

Click on 【Show Image】 to view the image.

image-20231030154954526

After running, CameraTrajectory.txt and KeyFrameTrajectory.txt are also saved in the current directory.

3. ORB_SLAM2 ROS2 camera test

3.1 Monocular testing

Start the camera ORB_SLAM2 test.

image-20231030151254075

When the command is executed, only a green box appears in the ORB_SLAM2: Map Viewer window, while the ORB_SLAM2: Current Frame window shows that initialization is being attempted.

At this point, slowly move the camera up, down, left, and right so that it finds feature points in the image and SLAM can initialize.

image-20231030151414056

Click on 【Show Image】 to view the image.

image-20231030151438658

After the test is completed, the keyframes are saved in the KeyFrameTrajectory.txt file in the directory shown below.

image-20230417160920005

As shown in the figure above, the camera is now in [SLAM Mode] and must continuously acquire every frame of image for camera localization. If the pure localization mode [Localization Mode] shown on the left of the figure is selected instead, the camera will not be able to determine its own position, and the keyframes must be acquired again from scratch.

3.2 RGBD test

Unlike the monocular camera, RGBD does not need to continuously acquire every frame of image.

If you select the pure localization mode in the upper left of the image, localization can be performed using the keyframes just obtained.

image-20231030151048094

Click on 【Show Image】 to view the image.

image-20231030150933786

After the test is completed, the keyframes are saved in the KeyFrameTrajectory.txt file.