5. ORB_SLAM2 Basics

Official website: http://webdiis.unizar.es/~raulmur/orbslam/

TUM Dataset: http://vision.in.tum.de/data/datasets/rgbd-dataset/download

KITTI Dataset: http://www.cvlibs.net/datasets/kitti/eval_odometry.php

EuRoC Dataset: http://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets

orb_slam2_ros: http://wiki.ros.org/orb_slam2_ros

ORB-SLAM: https://github.com/raulmur/ORB_SLAM

ORB-SLAM2: https://github.com/raulmur/ORB_SLAM2

ORB-SLAM3: https://github.com/UZ-SLAMLab/ORB_SLAM3

5.1, Introduction

ORB-SLAM is mainly used for monocular SLAM;

ORB-SLAM2 adds support for three sensor interfaces: monocular, stereo, and RGBD;

ORB-SLAM3 adds tightly coupled IMU fusion and support for fisheye cameras.

All stages of ORB-SLAM use the same ORB features of the image. ORB is a very fast feature extractor that is rotation invariant and achieves scale invariance through an image pyramid. Using a single feature type keeps the SLAM pipeline consistent across feature extraction and tracking, keyframe selection, 3D reconstruction, loop closure detection, and the other stages. The system is robust to aggressive motion, supports wide-baseline loop closing and relocalization, and initializes fully automatically. Because ORB-SLAM is a feature-point-based system, it computes the camera trajectory in real time and produces a sparse 3D reconstruction of the scene.
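To make the rotation-invariance idea concrete, here is a minimal pure-Python sketch (not ORB-SLAM2 code) of the intensity-centroid orientation that ORB assigns to each keypoint; the two toy patches are made-up examples:

```python
import math

def orientation(patch):
    """Intensity-centroid orientation: the angle of the vector from the
    patch center to its intensity centroid, computed from image moments
    m10 and m01. ORB uses this angle to steer the BRIEF descriptor."""
    c = len(patch) // 2
    m10 = m01 = 0.0
    for y, row in enumerate(patch):
        for x, intensity in enumerate(row):
            m10 += (x - c) * intensity
            m01 += (y - c) * intensity
    return math.atan2(m01, m10)

# 7x7 toy patches: bright mass to the right of the center vs. below it
right = [[1 if x > 3 else 0 for x in range(7)] for _ in range(7)]
below = [[1 if y > 3 else 0 for _ in range(7)] for y in range(7)]

print(math.degrees(orientation(right)))  # 0.0
print(math.degrees(orientation(below)))  # 90.0
```

Rotating the patch rotates the centroid by the same angle, so describing the patch relative to this orientation makes the descriptor rotation invariant.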

Based on ORB-SLAM, ORB-SLAM2 contributes:

  1. The first open-source SLAM system for monocular, stereo, and RGBD cameras, including loop closing, relocalization, and map reuse.

  2. The RGBD results show that bundle adjustment (BA) achieves higher accuracy than ICP or methods that minimize photometric and depth error.

  3. By using both close and far stereo points, as well as monocular observations, the stereo results are more accurate than those of direct stereo SLAM algorithms.

  4. A lightweight localization mode that can effectively reuse the map.

ORB-SLAM2 includes modules common to all SLAM systems: tracking, mapping, relocalization, and loop closing. The following figure shows the process of ORB-SLAM2.

[Figure: ORB-SLAM2 system workflow]

5.2, Official Case

5.2.1, Dataset Location

For this example, the EuRoC MH01 dataset and the TUM rgbd_dataset_freiburg1_xyz dataset have already been downloaded.

image-20230417111610512

If you need other datasets, you can download them from the addresses listed at the beginning of this section.

5.2.2, Monocular Test

Here, take the EuRoC dataset as an example and enter the ORB_SLAM2 directory:

Run the following command:
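The original command is not shown here; the sketch below follows the usage described in the ORB_SLAM2 README, and the dataset path `~/dataset/MH01` is an assumption to be adjusted to wherever the sequence was extracted:

```shell
# Monocular EuRoC example (dataset path is an assumption)
./Examples/Monocular/mono_euroc \
    Vocabulary/ORBvoc.txt \
    Examples/Monocular/EuRoC.yaml \
    ~/dataset/MH01/mav0/cam0/data \
    Examples/Monocular/EuRoC_TimeStamps/MH01.txt
```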

The successful operation interface is as follows:

image-20231030153639673

The blue boxes are keyframes, the green box is the current camera pose, the black points are map points already stored in the map, and the red points are the map points currently observed by the camera.

Click [Show Image] to view the image.

image-20231030153713905

After the test, the keyframe trajectory is saved to the KeyFrameTrajectory.txt file in the current directory:
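KeyFrameTrajectory.txt is written in the TUM trajectory format: one keyframe per line as `timestamp tx ty tz qx qy qz qw` (translation plus orientation quaternion). A small reading sketch, with sample values made up for illustration:

```python
# Two made-up lines in the TUM trajectory format ORB-SLAM2 writes
sample = """1403636580.838556 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 1.000000
1403636581.013555 -0.012093 0.026301 0.019367 -0.004706 0.007120 0.003957 0.999956"""

def parse_tum_trajectory(text):
    """Parse TUM-format trajectory lines into a list of pose dicts."""
    poses = []
    for line in text.strip().splitlines():
        t, tx, ty, tz, qx, qy, qz, qw = map(float, line.split())
        poses.append({"t": t, "pos": (tx, ty, tz), "quat": (qx, qy, qz, qw)})
    return poses

poses = parse_tum_trajectory(sample)
print(len(poses))  # 2
```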

5.2.4, Stereo Test

Here, take the EuRoC dataset as an example, enter the ORB_SLAM2 directory, and run the following command:
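The original command is not shown here; the sketch below follows the usage described in the ORB_SLAM2 README, and the dataset path `~/dataset/MH01` is an assumption to be adjusted to wherever the sequence was extracted:

```shell
# Stereo EuRoC example (dataset path is an assumption)
./Examples/Stereo/stereo_euroc \
    Vocabulary/ORBvoc.txt \
    Examples/Stereo/EuRoC.yaml \
    ~/dataset/MH01/mav0/cam0/data \
    ~/dataset/MH01/mav0/cam1/data \
    Examples/Stereo/EuRoC_TimeStamps/MH01.txt
```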

The successful operation interface is as follows:

image-20231030154344371

The blue boxes are keyframes, the green box is the current camera pose, the black points are map points already stored in the map, and the red points are the map points currently observed by the camera.

Click [Show Image] to view the image.

image-20231030154418875

After the test, the camera trajectory is saved to the CameraTrajectory.txt file in the current directory.

5.2.5, RGBD Test

Enter the ORB_SLAM2 directory and run the following command:
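The original command is not shown here; the sketch below follows the usage described in the ORB_SLAM2 README, and the dataset path `~/dataset/rgbd_dataset_freiburg1_xyz` is an assumption. The last argument is an associations file pairing RGB and depth images; the ORB_SLAM2 repository provides some under Examples/RGB-D/associations/, and others can be generated with the TUM benchmark's associate.py script:

```shell
# RGB-D TUM example (dataset path is an assumption)
./Examples/RGB-D/rgbd_tum \
    Vocabulary/ORBvoc.txt \
    Examples/RGB-D/TUM1.yaml \
    ~/dataset/rgbd_dataset_freiburg1_xyz \
    Examples/RGB-D/associations/fr1_xyz.txt
```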

The successful operation interface is as follows:

image-20231030154811722

Click [Show Image] to view the image.

image-20231030154954526

CameraTrajectory.txt and KeyFrameTrajectory.txt will also be saved after the run.

5.3, ORB_SLAM2 ROS2 Camera Test

5.3.1, Monocular Test

Start the camera ORB_SLAM2 test:

image-20231030151254075

When the command is first executed, the [ORB_SLAM2: Map Viewer] window shows only a green box, and the [ORB_SLAM2: Current Frame] window reports that it is trying to initialize. At this point, slowly move the camera up, down, left, and right so that enough feature points are detected in the image and SLAM can initialize.

image-20231030151414056

Click [Show Image] to view the image.

image-20231030151438658

After the test, the keyframe trajectory is saved in the KeyFrameTrajectory.txt file in the following directory:

image-20230417160920005

As shown in the figure above, the system is now in [SLAM MODE]. In monocular operation, every frame must be acquired continuously so the camera pose can be tracked. If the pure localization mode [Localization Mode] shown in the upper left of the figure is selected instead, mapping stops; if the camera then loses track of its position, it must return to SLAM mode and acquire keyframes again.