What is SLAM technology?
Update Time: 2022-11-11 17:33:31
Simultaneous Localization and Mapping (SLAM) is a technology field that addresses the localization and mapping of robots moving in unknown environments.
Simply put, SLAM allows a robot to use its sensors to learn about its environment: where it is, where it is going, how to get there, and what lies ahead. The system then determines its orientation and plans a path based on this environmental information.
A little hard to understand? No problem, let's take an example.
Let's say you are on a business trip to an unfamiliar city. To familiarize yourself quickly with the environment and complete your task of checking into a hotel, you should do the following things.
1. Feature extraction
Observe the surroundings with your eyes and remember their features.
2. Map construction
Construct a 2D or 3D map of the environment in your brain based on the information obtained from your eyes.
3. Bundle Adjustment or EKF
As you walk, you constantly acquire new features and landmarks and adjust your mental map model.
4. Localization
Determine your position based on the feature landmarks you acquired during your earlier walking.
5. Loop-closure Detection
When you have unintentionally walked a long way, match the landmarks in your mind to see if you have returned to the original path.
The above five steps are performed at the same time, which is why the process is called Simultaneous Localization and Mapping.
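The five steps of the hotel example can be sketched in code. The following is a minimal, illustrative toy (not a real SLAM implementation, and all names such as `slam_step` and `corner_0` are invented for this example): a robot walks a square, registers hypothetical landmarks into its map, and checks whether it has come back to where it started (loop closure).

```python
import math

def slam_step(pose, control, observations, landmark_map):
    """One iteration: predict pose from odometry, then register landmarks."""
    x, y, theta = pose
    dist, dtheta = control
    # Step 1 stand-in: features arrive pre-extracted in `observations`.
    # Dead-reckoning prediction (a stand-in for the motion model).
    theta += dtheta
    x += dist * math.cos(theta)
    y += dist * math.sin(theta)
    # Step 2, map construction: place each observed landmark (given as
    # range and bearing relative to the robot) into world coordinates.
    for name, (r, bearing) in observations.items():
        lx = x + r * math.cos(theta + bearing)
        ly = y + r * math.sin(theta + bearing)
        if name in landmark_map:
            # Step 3, adjustment: crudely average with the old estimate
            # (real systems use bundle adjustment or an EKF here).
            ox, oy = landmark_map[name]
            lx, ly = (ox + lx) / 2, (oy + ly) / 2
        landmark_map[name] = (lx, ly)
    return (x, y, theta)

def loop_closed(pose, start, tol=0.5):
    """Steps 4 and 5: localization plus a loop-closure check at the start."""
    return math.hypot(pose[0] - start[0], pose[1] - start[1]) < tol

# Drive a 1 m square: four moves, each preceded by a 90-degree left turn
# (except the first), observing one fake landmark per leg.
pose, start, landmarks = (0.0, 0.0, 0.0), (0.0, 0.0), {}
for i in range(4):
    obs = {f"corner_{i}": (0.2, 0.0)}
    pose = slam_step(pose, (1.0, math.pi / 2 if i else 0.0), obs, landmarks)

print(loop_closed(pose, start))  # -> True: the square returns to the start
```

Because the four legs trace a closed square, the final pose lands back near the origin and the loop-closure check succeeds, while the map now holds the four observed landmarks.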
As an indispensable and important technology for autonomous mobile robots, SLAM technology is receiving more and more attention.
SLAM technology is widely used in robotics, UAVs, driverless vehicles, AR, VR, and other fields, where it relies on sensors to achieve autonomous localization, map construction, path planning, autonomous navigation, and other machine functions.
Laser SLAM or vision SLAM?
The sensors currently used in SLAM are mainly divided into two categories: Lidar-based laser SLAM (Lidar SLAM) and vision-based VSLAM (Visual SLAM).
Like the human eye, the camera in visual SLAM is the main source of external information: it can acquire massive, redundancy-rich texture information from the environment, which is visual SLAM's key advantage.
The camera is often used as the "eyes" of the robot because of its small size, low energy consumption, and low cost, which is the basis of visual SLAM.
The robot uses the camera's image information as a basis to map out its surroundings and then transmits it to the "brain." Finally, the system makes a judgment to complete the robot's positioning.
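The texture information mentioned above is what lets visual SLAM recognize the same scene point from frame to frame. A toy illustration of that matching idea, locating a small image "feature" patch in a new frame by minimizing the sum of squared differences (SSD): real systems use descriptors such as ORB, but the matching principle is the same, and the tiny integer "image" below is invented for the example.

```python
def ssd(patch_a, patch_b):
    """Sum of squared pixel differences between two equal-sized patches."""
    return sum((a - b) ** 2
               for row_a, row_b in zip(patch_a, patch_b)
               for a, b in zip(row_a, row_b))

def find_patch(frame, patch):
    """Slide `patch` over `frame`; return (row, col) of the best match."""
    ph, pw = len(patch), len(patch[0])
    best, best_pos = float("inf"), None
    for r in range(len(frame) - ph + 1):
        for c in range(len(frame[0]) - pw + 1):
            window = [row[c:c + pw] for row in frame[r:r + ph]]
            score = ssd(window, patch)
            if score < best:
                best, best_pos = score, (r, c)
    return best_pos

# A 5x5 "frame" with a distinctive bright 2x2 corner at row 2, col 3.
frame = [[0, 0, 0, 0, 0],
         [0, 0, 0, 0, 0],
         [0, 0, 0, 9, 8],
         [0, 0, 0, 7, 9],
         [0, 0, 0, 0, 0]]
patch = [[9, 8],
         [7, 9]]
print(find_patch(frame, patch))  # -> (2, 3)
```

This also hints at why lighting matters so much for visual SLAM: if illumination changes the pixel values between frames, the SSD of the true match is no longer small.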
However, the image processing involved is difficult and computationally complex, and the technique is easily affected by lighting conditions, so in some cases visual SLAM alone is not enough.
That's why laser SLAM is here to help.
Laser SLAM uses 2D or 3D LiDAR (single- or multi-line LiDAR). 2D LiDAR is generally used on indoor robots (such as floor sweepers), while 3D LiDAR is generally used in unmanned vehicles, robots, AMR/AGV, etc. The emergence and popularity of LiDAR have led to faster and more accurate measurements and richer information.
LiDAR distance measurement is more accurate, its error model is simple, it operates stably outside a few special environments, and its point clouds are comparatively easy to process, so it adapts well to dynamically changing environments. The theoretical research on laser SLAM is also relatively mature, and the corresponding products are more plentiful.
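The core of laser SLAM is aligning consecutive LiDAR scans to estimate how the robot has moved. A minimal sketch of the simplest case, assuming point correspondences are already known (the full ICP algorithm iteratively finds those correspondences): with matched points, the best translation is simply the difference of the two point clouds' centroids. The scan values below are invented for illustration.

```python
def centroid(points):
    """Mean (x, y) of a list of 2D points."""
    n = len(points)
    return (sum(p[0] for p in points) / n,
            sum(p[1] for p in points) / n)

def estimate_translation(scan_prev, scan_curr):
    """Translation mapping scan_prev onto scan_curr (matched index-wise)."""
    cx_p, cy_p = centroid(scan_prev)
    cx_c, cy_c = centroid(scan_curr)
    return (cx_c - cx_p, cy_c - cy_p)

# The same three wall points seen before and after the robot advances
# 0.5 m along x: in the robot's frame, the wall appears to shift by -0.5.
scan_prev = [(2.0, 1.0), (2.0, 0.0), (2.0, -1.0)]
scan_curr = [(1.5, 1.0), (1.5, 0.0), (1.5, -1.0)]
print(estimate_translation(scan_prev, scan_curr))  # -> (-0.5, 0.0)
```

The recovered shift of the scene, negated, is the robot's own motion, which is exactly the odometry that laser SLAM feeds into its map.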
Comparison shows that laser SLAM and visual SLAM each have their own strengths and limitations, and fusing the two lets each compensate for the other's weaknesses.
For example, vision works stably in dynamic, texture-rich environments and can provide very accurate point cloud matching for laser SLAM, while the precise direction and distance information from LiDAR becomes even more powerful on correctly matched point clouds.
Conversely, in environments with severe lighting deficiencies or missing textures, laser SLAM's localization allows the vision system to still record scenes that carry little information.
SLAM technology has already been successfully deployed in many fields, including indoor mobile robots, AR/VR, drones, and uncrewed vehicles.
In the future, the continuous improvement of sensor accuracy and the gradual reduction of cost will bring revolutionary changes to more industry fields.
As SLAM technology heats up, more and more talented people will enter the field of mobile robotics, bringing fresh energy along with new technical directions and research areas.