What is SLAM technology?
Update Time: 2022-11-11 17:33:31
What is SLAM technology?
Simultaneous Localization and Mapping (SLAM) addresses how a robot moving through an unknown environment can build a map of that environment while simultaneously estimating its own position within it.
Simply put, SLAM lets a robot use its sensors to work out where it is, where it is heading, how it is moving, and what lies ahead. From that environmental information, the system can then determine its orientation and plan a path.
A little hard to understand? No problem, let's take an example.
Let's say you are on a business trip to an unfamiliar city. To get your bearings quickly and make your way to your hotel, you would do the following things.
1. Feature extraction
Observe the surroundings with your eyes and remember their features.
2. Map construction
Construct a 2D or 3D map of the environment in your brain based on the information obtained from your eyes.
3. Bundle Adjustment or EKF
As you walk, you constantly acquire new features and landmarks and adjust your mental map model.
4. Trajectory
Determine your position based on the feature landmarks you have acquired along the way so far.
5. Loop-closure Detection
When you have walked a long way without paying attention, match the landmarks in your mind against what you now see to check whether you have returned to your earlier path.
These five steps are performed at the same time, which is exactly what the "Simultaneous" in Simultaneous Localization and Mapping refers to; steps 1 and 5 are sketched in code below.
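To make steps 1 and 5 concrete, here is a minimal sketch using OpenCV's ORB features. cv2.ORB_create, detectAndCompute, BFMatcher, and knnMatch are real OpenCV calls; the image file names and the 0.75 ratio-test threshold are illustrative assumptions rather than part of any fixed pipeline.

```python
# Minimal sketch of step 1 (feature extraction) and step 5 (loop-closure
# candidate check) with OpenCV (pip install opencv-python).
# "frame_a.png" and "frame_b.png" are placeholder file names.
import cv2

orb = cv2.ORB_create(nfeatures=500)               # step 1: feature extractor

img_a = cv2.imread("frame_a.png", cv2.IMREAD_GRAYSCALE)
img_b = cv2.imread("frame_b.png", cv2.IMREAD_GRAYSCALE)

kp_a, des_a = orb.detectAndCompute(img_a, None)   # keypoints + descriptors
kp_b, des_b = orb.detectAndCompute(img_b, None)

# Step 5: match descriptors between the current frame and a previously
# seen frame. Many consistent matches suggest a revisited place.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
matches = matcher.knnMatch(des_a, des_b, k=2)

good = []
for pair in matches:                              # Lowe's ratio test
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
        good.append(pair[0])

print(f"{len(good)} good matches; a high count hints at a loop closure")
```

A real visual SLAM front end would match the current frame against a whole database of keyframes rather than a single stored image, but the core idea is the same.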
As an indispensable technology for autonomous mobile robots, SLAM is receiving more and more attention.
It is widely used in robotics, UAVs, driverless cars, AR, VR, and other fields, where it relies on sensors to provide autonomous localization, map construction, path planning, autonomous navigation, and other machine functions; a toy path-planning example follows.
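As a toy illustration of the path-planning function mentioned above, the sketch below runs a breadth-first search over a hand-written occupancy grid. Real systems usually run A* or D* on maps built from sensor data; the grid, start, and goal here are made up for the example.

```python
# Breadth-first search on a tiny occupancy grid (0 = free, 1 = obstacle).
from collections import deque

grid = [
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
]

def shortest_path(start, goal):
    """Return the shortest obstacle-free path from start to goal, or None."""
    queue, seen = deque([(start, [start])]), {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None

print(shortest_path((0, 0), (2, 4)))   # route around the obstacle wall
```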
Laser SLAM or visual SLAM?
The sensors currently used in SLAM fall mainly into two categories, giving LiDAR-based laser SLAM and camera-based visual SLAM (VSLAM).
The camera often serves as the robot's "eyes" because of its small size, low power consumption, and low cost; this is the basis of visual SLAM. Like the eye, it is a primary source of external information and captures massive, texture-rich (and usefully redundant) data from the environment, which is visual SLAM's main advantage.
The robot maps its surroundings from the camera's image information, transmits the result to its "brain," and the system then makes the judgments needed to complete the robot's positioning.
However, processing that image data is difficult and computationally complex, and the results are easily degraded by lighting conditions, so in some situations visual SLAM alone is not enough.
That's why laser SLAM is here to help.
Laser SLAM uses 2D or 3D LiDAR (single-line or multi-line LiDAR). 2D LiDAR is generally used on indoor robots (such as robot vacuums), while 3D LiDAR is generally used in unmanned vehicles, robots, AMRs/AGVs, and so on. The emergence and spread of LiDAR have made measurements faster and more accurate and the resulting information richer.
LiDAR distance measurement is accurate, its error model is simple, it operates stably in all but a few special environments, and its point clouds are comparatively easy to process, so it adapts well to dynamic, changing environments. Laser SLAM theory is also relatively mature, and the corresponding products are more abundant; a minimal scan-matching sketch is shown below.
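As a rough, NumPy-only sketch of what that point cloud processing looks like, the code below performs one step of point-to-point scan matching: nearest-neighbour correspondences followed by the closed-form Kabsch alignment used inside an ICP loop. The scans here are synthetic; a real system would iterate this with outlier rejection.

```python
# One ICP-style alignment step between two 2D laser scans (NumPy only).
import numpy as np

def icp_step(src, dst):
    """Estimate rotation R and translation t mapping src toward dst."""
    # Brute-force nearest-neighbour correspondences (fine for a toy example).
    dists = np.linalg.norm(src[:, None, :] - dst[None, :, :], axis=2)
    matched = dst[dists.argmin(axis=1)]
    # Closed-form rigid alignment via SVD (Kabsch).
    mu_s, mu_d = src.mean(axis=0), matched.mean(axis=0)
    H = (src - mu_s).T @ (matched - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t

rng = np.random.default_rng(0)
scan_a = rng.random((100, 2)) * 10                 # fake reference scan
theta = np.deg2rad(5)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
scan_b = scan_a @ R_true.T + np.array([0.1, 0.0])  # same scene, robot moved

R, t = icp_step(scan_a, scan_b)                    # rough after one step;
print(R, t)                                        # iterating refines it
```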
Comparing the two, laser SLAM and visual SLAM each have their own strengths and limitations, and fusing them lets each compensate for the other's weaknesses.
For example, vision works stably in texture-rich dynamic environments and can provide very accurate matching for laser SLAM's point clouds, while the precise direction and distance information from LiDAR becomes even more powerful once those point clouds are correctly matched.
Conversely, in environments with severe light deficits or missing textures, laser SLAM's positioning lets the vision system still record scenes that carry little information on their own; a toy example of fusing the two sensors' estimates follows.
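As a minimal sketch of the fusion idea, the function below combines a LiDAR range and a vision-derived range by inverse-variance weighting, which is the one-shot form of a Kalman update; the numbers are illustrative, not real sensor readings.

```python
# Fuse two independent distance estimates by inverse-variance weighting.
def fuse(z_lidar, var_lidar, z_vision, var_vision):
    w = var_vision / (var_lidar + var_vision)    # weight the better sensor more
    z = w * z_lidar + (1 - w) * z_vision
    var = (var_lidar * var_vision) / (var_lidar + var_vision)
    return z, var

# LiDAR: precise range; vision: noisier but still informative.
z, var = fuse(z_lidar=2.00, var_lidar=0.01, z_vision=2.20, var_vision=0.09)
print(z, var)   # -> 2.02, 0.009: close to the LiDAR reading, lower variance
```

Note that the fused variance is smaller than either input variance, which is why combining the two sensors beats relying on either one alone.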
Future Applications
SLAM technology has already been deployed with good results in many fields, including indoor mobile robots, AR/VR, drones, uncrewed vehicles, and so on.
In the future, steadily improving sensor accuracy and gradually falling costs will bring revolutionary changes to even more industries.
As SLAM heats up, more and more talent will enter the field of mobile robotics, injecting fresh blood and opening up new technical directions and research areas.