Sensor Tips

Sensor Product News, Tips, and learning resources for the Design Engineering Professional.

  • Motion Sensing
  • Vision systems
    • Smart cameras
    • Vision software
    • Lighting
    • Optics
  • Pressure
  • Speed
  • Temperature
  • Suppliers
  • Video
    • EE Videos
    • Teardown Videos
  • EE Learning Center
    • Design Guides
    • Tech Toolboxes

How does SLAM help autonomous vehicles?

April 25, 2022 By Randy Frank

SLAM (simultaneous localization and mapping) is used in autonomous vehicles of all types to build a map and simultaneously locate the vehicle in that map. SLAM algorithms allow the vehicle to analyze and map out unknown environments. With this map information, the vehicle can implement path planning and obstacle avoidance.
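The "simultaneous" part can be pictured as a loop that interleaves a map update and a pose update at every step. The sketch below is a toy schematic, not a real SLAM system: the sensor reading and the motion model are invented stand-ins.

```python
# Schematic SLAM loop: update the map and the pose estimate together.
# The "sensor" and motion model are toy stand-ins, not a real algorithm.

def read_range_sensor(step):
    # Toy reading: an obstacle is always detected 3 cells ahead.
    return 3

def slam_step(pose, landmarks, step):
    reading = read_range_sensor(step)
    landmarks.add(pose + reading)   # mapping: record the obstacle position
    pose += 1                       # localization: advance the pose estimate
    return pose

pose, landmarks = 0, set()
for step in range(5):
    pose = slam_step(pose, landmarks, step)
print(pose, sorted(landmarks))     # final pose and mapped obstacle cells
```

With the map accumulated this way, a planner can later consult `landmarks` for obstacle avoidance.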

Visual SLAM uses cameras and/or laser range finders. Where visual sensors are ineffective, alternatives include non-visible data sources combined with basic positional data from an inertial measurement unit (IMU), tactile SLAM using touch sensors, and acoustic SLAM using microphones.

Visual SLAM algorithms use two different techniques to estimate the camera motion. The indirect approach uses the feature points of images to minimize reprojection errors. In contrast, the direct method uses the overall brightness of images to minimize photometric errors. Using the sensor data, the device’s algorithms compute an initial estimate of its position. As new positional information is collected, the estimate is revised and improved.
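The "revised and improved" step can be illustrated with a toy one-dimensional example: each new measurement is blended with the current estimate in a precision-weighted average (a scalar Kalman-style update). The noise values and readings below are invented for illustration only.

```python
# Toy example: refine a 1D position estimate as new measurements arrive.
# Each update blends the current estimate with the new measurement,
# weighted by their uncertainties (variances). Numbers are invented.

def update(estimate, est_var, measurement, meas_var):
    k = est_var / (est_var + meas_var)            # gain: how much to trust new data
    new_est = estimate + k * (measurement - estimate)
    new_var = (1 - k) * est_var                   # uncertainty shrinks each update
    return new_est, new_var

est, var = 0.0, 10.0                              # vague initial guess
for z in [2.1, 1.9, 2.05, 1.95]:                  # noisy readings near 2.0
    est, var = update(est, var, z, 1.0)
print(round(est, 2), round(var, 3))               # estimate converges toward 2.0
```

The same idea, generalized to full 6-DoF poses and thousands of feature points, is what the reprojection- and photometric-error minimizations above carry out.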

A home robot vacuum with SLAM demonstrates the benefits of the design technique. Without SLAM, it would move randomly within a room and require many passes to clean the entire area. With SLAM, it implements a systematic approach to clean the room completely while minimizing travel and power consumption, extending battery life.

In the localization phase, robots with SLAM can use data from cameras and other imaging sensors and other information, such as the number of wheel revolutions, to determine the amount of movement needed. With the mapping capability, the robot can simultaneously use the camera and other sensors to create a map of the obstacles to avoid and a path plan to avoid cleaning the same area twice.
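The wheel-revolution cue mentioned above amounts to simple dead reckoning. The sketch below shows it for a differential-drive robot; the wheel radius and track width are assumed values, not taken from any particular product.

```python
import math

# Dead reckoning for a differential-drive robot: estimate the pose change
# from left/right wheel revolutions alone. Parameters are illustrative.
WHEEL_RADIUS = 0.03   # meters (assumed)
TRACK_WIDTH = 0.25    # distance between the wheels, meters (assumed)

def odometry_step(x, y, theta, rev_left, rev_right):
    dl = 2 * math.pi * WHEEL_RADIUS * rev_left    # left wheel travel
    dr = 2 * math.pi * WHEEL_RADIUS * rev_right   # right wheel travel
    d = (dl + dr) / 2                             # forward distance
    dtheta = (dr - dl) / TRACK_WIDTH              # heading change
    x += d * math.cos(theta + dtheta / 2)
    y += d * math.sin(theta + dtheta / 2)
    return x, y, theta + dtheta

x, y, th = 0.0, 0.0, 0.0
x, y, th = odometry_step(x, y, th, 5, 5)   # equal revolutions: straight line
print(round(x, 3), round(y, 3), round(th, 3))
```

In practice this drifts over time, which is exactly why SLAM fuses it with camera and other imaging-sensor data rather than relying on wheel counts alone.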

A real-world example is the yeedi vac max – the first 2-in-1 robot vacuum/mop to clean both hardwood floors and carpets. Advanced Carpet Identification sensors using ultrasonic technology detect the transition from carpet to wood floors and automatically adjust the machine’s settings, so it does not mop on carpets and increases suction power when it detects carpeting.

Visual SLAM sensor-based mapping technology provides an accurate map view that is customizable with virtual boundaries, designated areas, and cleaning schedules. The system maps out the floor, similar to a GPS view, so cleaning can be performed in consistent rows. With its editable home map, the user can click on the map to clean or avoid certain rooms or targeted areas.
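An editable map with virtual boundaries can be pictured as an occupancy grid in which user-blocked cells are skipped during a row-by-row (boustrophedon) sweep. The grid size and no-go zone below are arbitrary illustrations, not any vendor's actual map format.

```python
# A tiny occupancy grid: 0 = cell to clean, 1 = user-drawn virtual no-go zone.
# The sweep visits free cells row by row, alternating direction each row,
# mimicking the "consistent rows" cleaning pattern.

grid = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],   # virtual boundary drawn by the user
    [0, 0, 0, 0],
]

def sweep(grid):
    path = []
    for r, row in enumerate(grid):
        # Alternate sweep direction each row to minimize travel.
        cols = range(len(row)) if r % 2 == 0 else reversed(range(len(row)))
        for c in cols:
            if row[c] == 0:
                path.append((r, c))
    return path

path = sweep(grid)
print(len(path), path[:4])   # 10 free cells visited; no-go cells skipped
```

Clicking a room on the app's map corresponds to marking or clearing cells in a grid like this before the sweep is planned.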

Resources:

https://geoslam.com/what-is-slam/

https://www.mathworks.com/help/vision/ug/visual-simultaneous-localization-and-mapping-slam-overview.html

https://www.mathworks.com/discovery/slam.html

