Sensor Tips

Sensor Product News, Tips, and learning resources for the Design Engineering Professional.

What sensors are used in AR/VR systems?

May 19, 2022 By Jeff Shepard

A wide range of sensor technologies is required to support augmented reality and virtual reality (AR/VR) systems. Today’s AR/VR implementations are primarily focused on visual and audio interfaces, relying on motion tracking and voice recognition sensors. That’s beginning to change with the introduction of new types of sensors and various types of haptics.

This FAQ begins by reviewing several of the basic position sensing technologies used in current AR/VR systems, presents a proposed AR/VR system for neurorehabilitation and quality of life improvement, reviews how thermal sensing technologies are being developed to provide more complete AR/VR environments, and closes by looking at emerging haptics technologies for thermal- and touch-based feedback.

AR involves creating an environment that integrates existing surroundings with virtual elements. In a virtual reality environment, the system creates the complete ‘reality’ and only needs to know about the person’s relative movements and orientation. A basic VR system uses an inertial measurement unit (IMU) that can include an accelerometer, a gyroscope, and a magnetometer. An AR system needs to know where the person is, but it also needs to understand what the person is seeing, hearing, and how the environment is changing, among other things. As a result, AR uses more complex sensing, starting with an IMU and adding time-of-flight sensors, heat mapping, structured light sensors, and others.

Figure 1: Augmented reality that integrates existing surroundings with virtual elements requires more complex sensing than creating a completely virtual environment. (Image: Association for Computing Machinery)

Both AR and VR are immersive environments; for VR in particular, complete immersion is necessary for a good user experience. Motion sickness is a potential challenge when developing AR/VR systems, so AR/VR devices must capture user movements quickly and accurately. The IMUs and other motion sensor technologies must be highly stable with very low latencies. IMUs used in AR/VR systems combine an accelerometer, a gyroscope, and a magnetometer to facilitate error correction and quickly produce accurate results, enabling the system to track head movement and position.
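As a rough illustration of how this sensor fusion works, the sketch below blends gyroscope and accelerometer readings for a single axis using a complementary filter — one common fusion approach, though commercial headsets typically use more sophisticated Kalman-style filters. All names and values here are illustrative, not taken from any specific AR/VR SDK.

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings for one axis (pitch), in degrees.

    The gyroscope integrates quickly but drifts over time; the accelerometer
    senses gravity and is drift-free but noisy. Blending the two with weight
    `alpha` yields a stable, low-latency orientation estimate.
    """
    gyro_pitch = pitch_prev + gyro_rate * dt                   # integrate angular rate
    accel_pitch = math.degrees(math.atan2(accel_x, accel_z))   # gravity-based tilt
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# One 10 ms update: gyro reports 10 deg/s, accelerometer sees pure gravity on z
pitch = complementary_filter(0.0, 10.0, 0.0, 1.0, 0.01)
```

A magnetometer would be fused the same way on the yaw axis, where the accelerometer provides no correction.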

In addition, both AR and VR systems can benefit from various forms of user feedback and interaction, including gesture and voice recognition. Gesture recognition can be based on real-time video analysis (which can be energy and compute-intensive) or more advanced technologies such as e-field sensing, LIDAR, and advanced capacitive technologies. For a discussion of gesture recognition, see the FAQ “How can a machine recognize hand gestures?”

One key for AR environments is to accurately, quickly, and continuously present computer-generated images of the actual environment. Most AR headsets rely on one or more special types of imaging sensors, including time of flight (ToF) cameras, vertical-cavity surface-emitting laser (VCSEL) based light detection and ranging (LiDAR), binocular depth sensing, or structured-light sensors. Some use a combination of these sensors.

ToF cameras combine a modulated IR light source with a charge-coupled device (CCD) image sensor. They measure the phase delay (time of flight) of the reflected light to calculate the distance between the illuminated object and the camera. Thousands, or millions, of these measurements combine into a ‘point cloud’ database that represents a three-dimensional image of the surrounding area. A more recently developed technology, VCSEL-based LiDAR, can produce higher fidelity point clouds. In addition, VCSEL technology is used in smart glasses and other wearable devices to produce more compact and lower-power displays, as well as gesture recognition systems.
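The phase-delay calculation described above can be sketched in a few lines. Since the modulated light travels to the object and back, the round-trip distance is c·Δφ/(2πf), and the one-way distance halves that. The function name and parameters are illustrative.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(phase_delay_rad, mod_freq_hz):
    """One-way distance from the measured phase delay of modulated light.

    Round trip: d_rt = c * phase / (2 * pi * f); one way is half of that.
    Note the range is ambiguous beyond half the modulation wavelength
    (phase wraps at 2*pi), which is why real ToF cameras often use
    multiple modulation frequencies.
    """
    return C * phase_delay_rad / (4 * math.pi * mod_freq_hz)

# A quarter-cycle delay at 20 MHz modulation corresponds to ~1.87 m
d = tof_distance(math.pi / 2, 20e6)
```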

Structured light sensors project a defined pattern of light (IR or visible) onto the surroundings, and the distortion of the pattern is analyzed mathematically to triangulate the distance to various points. The camera pixel data (a type of primitive point cloud) is compared against the projected pattern, taking into account the camera optics and other factors, to determine the distances to objects and surfaces in the surroundings. While a single structured light sensor can be used, more accurate results are realized when the outputs of two structured light sensors are combined.
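In the simplest stereo or structured-light geometry, the triangulation step reduces to depth from disparity: Z = f·B/d, where f is the focal length in pixels, B the baseline between projector and camera (or between two cameras), and d the pixel shift of a pattern feature. A minimal sketch, with illustrative names:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulated depth of one point: Z = f * B / d.

    focal_px     -- camera focal length, in pixels
    baseline_m   -- projector-to-camera (or camera-to-camera) separation, meters
    disparity_px -- shift of the pattern feature between its expected and
                    observed positions, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# 800 px focal length, 10 cm baseline, 4 px disparity -> 20 m away
z = depth_from_disparity(800.0, 0.1, 4.0)
```

The inverse relationship between disparity and depth is why accuracy degrades with distance: at long range, a sub-pixel disparity error produces a large depth error.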

Additional sensors found in AR systems include directional microphones (which may be replaced in the future by bone-conduction directional audio transducers), various biosensors such as thermal sensors and ambient light sensors, as well as forward- and rear-facing video cameras. Additionally, a wireless link is required to download all sensor data and upload video information for the real-time creation of an immersive and dynamic environment.

Neurorehabilitation

AR/VR technologies are being developed for various medical applications. In one instance, a system for immersive neurorehabilitation exercises using virtual reality (INREX-VR) is being developed based on VR technologies. The INREX-VR system captures real-time user movements and evaluates joint mobility for both upper and lower limbs, recording training sessions and saving electromyography data. A virtual therapist demonstrates the exercises, and the sessions can be configured and monitored using telemedicine connectivity.

Figure 2: Hardware components of the experimental INREX-VR system for immersive neurorehabilitation. (Image: MDPI sensors)

The INREX-VR software was developed to utilize off-the-shelf VR hardware and is suitable for use by patients with neurological disorders, therapists supervising patient rehabilitation, and trainees. The system can evaluate the user’s emotional condition through heart-rate monitoring and their level of stress using skin conductance. Facial recognition can also be implemented when the system is used with a virtual reality (VR) headset.

Heating up future AR/VR

Devices that can add a thermal dimension to AR/VR environments are a ‘hot topic.’ Various thermal sensing and thermal haptics technologies are being pursued (Figure 3). Thermal sensing technologies proposed for AR/VR systems include thermoresistive, pyroelectric, thermoelectric, and thermogalvanic. While these sensors convert thermal energy into electric currents or voltages, the power levels are too low to support energy harvesting but are sufficient to enable thermal sensing. In particular, polymer composites and conducting polymers are expected to find use in future thermal sensors. The mechanical flexibility of polymers makes them especially suitable for wearable thermal sensors. Some of the proposed polymer-based thermosensors can function in contact and non-contact modes, further increasing flexibility. Stability of performance and reaction delays resulting from thermal hysteresis are two challenges that thermosensors face before they are ready for use in AR/VR systems.
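As a simple illustration of the thermoresistive principle, the sketch below converts a resistance reading to temperature using the linear temperature-coefficient model R(T) = R0·(1 + α·(T − T0)). The constants follow a standard Pt100 RTD; a polymer-composite sensor would need its own calibration constants, and a real design would also have to compensate for the hysteresis noted above.

```python
def thermoresistive_temperature(r_measured, r0=100.0, alpha=0.00385, t0=0.0):
    """Temperature (deg C) from a thermoresistive element's resistance (ohms).

    Inverts the linear model R(T) = r0 * (1 + alpha * (T - t0)).
    Defaults match a Pt100 RTD: 100 ohms at 0 deg C, alpha = 0.00385 /degC.
    """
    return t0 + (r_measured / r0 - 1.0) / alpha

# A Pt100 reading 138.5 ohms is at roughly 100 deg C
t = thermoresistive_temperature(138.5)
```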

Figure 3: A range of technologies are being developed to implement thermal sensing and thermal haptics in AR/VR environments. (Image: Advanced Functional Materials)

Thermal haptics, also known as thermal stimulation, is another emerging area of AR/VR that will provide an enhanced level of immersion in AR/VR environments. Thermal haptics are expected to rely on thermoresistive heaters or Peltier devices, which control the temperature of a target area but are more power-hungry than polymer-based thermal sensors. On the plus side, active thermal haptics may have faster response times than the proposed polymer-based thermal sensors. With thermal haptics, users can feel the temperature of a virtual object and have a more realistic interaction with their environment.

Thermoresistive heaters may have lower hysteresis than Peltier devices, but thermoresistive heaters only provide heating, while Peltier devices can provide both heating and cooling. A key factor in developing thermal haptics will be the need for user controls that allow the selection of a range of temperatures, providing stimulation without causing discomfort or burns. New materials will be needed for wearable Peltier devices that are both flexible and lightweight. Materials development is a key activity to enable both thermal sensing and thermal haptics in future AR/VR systems.
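The user-control requirement above — selecting a stimulation temperature without risking discomfort or burns — might be enforced with a clamp like the one below. The 15–42 °C band, the proportional gain, and the function names are illustrative assumptions, not values from any standard.

```python
def peltier_drive(requested_c, skin_temp_c, gain=0.2, lo=15.0, hi=42.0):
    """Proportional drive signal for a Peltier element with a skin-safe clamp.

    Positive output heats, negative cools (a Peltier device can do both,
    unlike a thermoresistive heater). Output is normalized to [-1, 1]
    for whatever driver stage follows.
    """
    setpoint = max(lo, min(hi, requested_c))   # clamp request to the safe band
    drive = gain * (setpoint - skin_temp_c)    # proportional control toward setpoint
    return max(-1.0, min(1.0, drive))          # saturate the drive signal

# A request for 60 deg C is clamped to 42; with skin already at 42, no drive
d = peltier_drive(60.0, 42.0)
```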

Ultrasound haptics

Ultrasound-based haptics rely on the precise control of an array of speakers that emit carefully timed ultrasound pulses. The timing is designed so that the ultrasound waves from the various speakers arrive at the same place simultaneously and create a pressure point that can be felt by human skin (Figure 4). The focal point where the ultrasound waves meet can be repositioned in real time, from moment to moment, based on hand position or other factors.
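The timing scheme described above can be sketched as follows: the speaker farthest from the focal point fires first (zero delay), and each closer speaker waits out its path-length advantage so that all wavefronts arrive together. The names and the 343 m/s figure (speed of sound in room-temperature air) are illustrative.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 deg C

def firing_delays(speaker_xy, focus_xy):
    """Per-speaker firing delays (seconds) that focus the array at focus_xy.

    Each speaker waits (d_max - d_i) / c, where d_i is its distance to the
    focal point, so every wavefront arrives at the focus simultaneously.
    """
    dists = [math.dist(s, focus_xy) for s in speaker_xy]
    d_max = max(dists)
    return [(d_max - d) / SPEED_OF_SOUND for d in dists]

# Two speakers 10 cm apart, focus 10 cm above the first one:
# the farther speaker fires immediately, the nearer one waits
delays = firing_delays([(0.0, 0.0), (0.1, 0.0)], (0.0, 0.1))
```

Recomputing these delays each frame against the tracked hand position is what lets the focal point follow the user's hand.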

Figure 4: The combined vibrations of the ultrasound waves at the focal point create a pressure point that can be felt by human skin. (Image: ultraleap)

A tracking camera serves a dual purpose: it performs gesture recognition and provides the exact position of a person’s hand, enabling the focal point to be positioned at a specific spot on the hand. By controlling the motion of the focal point, it’s possible to create multiple tactile effects. Haptic feedback is expected to be used in AR/VR systems beyond gaming, for example, in controls and interactions for shopping kiosks and vending machines, automotive interiors, building automation, and other areas.

Summary

Sensors are a key technology for creating compelling and immersive AR and VR environments. Because AR environments must blend real-world surroundings with virtual elements, AR sensor systems are more complex than their VR counterparts. New sensor modalities are being developed to enhance the realism of AR/VR systems, including thermal sensors and haptics that combine thermal and tactile sensations.

References

Consumer AR Could Have Saved Lives, Economy, Association for Computing Machinery
Emerging Thermal Technology Enabled Augmented Reality, Advanced Functional Materials
Flexible Virtual Reality System for Neurorehabilitation and Quality of Life Improvement, MDPI sensors
Sensors for AR/VR, Pressbooks
Touch is going virtual, ultraleap

 

 
