Sensor Tips

Sensor Product News, Tips, and learning resources for the Design Engineering Professional.

How are sensors improving maritime navigation?

April 7, 2022 By Jeff Shepard

GPS is important, of course, but radar and the automatic identification system (AIS) also use various sensors to track and report ship positions in real time. Sensor fusion is being developed for autonomous shipping. Hydrophones are being combined with other sensors, artificial intelligence (AI), and machine learning (ML) to identify and track whale movements (apparently, a large number of ships had been running into whales, which is not good for the ship or the whale). And if things go wrong, there's the Emergency Position Indicating Radio Beacon (EPIRB), a device that alerts search and rescue (SAR) services in case of an emergency at sea. This FAQ reviews these applications of sensors in maritime navigation and how sensor fusion is being used in each case to improve system performance and accuracy.

The AIS uses self-organizing time division multiple access (SOTDMA) technology and acts like a transponder operating in the VHF maritime band. It can handle over 4,500 reports per minute with updates every two seconds. The AIS display includes a symbol for ships within radio range, each with a velocity vector for speed and heading (Figure 1). Clicking on the symbols brings up additional information such as the size and name of the ship, course and speed, classification, call sign, and more. The system also estimates the closest point of approach and the time to the closest point of approach.
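The closest point of approach (CPA) and time to CPA (TCPA) follow from simple relative-motion geometry. A minimal sketch of that calculation (not the actual AIS implementation; positions in a flat local frame in km, speeds in km/h, all values hypothetical):

```python
import math

def cpa_tcpa(p1, v1, p2, v2):
    """Closest point of approach (CPA) distance and time (TCPA)
    between two ships with positions p* (km) and velocities v* (km/h)."""
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]   # relative position
    vx, vy = v2[0] - v1[0], v2[1] - v1[1]   # relative velocity
    speed_sq = vx * vx + vy * vy
    if speed_sq == 0:                        # same velocity: range never changes
        return math.hypot(rx, ry), 0.0
    t = max(0.0, -(rx * vx + ry * vy) / speed_sq)  # time of closest approach
    return math.hypot(rx + vx * t, ry + vy * t), t

# Own ship heading east at 20 km/h; target 10 km ahead and 5 km north, heading west
dist, t = cpa_tcpa((0, 0), (20, 0), (10, 5), (-20, 0))
print(f"CPA {dist:.1f} km in {t:.2f} h")  # → CPA 5.0 km in 0.25 h
```

A shipboard implementation would work in latitude/longitude and knots, but the geometry is the same.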

Figure 1: Typical display of AIS data in the control room of a ship. (Image: Wikipedia)

Under the Safety of Life at Sea (SOLAS) Convention, basic AIS is mandatory for all passenger ships regardless of size, for ships larger than 300 gross tons engaged in international voyages, and for all ships over 500 gross tons, whether or not they are engaged in international voyages.

Sensor fusion for location

Sensor fusion is being added to increase the accuracy of AIS information. For example, fusing basic AIS information with the ship's radar can help plot suggested routes automatically. An API is also available that delivers satellite-based AIS (S-AIS) data enhanced with machine learning algorithms and collected by a constellation of more than 40 nanosatellites. The S-AIS data can be further enhanced when it is correlated with data from radar and other SAR-related software and tools. Accurate and timely information about ship location is the first step toward autonomous navigation.
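A common way to combine two independent position estimates, such as an AIS fix and a radar fix, is inverse-variance weighting, in which the more certain measurement receives the larger weight. A sketch of the idea (the one-dimensional setup and all numbers are illustrative, not from the API mentioned above):

```python
def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance weighted fusion of two scalar estimates.
    The fused variance is always <= the smaller input variance."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# AIS position estimate good to ~100 m (variance 10000 m^2);
# radar fix good to ~50 m (variance 2500 m^2)
pos, var = fuse(1000.0, 10000.0, 1020.0, 2500.0)
print(round(pos, 1), round(var, 1))  # → 1016.0 2000.0
```

Note that the fused estimate lands closer to the radar fix, reflecting its smaller variance.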

Sensor fusion for autonomous navigation

As with automobiles and trucks, autonomous navigation for ships is still under development and will rely on sensor fusion. However, the mix of sensors being considered for maritime autonomous navigation is somewhat different from that of the systems being developed for cars and trucks. As currently envisioned, maritime autonomous navigation will be based on four sensor modalities: sensors for precise positioning, such as S-AIS data, global navigation satellite system (GNSS) receivers, wind sensors, and inertial measurement units (IMUs); visual sensors, including stereo and monocular cameras; microphones for identifying various sounds; and radar and LiDAR for remote sensing (Figure 2).
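The four modalities can be bundled into a single time-stamped structure for a downstream fusion stage to consume. This is only an illustrative sketch; the field names are assumptions, not taken from any system cited here:

```python
from dataclasses import dataclass, field

@dataclass
class SensorFrame:
    """One time-stamped bundle of readings from the four modality groups."""
    timestamp: float
    position: dict = field(default_factory=dict)   # S-AIS, GNSS, wind, IMU
    visual: dict = field(default_factory=dict)     # stereo/monocular cameras
    acoustic: dict = field(default_factory=dict)   # microphone detections
    remote: dict = field(default_factory=dict)     # radar and LiDAR returns

frame = SensorFrame(
    timestamp=1649318400.0,
    position={"gnss": (34.25, -119.27), "heading_deg": 271.0},
    remote={"radar_contacts": 3},
)
print(frame.remote["radar_contacts"])  # → 3
```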

Figure 2: Using sensor fusion, basic AIS data can be combined with radar, acoustic, visual, and other sensor inputs to create a more complete understanding of a ship’s surroundings. (Image: Finnish Geospatial Research Institute)

The perception tasks are similar to those of ground-based navigation and relate to well-defined problems, such as localizing situational abnormalities and classifying nearby vessels. AI and ML processes are being developed to solve these perception challenges. One notable difference is that ships are far less nimble than cars or trucks: they have greater inertia and change speed and direction more slowly. Because they also travel much more slowly, they have more time to gather and process satellite data and audio information about their surroundings.

As with cars and trucks, visual and IR cameras, radar, and LiDAR each offer operational tradeoffs. Visual cameras can use color to identify objects such as other ships, while IR cameras remain useful in all conditions, including daylight, fog, rain, and nighttime, when visual cameras fail. Cameras can be supplemented or supplanted by radar or LiDAR. Marine radars have traditionally operated in the S- and X-bands, which lack the resolution needed for autonomous operation; Ka- and W-band radars developed for automotive navigation are being adapted for marine environments, and LiDAR is being adopted as well. Because the marine environment is less dynamic, conventional LiDAR suffices and Doppler LiDAR is not needed.

In addition to improved safety, the financial benefits of autonomous navigation for ships are expected to be significant. Participants in the Advanced Autonomous Waterborne Applications Initiative (AAWA) have developed preliminary designs for unmanned cargo ships. In these designs, the bridge, crew cabins, water supply, environmental control, and sewage systems are eliminated from container ships. The result is a 5 percent reduction in weight, a 15 percent reduction in fuel costs, and an almost complete elimination of crew costs (crews are still needed on board in harbors and while docking), yielding an estimated 44 percent reduction in operational costs.
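To see how per-category savings roll up into a total operational-cost reduction, multiply each category's share of operating cost by its reduction and sum. The cost shares below are purely hypothetical (the AAWA breakdown is not given here), so the result is illustrative only:

```python
# Hypothetical operating-cost shares (fractions of total opex) and the
# reduction achieved in each category; all values are illustrative only.
categories = {
    "fuel": (0.50, 0.15),   # 50% of opex, cut 15%
    "crew": (0.35, 0.95),   # 35% of opex, almost fully eliminated
    "other": (0.15, 0.00),  # unchanged
}

total_saving = sum(share * cut for share, cut in categories.values())
print(f"Total opex reduction: {total_saving:.1%}")
```

With a larger crew share, the total approaches the 44 percent figure cited above.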

Save the whales

A unique blend of sensors is being deployed in the Santa Barbara Channel off the coast of Southern California to prevent collisions between ships and the blue, humpback, and fin whales that migrate through that narrow and very busy waterway. Called ‘Whale Safe,’ the system combines acoustic buoys that listen for whale sounds with whale sightings by naturalists and guides. It overlays that data with a whale habitat model developed using data collected from 104 satellite-tagged blue whales to determine likely whale locations. That information is sent to satellites that downlink it to ships in the channel to help them avoid whale collisions (Figure 3).

Figure 3: Whale Safe is based on a unique merger of automated sensing and manual observations to create the data needed for ships to avoid colliding with whales. (Image: Whale Safe)

The acoustic buoys use a hydrophone (underwater microphone) and an integrated computer to detect and identify sounds in the channel. The resulting audio data is transmitted via satellite to scientists, who confirm whether any of the sounds are whale calls and what type of whale produced them. Because sounds travel differently as underwater conditions change, and because whales communicate at different pitches and volumes, the sounds detected by a buoy indicate only that one or more whales are in the vicinity. The system can't determine how many whales are present, how far away they are, or in what direction they lie from the buoy.

The coarse location information from the buoys helps to inform and direct the activities of naturalists and guides on ships and survey aircraft in the channel when they are looking for whales. When whales are sighted, observers record the date, time, and location of the sighting, along with the number of whales and their species. The Whale Alert and Spotter Pro mobile apps are used to transmit the observations to a database. The database also includes data collected during monthly aerial surveys of the Santa Barbara Channel shipping lanes. Each survey flight consists of two transects that cover approximately 180 nautical miles.

Given the imprecise nature of the audio and visual data, a whale habitat model has been developed to predict the likely locations of whales in the channel. The habitat model combines the audio and visual data with the current oceanographic conditions and uses AI/ML to predict the habitat suitability (0-100 percent) and estimate the likelihood that whales are present in a series of 10 x 10 km grid cells for the Santa Barbara Channel region. Parameters in the habitat model include salinity, chlorophyll-a concentrations, sea surface temperature, and other water column properties (measured by another series of buoys in the channel). When the model identifies a high probability of whales, it issues a warning to ships in the corresponding grid cells, enabling the ships to optimize their resources to avoid whale collisions.
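The final warning step amounts to thresholding the per-cell suitability scores. A sketch with made-up cell IDs and an assumed 75 percent warning threshold (the actual Whale Safe cutoff is not stated here):

```python
def cells_to_warn(suitability, threshold=75.0):
    """Return grid cells whose predicted habitat suitability (0-100)
    meets or exceeds the warning threshold."""
    return sorted(cell for cell, score in suitability.items() if score >= threshold)

# Hypothetical model output for four 10 x 10 km grid cells
model_output = {"A1": 82.0, "A2": 40.0, "B1": 91.5, "B2": 60.0}
print(cells_to_warn(model_output))  # → ['A1', 'B1']
```

Ships transiting the flagged cells would then receive the warning and can slow down or adjust course.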

Saving seafarers

An Emergency Position Indicating Radio Beacon (EPIRB) is a device that alerts search and rescue (SAR) services in case of an emergency at sea. EPIRBs are supported by the S-AIS system, which combines AIS devices and a series of satellites. Sensor fusion supporting the current EPIRB system spans ground-based and space-based elements (Figure 4), including:

  • The EPIRB that is activated in a life-threatening situation.
  • SAR repeaters (SARRs) and SAR signal processors (SARPs) in orbiting satellites. A SARR or SARP is a secondary payload carried by several types of satellites;
    • Five satellites in polar low-altitude Earth orbits,
    • Eleven satellites in geostationary Earth orbits, and
    • 46 satellites in medium-altitude Earth orbits.
  • Satellite downlink receiving and signal processing ground-based terminals.
  • Mission control centers (MCCs) that distribute beacon location data to rescue coordination centers (RCCs).
  • RCCs to coordinate the actual rescue missions.
Figure 4: The EPIRB network combines marine, terrestrial, and satellite resources to save the lives of seafarers. (Image: Marine Expert)
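The low-altitude satellites locate a 406 MHz beacon partly from the Doppler shift of its signal as the satellite passes overhead. A back-of-the-envelope estimate of the maximum shift (the 7 km/s radial rate is an assumed round number for a low Earth orbit, not a system specification):

```python
def doppler_shift_hz(f0_hz, radial_velocity_ms, c_ms=299_792_458.0):
    """First-order Doppler shift: positive when the satellite is closing
    on the beacon, zero at the point of closest approach."""
    return f0_hz * radial_velocity_ms / c_ms

# 406 MHz beacon, satellite closing at ~7 km/s (assumed worst case)
shift = doppler_shift_hz(406e6, 7000.0)
print(f"{shift:.0f} Hz")  # → 9480 Hz
```

The zero-crossing of the shift marks closest approach, which is what lets the ground segment solve for the beacon's position from several passes.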

Traditional terrestrial-based coastal AIS provides coverage only up to 50 nautical miles offshore. S-AIS provides global coverage. S-AIS also supports correlation with independent sensor systems such as shipboard radars, optical imaging satellites, and other SAR-related tools. Correlating various data sources provides the ability to identify all the vessels in need and helps optimize rescue efforts.

Summary

The range of sensors used in maritime applications is wider and more complex than in most terrestrial systems. Sensors and sensor fusion are used across a range of maritime activities, from tracking ship locations to saving whales and the lives of seafarers. In the future, sensor fusion is expected to enable autonomous container ships and other vessels, reducing shipping costs.

References

Artificial Intelligence / Machine Learning Sensor Fusion for Autonomous Vessel Navigation, IEEE Transactions on Intelligent Transportation Systems
Automatic Identification System, Department of Homeland Security
Automatic Identification System, Wikipedia
Maritime AI-NAV, Finnish Geospatial Research Institute
Unmanned Ships: Navigation and More, Gyroscopy and Navigation
Whale Safe Methodology, Whale Safe

 

