Of course, GPS is important, but it is only one part of maritime navigation. Radar and the automatic identification system (AIS) track and report ship positions in real time, and sensor fusion is being developed for autonomous shipping. Hydrophones are being combined with other sensors, artificial intelligence (AI), and machine learning (ML) to identify and track whale movements (a surprising number of ships have been running into whales, which is not good for the ship or the whale). If things go wrong, there’s the emergency position indicating radio beacon (EPIRB), a device that alerts search and rescue (SAR) services in an emergency at sea. This FAQ reviews these applications of sensors in maritime navigation and how sensor fusion is being used in each case to improve system performance and accuracy.
The AIS uses self-organizing time division multiple access (SOTDMA) technology and acts like a transponder operating in the VHF maritime band. It can handle over 4,500 reports per minute, with updates as often as every two seconds. The AIS display shows a symbol for each ship within radio range, with a velocity vector indicating its speed and heading (Figure 1). Clicking on a symbol brings up additional information, such as the ship’s name and size, course and speed, classification, call sign, and more. The system also estimates the closest point of approach (CPA) and the time to the closest point of approach (TCPA).
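The CPA and TCPA estimates follow from simple geometry when both ships are modeled as constant-velocity points. A minimal sketch (the function name and the flat-earth approximation are illustrative, not how any particular AIS display implements it):

```python
import math

def cpa_tcpa(own_pos, own_vel, tgt_pos, tgt_vel):
    """Closest point of approach (CPA) and time to CPA (TCPA) for two
    ships modeled as constant-velocity points on a flat plane.
    Positions in nautical miles, velocities in knots;
    returns (cpa_nm, tcpa_hours)."""
    # Relative position and velocity of the target w.r.t. own ship
    rx, ry = tgt_pos[0] - own_pos[0], tgt_pos[1] - own_pos[1]
    vx, vy = tgt_vel[0] - own_vel[0], tgt_vel[1] - own_vel[1]
    v2 = vx * vx + vy * vy
    if v2 == 0.0:                 # same course and speed: range never changes
        return math.hypot(rx, ry), 0.0
    # Minimize |r + v*t|^2  ->  t = -(r . v) / |v|^2
    tcpa = max(-(rx * vx + ry * vy) / v2, 0.0)  # clamp: CPA already passed
    cpa = math.hypot(rx + vx * tcpa, ry + vy * tcpa)
    return cpa, tcpa

# Target 10 nm due north, heading south at 10 knots; own ship stationary:
# the ships meet head-on in one hour.
cpa, tcpa = cpa_tcpa((0.0, 0.0), (0.0, 0.0), (0.0, 10.0), (0.0, -10.0))
```

A real implementation would also convert latitude/longitude reports into local planar coordinates before applying this geometry.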
Under the Safety of Life at Sea (SOLAS) Convention, a basic AIS is mandatory for all passenger ships regardless of size, for ships larger than 300 gross tons engaged on international voyages, and for all ships over 500 gross tons, whether or not they are engaged on international voyages.
Sensor fusion for location
Sensor fusion is being added to increase the accuracy of AIS information. For example, fusing basic AIS information with the ship’s radar can help plot suggested routes automatically. An API is also available that delivers satellite-based AIS (S-AIS) data, gathered by a constellation of more than 40 nanosatellites and enhanced with machine learning algorithms. The S-AIS system can be further enhanced when its data is correlated with radar data and other SAR-related software and tools. Accurate and timely information about ship location is the first step toward autonomous navigation.
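A common way to fuse two independent position estimates, such as an AIS report and a radar fix, is inverse-variance weighting: the fused estimate leans toward the more precise sensor, and its uncertainty is smaller than either input's. A one-dimensional sketch, assuming independent Gaussian errors (the numbers are made up):

```python
def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance weighted fusion of two independent 1-D position
    estimates. Returns the fused estimate and its (smaller) variance."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# AIS reports 100.0 nm along a track line (variance 0.04 nm^2);
# radar measures 100.6 nm (variance 0.16 nm^2).
pos, var = fuse(100.0, 0.04, 100.6, 0.16)
```

This is the static special case of the update step in a Kalman filter, which is what a production tracker would use to fuse streams of AIS, radar, and GNSS measurements over time.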
Sensor fusion for autonomous navigation
As with automobiles and trucks, autonomous navigation for ships is still under development and will rely on sensor fusion. The mix of sensors being considered for maritime autonomous navigation is somewhat different from the systems being developed for cars and trucks. As currently envisioned, maritime autonomous navigation will be based on four sensor modalities: sensors for precise positioning, such as S-AIS data, global navigation satellite system (GNSS) receivers, wind sensors, and inertial measurement units (IMUs); visual sensors, including stereo and monocular cameras; microphones for identifying various sounds; and radar and LiDAR for remote sensing (Figure 2).
The perception tasks are similar to those in ground-based navigation and involve well-defined problems, such as localizing situational abnormalities and classifying nearby vessels. AI and ML processes are being developed to solve these perception challenges. One notable difference is that ships are far less nimble than cars or trucks: they move much more slowly, which allows more time for situational analysis, but their greater inertia also means they change speed and direction more slowly. The extra time lets ships gather and process more satellite data and audio information about their surroundings.
As with cars and trucks, visual and IR cameras, radar, and LiDAR offer operational tradeoffs. Visual cameras can use color to identify objects such as other ships, while IR cameras work in all conditions, including daylight, fog, rain, and nighttime, identifying objects when visual cameras cannot. Cameras can be supplemented or supplanted by radar or LiDAR. Marine radars have traditionally operated in the S- and X-bands, but those bands offer insufficient resolution for autonomous operation, so Ka- and W-band radars developed for automotive navigation are being adapted for marine environments, and LiDAR is being adopted as well. Because the marine environment is less dynamic, conventional LiDAR suffices; Doppler LiDAR is not needed.
In addition to improved safety, the financial benefits of developing autonomous navigation for ships are expected to be significant. Participants in the Advanced Autonomous Waterborne Applications Initiative (AAWA) have developed preliminary designs for unmanned cargo ships that eliminate the bridge, crew cabins, and the water supply, environmental control, and sewage systems. The result is a 5 percent reduction in weight, a 15 percent reduction in fuel costs, and a near-complete elimination of crew costs (crews are still needed on board in harbors and when docking), for an estimated 44 percent reduction in operational costs.
Save the whales
A unique blend of sensors is being deployed in the Santa Barbara Channel off the coast of Southern California to prevent collisions between ships and the blue, humpback, and fin whales that migrate through that narrow and very busy waterway. Called ‘Whale Safe,’ the system combines acoustic buoys that listen for whale sounds with whale sightings by naturalists and guides. It overlays that data with a whale habitat model developed using data collected from 104 satellite-tagged blue whales to determine likely whale locations. That information is sent to satellites that downlink it to ships in the channel to help them avoid whale collisions (Figure 3).
The acoustic buoys use a hydrophone (underwater microphone) and an integrated computer to detect and identify sounds in the channel. The resulting audio data is transmitted via satellite to scientists, who confirm whether any of the sounds are whale calls and what type of whale produced them. Because sound travels differently as underwater conditions change, and whales call at different pitches and volumes, a detection indicates only that one or more whales are in the vicinity; the system can’t determine how many whales are present, how far away they are, or in what direction they lie from the buoy.
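A crude way to picture how a buoy’s computer might pre-screen audio is to measure how much of a frame’s spectral energy falls in the low-frequency range where blue and fin whale calls occur (roughly the tens of Hz). This toy detector is an illustration under stated assumptions, not Whale Safe’s actual classifier, which uses trained models plus human review:

```python
import math

def _bin_power(samples, k):
    """Power in DFT bin k of a real-valued frame (direct evaluation;
    fine for short frames, real systems use an FFT)."""
    n = len(samples)
    re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(samples))
    im = sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(samples))
    return re * re + im * im

def candidate_whale_call(samples, fs, band=(10.0, 40.0), threshold=0.5):
    """Flag a hydrophone frame as a candidate whale call when the fraction
    of its spectral energy inside the low-frequency band exceeds a
    threshold. Band edges and threshold are illustrative assumptions."""
    n = len(samples)
    powers = [_bin_power(samples, k) for k in range(1, n // 2 + 1)]
    total = sum(powers)
    in_band = sum(p for k, p in enumerate(powers, start=1)
                  if band[0] <= k * fs / n <= band[1])
    return bool(total > 0 and in_band / total > threshold)

fs = 200                                                        # 200 Hz sample rate, 1 s frame
call = [math.sin(2 * math.pi * 20 * i / fs) for i in range(fs)]   # 20 Hz tone, call-like
noise = [math.sin(2 * math.pi * 80 * i / fs) for i in range(fs)]  # 80 Hz tone, not call-like
```

Even a perfect classifier faces the limitations described above: band energy says nothing about the number, range, or bearing of the calling whales.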
The coarse location information from the buoys helps direct the naturalists and guides on ships and survey aircraft in the channel as they look for whales. When whales are sighted, observers record the date, time, location, number of whales, and species. The Whale Alert and Spotter Pro mobile apps transmit the observations to a database, which also includes data collected during monthly aerial surveys of the Santa Barbara Channel shipping lanes. Each survey flight consists of two transects covering approximately 180 nautical miles.
Given the imprecise nature of the audio and visual data, a whale habitat model has been developed to predict the likely locations of whales in the channel. The habitat model combines the audio and visual data with current oceanographic conditions and uses AI/ML to predict habitat suitability (0-100 percent) and estimate the likelihood that whales are present in a series of 10 x 10 km grid cells covering the Santa Barbara Channel region. Parameters in the habitat model include salinity, chlorophyll-a concentrations, sea surface temperature, and other water column properties (measured by another series of buoys in the channel). When the model identifies a high probability of whales, it issues a warning for the corresponding grid cells so that ships there can take precautions, such as slowing down, to avoid whale collisions.
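The warning step can be sketched as a simple scan over the grid of suitability scores; the 75 percent threshold below is a made-up illustration, not Whale Safe’s actual criterion:

```python
def cells_to_warn(suitability, threshold=75.0):
    """Given habitat-suitability scores (0-100 percent) for a grid of
    10 x 10 km cells, return the (row, col) indices of cells whose score
    exceeds a warning threshold. The threshold is an assumption made
    for illustration."""
    return [(r, c)
            for r, row in enumerate(suitability)
            for c, score in enumerate(row)
            if score > threshold]

# A tiny 2 x 2 grid: two cells score high enough to trigger warnings.
grid = [[10.0, 80.0],
        [90.0, 30.0]]
warned = cells_to_warn(grid)
```

In practice, each flagged cell would be mapped back to its geographic bounds before the warning is broadcast to ships in the area.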
An emergency position indicating radio beacon (EPIRB) is a device that alerts search and rescue (SAR) services in an emergency at sea. EPIRBs are supported by the S-AIS system, which combines AIS devices and a series of satellites. The sensor fusion supporting the current EPIRB system includes ground-based and space-based elements (Figure 4):
- The EPIRB itself, which is activated in a life-threatening situation;
- SAR repeaters (SARRs) and SAR signal processors (SARPs) carried as secondary payloads on several types of satellites;
- Five satellites in low-altitude polar Earth orbits;
- Eleven satellites in geostationary Earth orbits;
- 46 satellites in medium-altitude Earth orbits;
- Ground-based terminals that receive the satellite downlink and process the signals;
- Mission control centers (MCCs) that distribute beacon location data to rescue coordination centers (RCCs); and
- RCCs that coordinate the actual rescue missions.
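The low-altitude satellites can locate a beacon from the Doppler shift of its 406 MHz distress signal: as the satellite passes overhead, the received frequency sweeps from above to below the transmitted frequency, and the shape of that sweep pins down the beacon’s position. A minimal sketch of the underlying non-relativistic frequency relationship, with illustrative numbers:

```python
def doppler_shift(f0_hz, radial_velocity_ms, c=299_792_458.0):
    """Observed frequency of a beacon transmitting at f0_hz, as seen from
    a satellite closing on the beacon (negative radial velocity) or
    receding from it (positive). Non-relativistic approximation."""
    return f0_hz * (1.0 - radial_velocity_ms / c)

f0 = 406.0e6                               # 406 MHz distress-beacon band
approaching = doppler_shift(f0, -7000.0)   # ~7 km/s low-orbit pass, closing
receding = doppler_shift(f0, 7000.0)       # same pass after the overhead point
shift_hz = approaching - f0                # roughly +9.5 kHz at closest approach speed
```

The moment the shift crosses zero gives the time of closest approach, and the steepness of the crossing constrains how far the beacon lies from the satellite’s ground track.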
Traditional terrestrial-based coastal AIS provides coverage only up to about 50 nautical miles offshore; S-AIS provides global coverage. S-AIS also supports correlation with independent sensor systems, such as shipboard radars, optical imaging satellites, and other SAR-related tools. Correlating these data sources helps identify all the vessels in need and optimize rescue efforts.
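Correlation can be as simple as matching radar contacts against AIS reports by proximity; contacts with no nearby AIS report may be vessels that are not transmitting, which can be exactly the ones rescuers most need to notice. A toy sketch (the half-nautical-mile gate and the flat-plane track format are assumptions):

```python
import math

def uncorrelated_contacts(ais_tracks, radar_tracks, max_sep_nm=0.5):
    """Return radar contacts (x, y) in nautical miles that lie farther
    than max_sep_nm from every AIS-reported position. A contact within
    the gate is treated as the same vessel as the AIS report."""
    unmatched = []
    for rx, ry in radar_tracks:
        if not any(math.hypot(rx - ax, ry - ay) <= max_sep_nm
                   for ax, ay in ais_tracks):
            unmatched.append((rx, ry))
    return unmatched

ais = [(0.0, 0.0), (5.0, 5.0)]       # AIS-reported positions
radar = [(0.1, 0.0), (9.0, 9.0)]     # radar contacts; the second has no AIS match
silent = uncorrelated_contacts(ais, radar)
```

A production correlator would also compare course and speed and track the association over time rather than gating on a single position snapshot.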
The range of sensors used in maritime applications is wider and more complex than in most terrestrial systems. Sensors and sensor fusion are used across a range of maritime activities, from tracking ship locations to saving whales and the lives of seafarers. In the future, sensor fusion is expected to enable autonomous container ships and other vessels, reducing shipping costs.