NRL Tests Autonomous Multi-Sensor Motion-Tracking and Interrogation System

The Naval Research Laboratory (NRL) and the Space Dynamics Laboratory (SDL) have been working with the support of the Office of Naval Research (ONR) on an autonomous multi-sensor motion-tracking and interrogation system.

This system reduces the workload for analysts by automatically finding moving objects, then presenting high-resolution images of those objects with no human input.

Intelligence, surveillance and reconnaissance (ISR) assets tend to generate huge amounts of data that can overwhelm human handlers. This can limit an analyst's ability to produce fast, complete intelligence reports, as required in real-time operations, and it is the problem the new system is designed to address.

Dr. Brian Daniel, a research physicist in NRL's ISR Systems and Processing Section, said that these tests demonstrate how a single imaging sensor can provide imagery of multiple tracked objects, a job typically requiring multiple sensors.

The interrogation sensor was the precision, jitter-stabilized EyePod developed under the Fusion, Exploitation, Algorithm, and Targeting High-Altitude Reconnaissance (FEATHAR) program. EyePod is a dual-band visible-near infrared and long-wave infrared sensor mounted inside a nine-inch gimbal pod assembly designed for small UAV platforms.

The mid-wave infrared nighttime WAPSS (N-WAPSS) was chosen as the wide-area sensor. It has a 16-megapixel, large-format camera that captures single frames at 4 Hz and offers a step-stare capability with a 1 Hz refresh rate.
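The article does not describe the cueing logic itself, but the architecture it outlines is a wide-area sensor that detects and tracks movers while a single narrow-field interrogation sensor is steered from track to track. As an illustration only, a minimal cue scheduler for that pattern might prioritize tracks and fill the available dwell budget; all names and parameters below (`Track`, `schedule_cues`, `dwell_s`, `budget_s`) are hypothetical, not part of the NRL system.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Track:
    """A mover reported by the wide-area sensor (hypothetical model)."""
    track_id: int
    x: float      # ground position, arbitrary units
    y: float
    speed: float  # used here as a simple priority proxy

def schedule_cues(tracks: List[Track], dwell_s: float, budget_s: float) -> List[int]:
    """Order tracks for interrogation, fastest movers first, and
    return the track IDs that fit within the sensor's time budget."""
    ordered = sorted(tracks, key=lambda t: t.speed, reverse=True)
    cued: List[int] = []
    used = 0.0
    for t in ordered:
        if used + dwell_s > budget_s:
            break  # no time left to slew and dwell on another track
        cued.append(t.track_id)
        used += dwell_s
    return cued

tracks = [Track(1, 0.0, 0.0, 5.0), Track(2, 1.0, 1.0, 1.0), Track(3, 2.0, 0.5, 9.0)]
print(schedule_cues(tracks, dwell_s=2.0, budget_s=5.0))  # → [3, 1]
```

A real cross-cueing system would also account for slew time between targets, track confidence, and revisit deadlines; the sketch only shows the basic detect-then-interrogate handoff the article describes.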

Dr. Michael Duncan, ONR program manager, said the demonstration was a complete success. Not only did the networked sensing demonstration achieve simultaneous real-time tracking, sensor cross-cueing and inspection of multiple vehicle-sized objects, it also showed an ability to follow smaller, human-sized objects under specialized conditions, he added.

Citations

Choi, Andy. (2019, February 24). NRL Tests Autonomous Multi-Sensor Motion-Tracking and Interrogation System. AZoSensors. https://www.azosensors.com/news.aspx?newsID=3152.
