See something, do something: Sensors in battlefield dominance

By bringing together compute power and sensor technology, warfighters can process, analyze and collaborate to make decisions in real time and take advantage of AI and machine learning models.

Two recent technological shifts have improved situational awareness on the battlefield. The first is bringing compute power to the edge, which cuts the time needed to process data. The second is sensor fusion, which enables the gathering and integration of a much broader set of information. Together, these changes let the military respond to real-time events more quickly and make better-informed decisions. With cloud technology and faster memory architectures becoming the norm, the goal is to bring strategic and tactical details together for rapid time to insight.

20th-century technology in a 21st-century world

Military units in the field might be working with 20-year-old storage, networking and compute technology that must process data in stages to deliver insights. For example, photo, video or audio data is sent to the cloud or to back-end datacenter infrastructure. Analysts then process and review the data, synthesize results and outcomes, and send information back to the warfighter. The more the data moves around, and the more the process depends on outside factors, the greater the delay.

A far better path to success is getting updated technology in-theater. The side that gets the freshest data, analyzed in real time and faster than the adversary, gains the advantage. On the 21st-century battlefield, actionable intelligence is available because compute power sits right where data is collected, and warfighters can process, analyze and correlate information as that collection occurs. Machine learning and artificial intelligence power this ability to correlate and predict.
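What that looks like in practice can be sketched in a few lines. The example below is a minimal, illustrative sketch of edge inference, assuming a pre-trained model exported to ONNX and run with onnxruntime on the edge device; the model file name, input shape and single-output assumption are hypothetical, not a reference to any fielded system.

```python
# Minimal sketch: score sensor frames where they are captured, so only
# compact results (not raw imagery) have to cross the network.
# Assumptions: a single-input, single-output ONNX model and NumPy frames.
import numpy as np
import onnxruntime as ort

# Load a compact, pre-trained model onto the edge device once at startup.
session = ort.InferenceSession("threat_classifier.onnx")  # hypothetical model file
input_name = session.get_inputs()[0].name

def classify_frame(frame: np.ndarray) -> np.ndarray:
    """Run inference on one sensor frame (e.g. a single camera image)."""
    batch = frame[np.newaxis, ...].astype(np.float32)  # add a batch dimension
    outputs = session.run(None, {input_name: batch})   # forward pass on-device
    return outputs[0][0]                               # per-class scores for this frame
```

Because scoring happens at the point of collection, only small, structured results need to traverse the network, rather than the raw media itself.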

Today's military must be skilled at building models that can be deployed rapidly, with data scientists and engineers working together to create an environment where rapid innovation can occur.

As more sensors come online and more data becomes available, data quality and fidelity will improve, and different kinds of AI can be used to anticipate the adversary's next moves. Prediction and prescription are game changers that will reduce casualties and help accomplish mission goals.

Sensor fusion and data fusion

Currently, sensor data comes in various types, formats and sizes, including GPS coordinates, low- or high-definition media, and time and date metadata. All of it is collected and stored in data warehouses. Unfortunately, these varied formats are often difficult to mine.
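One common way to make such heterogeneous collections mineable is to normalize every reading into a shared record before it lands in the warehouse. The sketch below shows the idea in Python; the field names, sensor kinds and the from_gps_fix helper are illustrative assumptions, not an actual fielded schema.

```python
# Minimal sketch of a common record that GPS fixes, media captures and
# their metadata can all be normalized into. Field names are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class SensorRecord:
    sensor_id: str                     # which platform produced the reading
    kind: str                          # "gps", "video", "audio", ...
    timestamp: datetime                # normalized to UTC
    lat: Optional[float] = None        # position, if the sensor reports one
    lon: Optional[float] = None
    payload_uri: Optional[str] = None  # pointer to bulk media stored elsewhere
    metadata: dict = field(default_factory=dict)  # resolution, bearing, etc.

def from_gps_fix(sensor_id: str, fix: dict) -> SensorRecord:
    """Convert one device-specific GPS fix into the common record."""
    return SensorRecord(
        sensor_id=sensor_id,
        kind="gps",
        timestamp=datetime.fromtimestamp(fix["epoch_s"], tz=timezone.utc),
        lat=fix["lat"],
        lon=fix["lon"],
        metadata={"hdop": fix.get("hdop")},
    )
```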

Fusing data from multiple sensor collections fills in the gaps left by individual sensors. Metadata adds analysis and context, and the combination delivers insight and actionable intelligence that would not otherwise be possible. Cameras or sensors located near an edge computing device improve situational awareness because inference can take place right where and when warfighters need it.
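A simple way to picture the fusion step is clustering records that are close in both time and space, so that what one sensor misses another can supply. The sketch below assumes the SensorRecord type from the previous example; the 30-second window and 1 km radius are illustrative thresholds, not doctrine or a specific fusion algorithm.

```python
# Minimal sketch of fusing observations from different sensors: group
# records that fall within a short time window and a small spatial radius.
from math import radians, sin, cos, asin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def fuse(records, window_s=30, radius_km=1.0):
    """Cluster SensorRecords that are close in both time and space."""
    clusters = []
    for rec in sorted(records, key=lambda r: r.timestamp):
        for cluster in clusters:
            anchor = cluster[0]
            near_in_time = abs((rec.timestamp - anchor.timestamp).total_seconds()) <= window_s
            near_in_space = (
                rec.lat is not None and anchor.lat is not None
                and distance_km(rec.lat, rec.lon, anchor.lat, anchor.lon) <= radius_km
            )
            if near_in_time and near_in_space:
                cluster.append(rec)
                break
        else:
            clusters.append([rec])
    # Each cluster combines what several sensors saw of the same event.
    return clusters
```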

For example, it is one thing to know that an enemy unit is nearby, but another to know where and when that unit is moving, what weapons it carries, and what can be done to achieve battlefield dominance. With knowledge of enemy units and their behaviors, warfighters gain reliable insight into the biggest and most immediate threats.

The bottom line: by bringing compute power and sensor technology together, warfighters can process, analyze and collaborate to make decisions in real time, taking full advantage of AI and machine learning models.