White Paper

8101.02E_generational_leap_edge_computing_versal_ACAP_08112022

A Generational Leap in Edge Computing with the Versal® ACAP

As sensors collect ever-increasing volumes of data, the expectations on edge processing technology increase. The next generation of processing components must not only be small and rugged, able to function in a fighter jet or on a Humvee, but must also deliver enough processing power to enable sophisticated AI-based applications like image recognition and cognitive EW. Fortunately, a new type of semiconductor architecture has emerged to meet this challenge: the Adaptive Compute Acceleration Platform (ACAP) from Xilinx®. This paper discusses how ACAP processors function, where they fit and why they will drive a generational leap in edge computing capabilities for defense applications.

Victory on the twenty-first-century battlefield now requires processing vast quantities of information in real time. More detailed imagery is needed to enable better decision-making in tactical command centers, radar tracking must be able to monitor more targets across expanded distances, and EW systems will need to deal with an increasingly complex range of waveforms generated by clever adversaries. These advanced applications, and others, are driving sensor and processing innovations.

EMERGING REQUIREMENTS AT THE EDGE

Sensor Data Volumes Are Exploding

Mission-critical defense systems now employ countless sensors. Many collect imaging data, centered on the visual spectrum but extending into the infrared and ultraviolet ranges. Others track electromagnetic communications or radar input spanning traditional frequencies up through millimeter-wave levels (100 MHz to 50 GHz). Still others monitor physical vibrations for sonar or voice recognition. In every one of these areas, applications are demanding expanded capability: more detailed images, the ability to track more targets and a comprehensive view of the radio spectrum. To meet these requirements, new sensors are operating with greater precision (bit depth) and speed (data rate), generating an expanded data stream from each node. When that effect is combined with a continual increase in the number of deployed sensors, the result is a geometric increase in sensor data volumes, to the point where some individual systems must deal with terabytes per second of input.

Response Times Are Shrinking

In parallel with exploding data volumes, today's real-time response requirements are increasingly tight. In some cases, application latency must shrink to deal with new threats like hypersonic missiles, which move faster than ballistic missiles and can adjust their trajectories. In other cases, shorter response times are dictated by the increasing sophistication of our adversaries as they field systems with stealthy, high-frequency radars using pulse widths lasting mere nanoseconds.

Sophisticated Applications Now Use Multiple Sensors

Further requirements are generated by the increasing use of sensor fusion in defense applications. These applications combine data from multiple sensor types in real time to achieve a more detailed view of a platform's surrounding environment or more accurate detection of targets of interest. Examples include autonomous vehicle control, which uses input from optical cameras, lidar and road-surface sensors, and imaging systems that fuse data from optical, infrared, lidar and radar sensors to create clear views under any conditions.
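To make the data-volume discussion above concrete, the short Python sketch below estimates the raw throughput of a hypothetical fused-sensor platform. The sample rates, bit depths and channel counts are illustrative assumptions, not figures from this paper; they simply show how quickly several high-rate streams aggregate once multiplied across a deployed sensor population.

# Back-of-envelope estimate of raw sensor throughput.
# All sensor figures below are illustrative assumptions, not values from this paper.

def stream_rate_gbps(sample_rate_hz: float, bits_per_sample: int, channels: int) -> float:
    """Raw data rate of one sensor stream, in gigabits per second."""
    return sample_rate_hz * bits_per_sample * channels / 1e9

# Hypothetical sensor suite on a single platform
sensors = {
    "4K optical camera, 60 fps, 24-bit color":   stream_rate_gbps(3840 * 2160 * 60, 24, 1),
    "wideband RF digitizer, 2 GS/s, 14-bit I/Q": stream_rate_gbps(2e9, 14, 2),
    "lidar, 2.4 M points/s, 48 bits per point":  stream_rate_gbps(2.4e6, 48, 1),
}

total_gbps = sum(sensors.values())
for name, rate in sensors.items():
    print(f"{name}: {rate:6.1f} Gb/s")
print(f"aggregate raw input: {total_gbps:.1f} Gb/s (~{total_gbps / 8:.2f} GB/s) per node, before processing")

Under these assumptions a single node already produces roughly 70 Gb/s of raw data; multiplying across dozens or hundreds of such streams is what drives system-level input toward the terabyte-per-second regime described above.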
