White Paper

Enabling big data processing and AI-powered everything, everywhere


TECHNOLOGY INNOVATION

Since the early 1990s and Secretary of Defense William Perry's watershed commercial off-the-shelf (COTS) initiative, the Department of Defense (DoD) has sought to make the best commercial technology ready for defense and aerospace applications. For computing, this has meant miniaturizing processing systems and making them rugged for deployment, deterministic for safe effector control, secure for IP protection, and demonstrably trusted for vulnerability mitigation. Modern aerospace and defense electronics manufacturers have evolved into efficient enterprises that use a commercial approach to run and invest in their businesses, enabling them to access the best commercial technology and make it ready for deployment within aerospace and defense platforms. The same approach is now enabling the data center to migrate across smarter fog and edge layers.

The F-35 is an information fusion engine with massive on-platform processing.

Pioneered by Mercury Systems, the contemporary approach to migrating the data center uses a hybrid commercial business model called the next-generation defense electronics business model (N-DEB). The model uses sustained, focused internal research and development (IRAD) investments to create the technologies needed to infuse the levels of environmental protection, miniaturization, cooling, security and determinism required for deployment in modern aerospace and defense processing applications. The resulting, and now proven, technologies are enabling the most powerful data center processors to be embedded into physically constrained edge applications located in the harshest environments, opening the door to fog and edge computing everywhere.

DEPLOYING INTEL® XEON® SCALABLE PROCESSORS AND NVIDIA GPUS ACROSS THE FOG AND EDGE LAYERS

Intel's Xeon Scalable processors, with on-die AI accelerators, and NVIDIA GPUs are the gold standard in big data and AI processing (a short software sketch at the end of this section illustrates how a workload can target either device). They, and other powerful processors, require advanced packaging, miniaturization, cooling and unrestricted I/O pipes to be successfully deployed at and near the edge. Mercury has developed proven capabilities that efficiently address these deployment requirements:

Rugged packaging – The most powerful processors and GPUs are designed for use in benign data center environments. Each is retained to its mounting substrate via the processor's retaining clips or via a PCIe line card. Although easy to implement, this approach is physically weak; little disturbance is needed to disrupt I/O connectivity and introduce errors. Mercury does not use a retaining clip. Instead, the thousands of I/O connections are reflowed (soldered) to the VPX PCB and the processor is underfilled with epoxy. This enables the resulting package to withstand the harshest environments and conditions.

Intel Xeon Scalable processors and NVIDIA GPUs in various packaging options, including rugged, composable rackmount, small form factor, and OpenVPX and custom modules.

Miniaturization – System-in-package, wafer-stacking, 2.5D and 3D fabrication techniques are used to shrink the data center server from a 19-inch footprint into other form factors with varying degrees of size, weight and power (SWaP) performance. Form factors vary from rugged, reduced-profile composable servers used for fog and edge processing to dense, defense-grade OpenVPX boards (OpenVPX is the de facto open system architecture for embedded computing), with the latter representing over a 90% reduction in server volume.
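
As a simple illustration of the heterogeneous compute described above, the sketch below shows how an edge inference workload might select an NVIDIA GPU when one is present and otherwise fall back to the host processor, whose vector and matrix instructions the underlying math libraries can exploit on the CPU path. This is a minimal sketch, not Mercury's software; it assumes a PyTorch environment, and the model and tensor sizes are placeholders.

    # Minimal, illustrative sketch: choose a compute device at run time on edge
    # hardware that may or may not carry an NVIDIA GPU.
    # Assumes PyTorch is installed; the model and batch sizes are placeholders.
    import torch

    def pick_device() -> torch.device:
        """Prefer an NVIDIA GPU when present; otherwise fall back to the host CPU."""
        return torch.device("cuda") if torch.cuda.is_available() else torch.device("cpu")

    device = pick_device()
    model = torch.nn.Linear(1024, 256).to(device).eval()  # stand-in for a real AI model
    with torch.no_grad():
        batch = torch.randn(32, 1024, device=device)       # stand-in for sensor data
        output = model(batch)
    print(f"Inference ran on {device}; output shape {tuple(output.shape)}")

Written this way, the same application can follow the workload as it migrates between data center, fog and edge nodes, with the silicon available on each node determining where the math runs.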
