White Paper

Embedded Cognitive Computing and Artificial Intelligence for Military Applications (part 1)

Converging capabilities created AI

For decades, multiple dominant digital technology trends have been converging, and they have now come to an apex. The continuing validity of Moore's Law, big data and cloud processing, the internet of things, and the technology achievements coming from the development of smart, autonomous vehicles have collectively converged to produce the environment for AI and machine-enabled computing to come of age.

Moore's law and algorithms

Gordon Moore, Intel's co-founder, noticed a trend in the number of transistors per processor during the early years of digital computing. He observed that the number of transistors in each new generation of processing devices was doubling each year (an interval later extended to two years). When a technology grows at greater than a linear pace, it invariably indicates that something noteworthy is happening. While the trend was a point-in-time comment made by Moore, others continued to track his observation, naming the principle after him: Moore's Law.

Historically, scientific and financial algorithms, and big data processing problems in general, have always needed more processing capability than was available. The algorithms that seem most useful always need tomorrow's compute power to deliver results within a reasonable amount of time. Recently, the hardware tortoise has begun to catch the software hare. The latest version of Microsoft Office can run on an older PC, a 10-year-old 256GB hard drive supports contemporary cloud-based applications, and there are more cores on a chip than most applications can keep busy. Offline processing tasks are becoming real-time, and image processing is becoming video processing. Algorithms and programming techniques that were once too inefficient to run on the hardware of their day in a reasonable amount of time now run well on current hardware. Today's AI algorithms have intersected with the processing capability predicted by Moore's Law and are achieving "meaningful results in a useful time".

The cloud computing mega-trend

AI and machine learning have overtaken big data and cloud computing and storage as the buzzwords in Silicon Valley. As storage volume and performance have increased and the associated costs have decreased, all types of organizations are capturing as much data as they can, about everything. All are aware that deep inside that data is information about why their clients are buying things (or not), why their employees are happy (or not), and what information people are viewing on web pages; data about virtually everything is being collected and stored.

[Sidebar: Swarming "drones". Unmanned air, land and sea platforms will increasingly use on-platform and big-data-derived AI capabilities to execute autonomous missions collectively and independently of direct human oversight.]

Cloud companies including Dropbox, Microsoft Azure, Amazon AWS and others offer places to store data safely offsite, data that is accessible from anywhere in the world. These data center-powered clouds offer virtual big data processing capabilities to deliver high-performance analytics with ease. The ability for so many to access these kinds of computing and storage capabilities was unimaginable a few years ago. Today, it is available to all with a low-cost subscription. This has encouraged the obsession with keeping any data that could be useful, even if in most cases no one knows exactly what to do with it.
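The doubling observation discussed in the Moore's Law section above is easy to put in concrete terms. The short sketch below (an illustration added here, not taken from the paper; the starting transistor count and the time spans are arbitrary assumptions) shows how quickly a fixed two-year doubling period outruns linear growth:

```python
# Rough sketch of the growth implied by Moore's Law: the transistor count
# doubles every two years, i.e. N(t) = N0 * 2**(t / 2).
# The starting budget of 1 million transistors is an arbitrary assumption,
# used only to contrast exponential with linear growth.

def transistor_count(years_elapsed, start=1_000_000, doubling_period=2.0):
    """Transistor count after `years_elapsed` years under a fixed doubling period."""
    return start * 2 ** (years_elapsed / doubling_period)

for years in (0, 2, 10, 20):
    doubling = transistor_count(years)
    linear = 1_000_000 * (1 + years)  # same starting point, growing linearly, for comparison
    print(f"after {years:2d} years: doubling -> {doubling:>15,.0f}   linear -> {linear:>12,.0f}")
```

After 20 years the doubling rule yields roughly a thousandfold increase, while linear growth yields only about twentyfold; this gap is what lets yesterday's "too inefficient" algorithms run comfortably on today's hardware.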
So far, the result has largely been the creation of more bits than anyone can make bytes of. In 2012, there was approximately 2.8 zettabytes (ZB) of global data, with less than 1 percent of it actually being subject to any kind of analysis. By 2020, the amount of stored data is expected to reach 44ZB. Revelations are patiently waiting in this stored data for future AI applications to unlock and exploit.

[Figure: Comparison of IoT, internet of everything, machine-to-machine and other ecosystems, mapped by scope (what is being altered by the concept) and reach (who/what is impacted by the concept) — ranging from machines and objects/devices to people and the world, and from the virtual world to the physical world. Definitions given in the figure include:
- Industry 4.0: goes beyond connectivity, taking the idea of the industrial internet further to a "computerization of the manufacturing industry"; includes concepts such as 3D printing and augmented reality.
- Internet of Everything (IoE): "Bringing together people, process, data, and things to make networked connections more relevant and valuable than ever before."
- Internet of Things (IoT): "Physical objects are linked through wired and wireless networks."
- Industrial Internet: "Integration of complex physical machinery with networked sensors and software."
- Machine-to-machine (M2M): "Technologies that allow both wireless and wired systems to communicate with other devices of the same type."
- Web of Things: "A set of software architectural styles and programming patterns that allow real-world objects to be part of the world wide web."]
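As a back-of-the-envelope illustration (added here, not part of the paper), the 2.8 ZB (2012) and 44 ZB (2020) figures quoted above imply a compound annual growth rate that can be computed directly:

```python
# Implied compound annual growth rate (CAGR) of stored data, using the
# figures quoted above: roughly 2.8 ZB in 2012 growing to 44 ZB by 2020.

start_zb, end_zb = 2.8, 44.0
years = 2020 - 2012

cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"Implied growth rate: {cagr:.1%} per year")  # about 41% per year

# Sanity check: compound the 2012 volume forward to 2020.
volume = start_zb
for _ in range(years):
    volume *= 1 + cagr
print(f"Projected 2020 volume: {volume:.1f} ZB")  # ~44 ZB
```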
