In this week’s Voices of the Industry column, Marc Cram, Director of Sales for Server Technology, explores applications for the evolving world of AI, including a variety of software tools designed to find hidden patterns and correlations between elements of large data sets.
You may have heard the term “artificial intelligence” mentioned in the press in the past year. It has been a topic of much conversation regarding its potential impact on both the job market and the people who are seeking employment. When you hear the term AI, do you think of HAL from 2001: A Space Odyssey? Or maybe the Cyberdyne Systems Model 101 Series 800 (the Terminator)? Or does something more benign, like Apple’s Siri voice assistant, come to mind?
According to dictionary.com, one definition for AI is “the capacity of a computer to perform operations analogous to learning and decision making in humans, as by an expert system, a program for CAD or CAM, or a program for the perception and recognition of shapes in computer vision systems.”
Originally proposed as an “Imitation Game,” the Turing Test, as described by Alan Turing, is a test of a machine’s ability to exhibit intelligent behavior through conversation equivalent to, or indistinguishable from, that of a human. In the game, a human being and a computer are interrogated under conditions where the interrogator does not know which is which, the communications being entirely by textual messages. Turing argued that if the interrogator could not distinguish between them by questioning, then it would be unreasonable not to call the computer intelligent, because we judge other people’s intelligence from external observation in just this way.
By 2020, 30% of data centers that fail to implement AI and machine learning will cease to be operationally & economically viable. — Gartner, Dec 2017
Commonly used terms for describing various types of artificial intelligence include machine learning (both supervised and unsupervised), expert systems, knowledge-based systems, neural networks, fuzzy logic, genetic algorithms, case-based reasoning, natural-language processing (NLP), and intelligent agents. High profile applications of AI include Siri, Google Now, Alexa, Cortana, driverless vehicles, and a wide variety of software tools designed to find hidden patterns and correlations between elements of large data sets.
In the near term, a lot of new infrastructure in the data center is being devoted to enabling AI software applications to run. NVIDIA GPUs are powering many of these installations, but there are numerous other platforms coming to challenge NVIDIA’s supremacy in the field. At the August 2017 Hot Chips conference, Amazon, Baidu, and Microsoft all detailed projects that utilize Field Programmable Gate Arrays (FPGAs) as computational accelerators for machine learning applications. Baidu detailed the XPU, which targets compute-intensive, rule-based workloads. Microsoft’s Project Brainwave, a scalable acceleration platform for deep learning, provides real-time responses for cloud-based AI services. Amazon announced a Xilinx-based FPGA application that is accessible through EC2. For its part, Intel acquired Altera, an FPGA manufacturer that competes with Xilinx, and is touting its FPGAs as well-suited for AI applications.
In 2016, IDC forecast that by 2020, cognitive systems and artificial intelligence would be adopted across a broad range of industries, driving worldwide revenues from about $8B in 2016 to more than $47B in 2020, for a CAGR of 55.1%. Hearing the siren song of opportunity, numerous venture capital funds have invested in countless AI-related startups. Companies worldwide have begun embedding and deploying AI into almost every kind of enterprise application or process. Google turned its DeepMind AI inward to look at its own data center operations and reduced its cooling bill by 40%. Google says that customers running workloads in its cloud will improve their own energy efficiency as a result.
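As a quick sanity check on IDC’s growth figure, the compound annual growth rate implied by the rounded revenue numbers above can be computed directly. This is a minimal sketch using the rounded $8B and $47B figures; the result (roughly 55.7%) differs slightly from IDC’s published 55.1% because IDC worked from unrounded revenue data.

```python
# CAGR implied by IDC's forecast: roughly $8B in 2016 growing to
# more than $47B in 2020, i.e. four years of compound growth.
start, end, years = 8.0, 47.0, 4

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # about 55.7% with these rounded inputs
```

The small gap between this back-of-the-envelope result and the quoted 55.1% is expected; the formula simply asks what constant annual growth rate turns the starting revenue into the ending revenue over the stated period.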
Future applications for AI may include processing IoT data for smart cities, providing first- and second-level technical support for products, research and development of new genetic therapies, supervising the elderly who live at home alone, and lowering the cost of insurance by more accurately diagnosing illnesses and their causes. The future of AI is bright indeed.