The common perception of Artificial Intelligence (AI) is that of super-smart humanoid robots. The reality, though, is a little less dramatic. Most of the AI applied to real-life problems today consists of smart algorithms that run in the background, processing data sets automatically and quietly exerting an ever-greater influence on our daily lives. Speech recognition, image recognition, behavioral-pattern analysis (e.g. for fraud detection), targeted advertising, shopping recommendations, and algorithmic trading are a few examples of applied AI. In recent times we have witnessed an AI boom led primarily by the internet giants: Google, Facebook, Amazon, Microsoft, Baidu and others. A resurgence of startups has also been seen in the AI space.
AI and automation featured prominently on Gartner's 2015 Hype Cycle, with autonomous vehicles placed at the "Peak of Inflated Expectations". Gartner also defined "autonomous" as the sixth and final stage of an organization's journey towards becoming a truly "digital enterprise", and highlighted a range of emerging technologies. Some of the key ones related to AI are: Autonomous Vehicles, Bioacoustic Sensing, Brain-Computer Interface, Digital Dexterity, Human Augmentation, Machine Learning, Neurobusiness, People-Literate Technology, Quantum Computing, Smart Advisors, Smart Dust, Smart Robots, Virtual Personal Assistants, Virtual Reality, and Volumetric and Holographic Displays. Some of these technologies already exist at various stages of maturity; some are in research labs, and some are still being conceptualized. But progress is certainly happening, and it is just a matter of time before all of them become reality.
Historically, hype has been followed by disillusionment, and this has happened to AI before. So what has changed to drive the resurgence of AI, and what gives us confidence that this time the hype has a good probability of being realized? Major improvements and innovations in the enabling technologies are the main reasons, and economic pressure is also a major driver for the acceptance of more automation. In this article, we will look into the technological factors.
The last AI hype failed due to the non-availability of three key ingredients for its success:
- Processing power
- Storage capacity
- Availability of data
Processing Power: Processing power has so far followed Moore's Law. Although the number of transistors per chip is still rising (with transistor feature sizes currently at 14nm), heat and power consumption have inhibited further improvement in clock speed and single-threaded performance. Intel's approach was to put multiple CPU cores on a chip, a design principle taken further by the specialized, massively parallel graphics processors (GPUs) designed by Nvidia. It turned out that GPU acceleration is highly suitable for training neural networks with machine learning techniques such as deep learning. This is because the sample images required to train a neural network for classification problems can be represented as matrices and hence processed in parallel across the many cores of a GPU. A cluster of GPUs increases the available computing power, and hence training performance, massively; the lack of such computing power was a big bottleneck for the earlier generation of AI, particularly machine learning with neural networks. The significant improvement and sophistication of the associated toolchains has also made development easier. For example, the Nvidia Tegra K1 SoC contains a 192-CUDA-core Kepler GPU and a '4-plus-1' quad-core ARM Cortex-A15 CPU, and comes with a library of primitives for deep learning neural networks known as cuDNN.
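To make the matrix argument concrete, here is a minimal sketch (using NumPy on the CPU, with illustrative sizes, not tied to any particular framework): the core of a neural-network forward pass over a batch of images is a single matrix multiplication, and it is exactly this kind of operation that a GPU's many cores can compute in parallel, one output element per core.

```python
import numpy as np

rng = np.random.default_rng(0)

batch = rng.standard_normal((64, 784))    # 64 sample images, each a flattened 28x28 grid
weights = rng.standard_normal((784, 10))  # one weight column per output class
bias = np.zeros(10)

# One matrix multiply computes the class scores for all 64 images at once.
# On a GPU, each of the 64 x 10 output elements can be computed by a
# separate core, which is why batched training maps so well to GPUs.
scores = batch @ weights + bias
print(scores.shape)  # → (64, 10)
```

Libraries such as cuDNN provide exactly these batched matrix and convolution primitives, executed on the GPU instead of the CPU.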
The most ambitious neural network project is IBM's SyNAPSE chip. It is a departure from the traditional "von Neumann" machine architecture, and is based on the principle of emulating the human brain in hardware. The project was named "TrueNorth". Built from 5.4 billion transistors at a 28nm feature size, the SyNAPSE chip has 4096 cores organized into a 64-by-64 array connected by a mesh network. It contains 1 million programmable neurons and 256 million programmable synapses. Computation, memory, and communication are tightly integrated, and the chip operates on an event-driven instruction set. It consumes far less power than conventional von Neumann designs: IBM claims that the current generation of SyNAPSE consumes only 70mW in real-time operation. According to Dr. Dharmendra Modha, IBM's Chief Scientist for Brain-inspired Computing and the head of project TrueNorth, IBM's goal is to build a neurosynaptic chip with 10 billion neurons and 100 trillion synapses.
Storage Capacity: The availability of plentiful cheap storage is another driver of AI's success. The equivalent of Moore's Law in storage is known as "Kryder's Law". It postulated, in 2005, that the areal density of hard disk drives would more than double every two years. So far the postulate has largely held true, and as a result vast amounts of storage are now available at low cost.
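The compounding effect of such a doubling rule is easy to underestimate. A back-of-the-envelope sketch of Kryder's Law as stated above (the starting density and the units are purely illustrative, not historical data):

```python
def projected_density(start_density, years, doubling_period=2):
    """Density after `years`, assuming a doubling every `doubling_period` years."""
    return start_density * 2 ** (years / doubling_period)

# A density of 100 (arbitrary units) in 2005 doubles five times in a
# decade: 100 * 2**5 = 3200 units by 2015, a 32x increase.
print(projected_density(100, 10))  # → 3200.0
```

Sustained over a decade, even "doubling every two years" multiplies capacity by more than thirty, which is why the cost per gigabyte has fallen so dramatically.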
Availability of Data: One of the key requirements for an AI algorithm to perform well is the ability to learn from existing data; machine learning is the branch of AI that studies this. Machine learning algorithms (both supervised and unsupervised) require data, and lots of it, to reduce their prediction or classification error and thereby learn to correctly predict and/or classify unseen data. With the penetration of internet-enabled services and the huge adoption of cloud services, the amount of available data has increased exponentially, and it is this data that AI algorithms use to perform their tasks successfully, e.g. to train deep neural networks.
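The "learn from data, then handle unseen data" loop can be sketched in a few lines. Below is a minimal supervised-learning example (the toy data and the perceptron algorithm are illustrative choices, not taken from the article): a perceptron fits a linear decision rule to labelled examples, then classifies points it has never seen.

```python
# Toy training set: two clusters of labelled 2-D points.
X = [(0, 0), (0, 1), (1, 0), (3, 3), (3, 4), (4, 3)]  # training inputs
y = [0, 0, 0, 1, 1, 1]                                # training labels

w = [0.0, 0.0]  # weights, one per input dimension
b = 0.0         # bias term

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for _ in range(10):                    # a few passes over the training data
    for (x1, x2), label in zip(X, y):
        error = label - predict((x1, x2))
        w[0] += error * x1             # classic perceptron update: nudge the
        w[1] += error * x2             # weights toward each misclassified
        b += error                     # example until the rule separates them

print(predict((5, 5)))    # → 1: an unseen point near the '1' cluster
print(predict((-1, -1)))  # → 0: an unseen point near the '0' cluster
```

With only six examples the learned boundary is crude; the point of "big data" for machine learning is that the same principle, scaled to millions of examples and millions of parameters, is what makes deep neural networks generalize well.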
So, having established that this generation of AI is here to stay and will change our lives, let us look at a few business areas where AI is likely to have a significant impact. The McKinsey report published in April 2017, titled "Smartening up with Artificial Intelligence (AI)", made the following predictions:
Products and Services
- Highly autonomous vehicles are expected to make up 10 to 15% of global car sales in 2030, with double-digit annual growth rates expected by 2040. The efficient, reliable, and integrated data processing that these cars require can only be realized with AI.
- Predictive maintenance enhanced by AI allows for better prediction and avoidance of machine failure by combining data from advanced Internet of Things (IoT) sensors and maintenance logs as well as external sources. Asset productivity increases of up to 20% are possible, and overall maintenance costs may be reduced by up to 10%.
- Collaborative and context-aware robots will improve production throughput through AI-enabled human-machine interaction in labor-intensive settings. Productivity increases of up to 20% are feasible for certain tasks, even when those tasks are not fully automatable.
- Yield enhancement in manufacturing powered by AI will result in decreased scrap rates and testing costs by linking thousands of variables across machinery groups and sub-processes. For example, in the semiconductor industry, the use of AI can reduce yield detraction by up to 30%.
- Automated quality testing can be realized using AI. By employing advanced image recognition techniques for visual inspection and fault detection, productivity increases of up to 50% are possible. Specifically, AI-based visual inspection based on image recognition may increase defect detection rates by up to 90% as compared to human inspection.
- AI-enhanced supply chain management greatly improves forecasting accuracy while simultaneously increasing granularity and optimizing stock replenishment. Reductions between 20 and 50% in forecasting errors are feasible. Lost sales due to products not being available can be reduced by up to 65% and inventory reductions of 20 to 50% are achievable.
- The application of machine learning to enable high-performance R&D projects has large potential. R&D cost reductions of 10 to 15% and time-to-market improvements of up to 10% are expected.
- Business support function automation will ensure improvements in both process quality and efficiency. Automation rates of 30% are possible across functions. For the specific example of IT service desks, automation rates of 90% are expected.
AI, primarily in the form of deep learning running on powerful GPUs, has already been adopted by enterprises. Moving forward, enterprises will become more data-driven, and demand will force the algorithms to become ever more sophisticated.
Opinion is currently divided about the impact AI will have on jobs, the economy, and workplace culture. What is certain is that AI will continue to develop at a rapid pace. The possible economic and social fallout needs to be addressed by political and governmental strategies.