Importance of Artificial Intelligence Chips
Artificial intelligence (AI) is making its way into almost every sphere, and the day is not far when the technology will be ubiquitous. AI finds application in medical diagnosis, surgery, self-driving cars, smart homes and gadgets, criminal identification, trading decisions, drug discovery, water-quality monitoring, and much more. This rapid evolution brings greater opportunities for semiconductor companies to create AI chips that can operate in a variety of systems. AI chips are specially fabricated silicon chips that use machine learning and deep learning algorithms to perform computations. Common AI-optimized chipset architectures include the GPU (Graphics Processing Unit), ASIC (Application-Specific Integrated Circuit), NNPU (Neural Network Processing Unit), FPGA (Field-Programmable Gate Array), and CPU (Central Processing Unit). AI chips are used because they meet the demands of machine learning (ML) – a branch of AI that gives systems the ability to learn from data and experience. These chips can perform the relevant calculations much faster than regular CPUs, enabling faster and better-informed decisions.
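To make the workload concrete: the calculations these chips accelerate are dominated by large matrix arithmetic, since a single neural-network layer boils down to a matrix-vector multiply followed by a nonlinearity. Below is a minimal pure-Python sketch of one such layer; the weights, bias, and input values are purely illustrative, not drawn from any real model.

```python
def relu(x):
    """Rectified linear unit, a common neural-network nonlinearity."""
    return x if x > 0 else 0.0

def dense_layer(weights, bias, inputs):
    """One fully connected layer: out[i] = relu(weights[i] . inputs + bias[i])."""
    return [
        relu(sum(w * x for w, x in zip(row, inputs)) + b)
        for row, b in zip(weights, bias)
    ]

# Illustrative 2x2 layer; real models repeat this with millions of weights,
# which is why dedicated hardware for parallel multiply-accumulate pays off.
W = [[0.5, -1.0], [2.0, 0.25]]
b = [0.1, -0.5]
x = [1.0, 2.0]

print(dense_layer(W, b, x))  # prints [0.0, 2.0]
```

An AI chip's advantage comes from executing thousands of these multiply-accumulate operations in parallel, where a general-purpose CPU works through them largely sequentially.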
Given the benefits of AI chips, there are significant growth opportunities for players in the ecosystem and the market. Factors driving the AI chip market include the need for more efficient systems to solve computational problems, increased demand for smart homes and smart cities, rising investments in AI startups, and the emergence of quantum computing and autonomous robotics. As per a report by Allied Market Research, the global artificial intelligence chip market garnered $4,515 million in 2017 and is projected to reach $91,185 million by 2025, growing at a CAGR of 45.4% from 2018 to 2025.
Over the years, a slew of chip companies have been focusing on AI investments and developing new AI chips to make their mark in this competitive industry. They have made huge investments in AI through acquisitions, product launches, and the formation of startups over the last few years, along with carrying out their own research and development. For instance, Alibaba Group, the Chinese e-commerce giant, recently announced plans to launch its own AI chip in the second half of 2019 to support its cloud and Internet of Things (IoT) businesses. The Chinese networking and telecommunications company Huawei Technologies Co., Ltd. announced two AI-optimized chips as it looks to reduce its reliance on foreign microprocessors. The new chips, namely the Ascend 910 and Ascend 310, are designed for public and private cloud services, IoT devices, edge computing, and consumer devices. Google, an American technology company, launched the third version of its AI chips to enhance artificial intelligence-based applications. And Alibaba completed its acquisition of the Chinese AI chipmaker C-SKY in April 2018.
AI Chip by Alibaba Group: The new AI chip, called AliNPU, is set to be released by Alibaba in 2019. The solution is said to have the potential to support technologies used in self-driving, smart cities, and smart logistics, and is designed to perform AI-specific jobs such as image and video analysis. The company also established a new semiconductor subsidiary called Pingtouge, which deals with customized AI chips and embedded processors. These developments would help the company meet its goals of expanding its cloud and Internet of Things (IoT) businesses and facilitate the development of applications suited to the industry. Alibaba CTO Jeff Zhang said, “We are confident our advantages in the algorithm, data intelligence, computing power, and domain knowledge on the back of Alibaba’s diverse ecosystem will put us at a unique position to lead real technology breakthroughs in disruptive areas, such as quantum and chip technology.”
Huawei Strengthens its AI Portfolio: In October ’18, two AI-powered chips, namely the Ascend 310 and Ascend 910, were introduced by the major Chinese phone manufacturer Huawei at Huawei Connect 2018 in Shanghai. The move aims to extend the company’s existing business, lessen its dependence on foreign AI chipset manufacturers, and drive its growth in the coming years. The Ascend 910 offers the greatest computing density in a single chip and is expected to be used in data centers where huge volumes of data are processed. Supporting pervasive intelligence for a fully connected and intelligent world, the Ascend 310 is expected to be used in internet-connected devices such as AI-enabled smartwatches and smartphones.
Alibaba Scoops Up C-SKY Microsystems: Alibaba acquired Hangzhou C-SKY Microsystems, a leading Chinese supplier of embedded CPU cores, to enhance its cloud-based IoT business. The acquisition is a crucial step for Alibaba as it looks to empower various industries via cloud-based IoT solutions, where chips play a vital role. According to Alibaba, the purchase of C-SKY underlines its goal of fueling the growth of the chip market.
Google Unveils Third Version of its AI Chips: In May ’18, Google launched the third generation of its special-purpose chips for artificial intelligence, the Tensor Processing Unit (TPU). Installed in its data centers and increasingly available through its cloud, the new chips can be aggregated into machine learning pods delivering over one hundred thousand trillion floating-point operations per second (100 petaflops), well over eight times the power of the earlier pod. The chips represent an alternative to Nvidia’s graphics processing units and enable Google to improve its AI-based tasks, including recognizing words in audio recordings, locating objects in images and videos, and picking up underlying emotions in written text. According to Sundar Pichai, Chief Executive Officer (CEO) at Google, this enormous computing capability is achieved when large clusters of third-generation TPUs, or pods, are employed. The third-generation chips also generate an abundance of heat during training and inference. For this reason, Google introduced liquid cooling to keep the TPUs from overheating: hoses on the board carry water directly to the accelerators, whereas earlier generations had heat whisked away by towering heat sinks. According to the company, this is its first use of liquid cooling in its data centers.
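The "over eight times" figure can be sanity-checked with simple arithmetic. Note that the ~11.5-petaflop figure used below for the previous-generation (TPU v2) pod is an assumption drawn from Google's published specifications, not something stated in this article.

```python
# Back-of-the-envelope check of the pod speed-up quoted above.
tpu_v3_pod_pflops = 100.0  # third-generation pod peak, per the article
tpu_v2_pod_pflops = 11.5   # assumed prior-generation (TPU v2) pod peak

speedup = tpu_v3_pod_pflops / tpu_v2_pod_pflops
print(f"{speedup:.1f}x")   # roughly 8.7x, consistent with "well over eight times"
```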
Also Read: Applications of Machine Learning