AI Chips Power into the Data Center, Client Devices
MOUNTAIN VIEW, Calif., Oct. 17, 2017 (GLOBE NEWSWIRE) -- Artificial intelligence (AI) processors, an essential technology for self-driving cars, are emerging as a new technology driver in data centers, client devices such as smartphones, and embedded (IoT) systems. A new report from The Linley Group, “A Guide to Processors for Deep Learning,” analyzes deep-learning accelerators and IP cores for artificial intelligence, neural networks, and vision processing for inference and training.
The AI chip market has seen rapid change in recent years as traditional CPUs give way to more specialized hardware architectures that deliver the performance the latest deep-learning applications require. The report highlights the battle for market dominance between Nvidia and Intel, both of which are investing heavily in new processors for this market, focusing on the data center and autonomous cars. The report also analyzes licensable IP cores for ASIC and SoC development as well as FPGAs for deep learning. It covers a number of startups, some well funded, that are developing new, more customized architectures to support deep learning. The report includes a market forecast for automotive, data-center, and client adoption of deep-learning accelerators.
“Deep-learning applications have required the development of far more powerful processors,” said Linley Gwennap, principal analyst with The Linley Group. “Even the fastest CPUs weren’t up to the challenge of running highly complex neural networks. GPU vendors are now competing against new hardware approaches involving DSPs, FPGAs, and dedicated ASICs. And we’re finding that some data-center operators such as Google and Microsoft are developing their own hardware accelerators.”
The report provides detailed technical coverage of announced deep-learning chip products from AMD, Intel (including former Altera, Mobileye, Movidius, and Nervana technologies), NXP, Nvidia (including Tegra and Tesla), Qualcomm, Wave Computing, and Xilinx. It also covers IP cores from AImotive, ARM, Cadence, Ceva, Imagination, Synopsys, and VeriSilicon. A special chapter covers Google’s TPU and TPU2 ASICs. Technical comparisons are made in each product category along with analysis and conclusions about this emerging market.
Availability
“A Guide to Processors for Deep Learning” is available now directly from The Linley Group. For further details, including pricing, visit http://www.linleygroup.com/report_detail.php?num=65
About The Linley Group
The Linley Group is the industry's leading source for independent technology analysis of semiconductors for networking, communications, mobile, and data-center applications. The company provides strategic consulting services, in-depth analytical reports, and conferences focused on advanced technologies for chip and system design. The Linley Group also publishes the weekly Microprocessor Report. For insights on recent industry news, subscribe to the company's free email newsletter: Linley Newsletter.
Company Contact:
Linley Gwennap
Principal Analyst
The Linley Group
650-962-9380
www.linleygroup.com