INTEL THROWS DOWN AI GAUNTLET WITH NEURAL NETWORK CHIPS

(@flashgordon)

https://www.nextplatform.com/2019/11/13/intel-throws-down-ai-gauntlet-with-neural-network-chips/

"At this year’s Intel AI Summit, the chipmaker demonstrated its first-generation Neural Network Processors (NNP): NNP-T for training and NNP-I for inference. Both product lines are now in production and are being delivered to initial customers, two of which are Facebook and Baidu."

"The purpose-built NNP devices represent Intel’s deepest thrust into the AI market thus far, challenging Nvidia, AMD, and an array of startups aimed at customers who are deploying specialized silicon for artificial intelligence."

"Rao told the audience that although the AI market is not monolithic, requiring a variety of solutions based on different performance requirements and business demands, there is also a critical need for purpose-built AI processors at the high-end. . . . Rao points to the increasing complexity of neural network models, which, based on the number of parameters, is growing about 10X per year."
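The ~10X-per-year growth rate Rao cites compounds quickly; a back-of-the-envelope sketch (the starting point and horizon below are illustrative assumptions, not figures from the article):

```python
# Illustrative projection of ~10x/year growth in neural network
# parameter counts, starting from a hypothetical 100-billion-parameter
# model (the scale Rao mentions elsewhere in the talk).
base_params = 100e9  # assumed ~100B-parameter starting point
projection = [base_params * 10**year for year in range(4)]
for year, params in enumerate(projection):
    print(f"year {year}: ~{params:.0e} parameters")
```

At that rate, a model one hundred times larger than today's biggest arrives in only two years, which is the pressure behind purpose-built high-end AI silicon.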

"The inference line, NNP-I, was not even envisioned three years ago . . . Now, of course, inference is universally recognized as a distinct type of workload, with its own special needs for low latency, power efficiency, and specialized math."

"The first-generation inference processor, the NNP-I 1000, is implemented in Intel’s 10 nanometer process and, depending on the SKU, draws between 10 watts and 50 watts. It comprises 12 inference compute engines and two IA CPU cores hooked together with a cache coherent interconnect. It can perform mixed precision math, with a special emphasis on low-precision computations using INT8."
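To illustrate what INT8 inference math means in practice, here is a minimal NumPy sketch of symmetric int8 quantization around a matrix multiply. This is purely illustrative, not Intel's NNP-I implementation: operands are scaled into the int8 range, multiplied with 32-bit integer accumulation, then dequantized back to float.

```python
import numpy as np

def quantize(x):
    """Symmetric quantization: map a float array to int8 plus a scale."""
    scale = np.max(np.abs(x)) / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def int8_matmul(a, b):
    """Quantize both operands, multiply with int32 accumulation, dequantize."""
    qa, sa = quantize(a)
    qb, sb = quantize(b)
    acc = qa.astype(np.int32) @ qb.astype(np.int32)  # integer accumulate
    return acc.astype(np.float32) * (sa * sb)

rng = np.random.default_rng(0)
a = rng.standard_normal((4, 8)).astype(np.float32)
b = rng.standard_normal((8, 3)).astype(np.float32)

exact = a @ b
approx = int8_matmul(a, b)
print(np.max(np.abs(exact - approx)))  # small quantization error
```

The appeal for inference hardware is that the inner loop runs entirely on 8-bit integers, which cuts memory bandwidth and power relative to FP32 while keeping accuracy acceptable for many deployed models.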

"We can run larger models, more complex models, and run dozens of them in parallel."

"Facebook’s AI director, Misha Smelyanskiy, joined Rao, explaining that the company’s Glow machine learning compiler has been ported to the NNP-I hardware, the implication being that the social media giant has begun to install these devices in at least some of its datacenters . . . but did mention some key inference applications that could be served by the new hardware, including photo tagging, language translation, content recommendation, and spam and fake account detection."

"Rao thinks the largest models today, which contain up to 100 billion parameters, represent something of an inflection point for the industry. At this level, these models are starting to do more than just extract useful information from data; they can now begin to understand that data well enough to turn it into knowledge. According to him, this means that information will have to be applied to past experience, and in that context, drive action. Which sounds suspiciously similar to what humans do. As he admitted though, the human brain has to deal with between 3 trillion and 500 trillion parameters and does this with just 20 watts. “Today, we’re just really scratching the surface,” said Rao."
