Hardware accelerators designed from the ground up to address the needs of deep learning training and inference separately, and at scale. Built for the future, not from the past.
Built solely to train deep learning models at lightning speed, the Intel® Nervana™ Neural Network Processor-L 1000 puts a large amount of high-bandwidth memory (HBM) and local SRAM much closer to where compute actually happens. This means more of the model's parameters can be stored on-die, saving significant power while increasing performance. The Intel® Nervana™ Neural Network Processor features high-speed on- and off-chip interconnects that enable multiple processors to connect card to card and chassis to chassis, acting almost as one efficient chip and scaling to accommodate larger models for deeper insights.
Headed into production in 2019, the Intel® Nervana™ Neural Network Processor-I 1000 is a discrete accelerator designed specifically for the growing complexity and scale of inference applications. The NNP-I 1000 is expected to deliver industry-leading performance per watt on real production workloads and is built on Intel's 10nm process technology with Ice Lake cores to support general operations as well as neural network acceleration.