Intel and Baidu jointly develop the Nervana neural network training processor
Published time: 2019-07-04
At the Baidu AI Developer Conference, Naveen Rao, Intel vice president and general manager of the Artificial Intelligence Products Group, announced that Intel is working with Baidu to develop the Intel® Nervana™ Neural Network Processor for Training (NNP-T). The collaboration includes a new custom accelerator designed to speed the training of deep learning models.
Naveen Rao said: "Over the next few years, the complexity of AI models and the demand for large-scale deep learning compute will explode. Intel and Baidu are extending their decade-plus collaboration, focusing on the joint design and development of new hardware and supporting software to keep advancing toward the new frontier of 'AI 2.0'."
AI is not a single workload but a powerful capability that enhances the performance of every application, whether it runs on a mobile phone or in a large data center. However, mobile phones, data centers, and everything in between have different performance and power requirements, so no single piece of AI hardware can meet them all. Intel offers a broad range of AI hardware and unlocks its full potential through software, helping customers run AI applications wherever their data lives. The NNP-T is a new class of deep learning system hardware built to accelerate large-scale distributed training. Close cooperation with Baidu ensures that Intel's development teams stay aligned with the latest customer demands for training hardware.
Since 2016, Intel has been optimizing Baidu's PaddlePaddle* deep learning framework for Intel® Xeon® Scalable processors. Now, by optimizing the NNP-T for PaddlePaddle as well, the two companies can offer data scientists more hardware choices.
At the same time, Intel is enhancing the performance of these AI solutions with additional technologies. For example, with the higher memory performance offered by Intel® Optane™ DC persistent memory, Baidu can deliver personalized mobile content to millions of users through its Feed Stream* service and provide a highly efficient customer experience via Baidu's AI recommendation engine.
In addition, given the importance of data security to users, Intel is working with Baidu on MesaTEE*, a memory-safe Function-as-a-Service (FaaS) computing framework based on Intel® Software Guard Extensions (Intel® SGX) technology.