Intel and Baidu jointly develop the Nervana neural network training processor
Published time: 2019-12-20 10:46:09
At the Baidu AI Developer Conference, Naveen Rao, Intel vice president and general manager of the Artificial Intelligence Products Group, announced that Intel is working with Baidu on the development of the Intel® Nervana™ Neural Network Processor for Training (NNP-T). The collaboration covers a new, custom-built accelerator designed to train deep learning models quickly at scale.
Naveen Rao said: "In the next few years, the complexity of AI models and the demand for large-scale deep learning compute will explode. Intel and Baidu are extending a collaboration of more than a decade, focusing on jointly designing and developing new hardware and supporting software to keep advancing toward the new frontier of 'AI 2.0'."
AI is not a single workload but a powerful capability that enhances the performance of virtually every application, whether it runs on a mobile phone or in a large data center. Because phones, data centers, and everything in between have different performance and power-consumption requirements, no single piece of AI hardware can serve every need. Intel offers a broad range of hardware options for artificial intelligence and unlocks that hardware's full potential through software, helping customers run AI applications wherever their data lives, no matter how complex. The NNP-T is a new class of purpose-built deep learning hardware that accelerates large-scale distributed training. Close cooperation with Baidu ensures Intel's development work stays aligned with the latest customer requirements for training hardware.
Since 2016, Intel has been optimizing Baidu's PaddlePaddle* deep learning framework for Intel® Xeon® Scalable processors. Now, by also optimizing the NNP-T for PaddlePaddle, the two companies can give data scientists more hardware choices.
At the same time, Intel is enhancing the performance of these AI solutions with additional technologies. For example, with the higher memory performance offered by Intel® Optane™ DC persistent memory, Baidu can deliver personalized mobile content to millions of users through its Feed Stream* services and provide a more efficient customer experience through Baidu's AI recommendation engine.
In addition, given the importance of data security to users, Intel is working with Baidu on MesaTEE*, a memory-safe Function-as-a-Service (FaaS) computing framework based on Intel® Software Guard Extensions (SGX) technology.