Inspur’s AI Inferencing and Open Source Project – Conversations in the Cloud – Episode 202

Published: May 21, 2020, 10:46 a.m.

In this Intel Conversations in the Cloud audio podcast, Vangel Bojaxhi, Global AI & HPC Director at Inspur, joins the show to discuss AI inferencing applications and Inspur’s new solution. Vangel describes the collaboration between Intel and Inspur to deliver optimized AI solutions to customers, as well as Inspur’s open source projects for deep learning inference. As an Intel Select Solution, Inspur’s AI Inferencing solution is a fully optimized, tested, and ready-to-go configuration that reduces deployment time and cost for end users while ensuring scalability.

Explore Inspur’s AI Inferencing and other solutions here:
inspursystems.com

Discover TF2, the open source deep learning inference engine based on FPGA, at:
github.com/TF2-Engine/TF2

Learn about Intel Select Solutions and other performance-optimized configurations at:
intel.com/selectsolutions