Part 1: Overview
Get highlights on performance optimizations and enhancements that are included in this Python* distribution.
Hi. My name is Sergey Maidanov. In this video, we'll be talking about Python* and how it can help accelerate technical computing and machine learning. I will also highlight some key features of Intel® Distribution for Python*. Stay with me to learn more.
Python is known as a popular and powerful language used across many application domains. Being an interpreted language, however, it carries performance constraints that have limited its use to environments where performance is not critical. Python's low efficiency in production creates an organizational challenge when companies and institutions need two distinct teams: one that prototypes a numerical model in Python, and another that rewrites it in a different language to deploy it in production.
Our team's mission at Intel is to bring Python performance up so that a prototyped numerical or machine learning model can be deployed in production without the need to rewrite it in a different programming language. Since our target customers [INAUDIBLE] with development productivity, we aim to deliver performance on Intel architecture out of the box, with relatively small effort on the user's side.
Let me briefly outline what the Intel® Distribution for Python* is and how it brings performance efficiency. We deliver prebuilt Python along with the most popular packages for numerical computing and data science, such as NumPy, SciPy, and scikit-learn*. All are linked with Intel's performance libraries, such as the Intel® Math Kernel Library (Intel® MKL) and the Intel® Data Analytics Acceleration Library (Intel® DAAL), for near-native code speeds. The distribution also comes with productivity tools such as Jupyter* Notebooks and [INAUDIBLE]. It also ships with the conda* and pip* package managers, which let you seamlessly install any other package available in the community.
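To make the MKL linkage concrete, here is a minimal Python check (not from the video) that you can run after installing the distribution; numpy.show_config() simply reports which BLAS/LAPACK backends NumPy was built against:

import numpy as np

# Print the build configuration of the installed NumPy.
# With the NumPy shipped in the Intel Distribution for Python, the
# BLAS/LAPACK sections typically report MKL libraries; a stock
# community build usually reports OpenBLAS or a reference BLAS instead.
np.show_config()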
For machine learning, our distribution comes with optimized deep learning frameworks (Caffe* and Theano*) as well as classic machine learning libraries like scikit-learn and pyDAAL. We also package Cython and Numba* for tuning performance hotspots to native speeds. And for multinode performance, we ship mpi4py (MPI for Python) accelerated with the Intel® MPI Library. The distribution is available in a variety of options, so don't forget to follow the links below to access it.
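As an illustration of the kind of hotspot tuning the speaker mentions (this sketch is not from the video), Numba's @njit decorator compiles a plain Python loop nest to native machine code the first time the function is called:

import numpy as np
from numba import njit

@njit  # compile this function to native machine code on first call
def pairwise_distances(points):
    # Naive O(n^2) pairwise Euclidean distances; explicit loops like
    # these are exactly the kind of hotspot Numba accelerates.
    n, dim = points.shape
    out = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            acc = 0.0
            for k in range(dim):
                diff = points[i, k] - points[j, k]
                acc += diff * diff
            out[i, j] = np.sqrt(acc)
    return out

pts = np.random.rand(500, 3)
dists = pairwise_distances(pts)  # runs at near-native speed after JIT compilation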
Let me illustrate the out-of-the-box performance with a Black-Scholes formula application, run in a prototype environment on an Intel® Core™ processor-based system and in production on Intel® Xeon® and Intel® Xeon Phi™ processor-based servers. The bars show the performance attainable with stock NumPy (the dark blue bars) and with the NumPy shipped in the Intel Distribution for Python (the light bars). You can see that Intel's NumPy delivers significantly better performance on the Intel Core-based system.
But it does so at the relatively small problem sizes shown on the horizontal axis as the total number of options to price. This is typical of a prototype environment: you build and test your model on a relatively small problem first, and then deploy it in production to run at full scale on powerful CPUs.
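For reference, a vectorized Black-Scholes call-pricing routine in NumPy looks roughly like the following. This is only a sketch of the benchmarked workload, not the actual benchmark code, and the parameter ranges are illustrative assumptions:

import numpy as np
from scipy.stats import norm

def black_scholes_call(S, K, T, r, sigma):
    # Vectorized Black-Scholes European call prices; every input may be
    # an array, so one call prices the whole batch of options at once.
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# The batch size corresponds to the "number of options" on the
# benchmark's horizontal axis (values here are illustrative).
n = 1_000_000
S = np.random.uniform(10.0, 50.0, n)   # spot prices
K = np.random.uniform(10.0, 50.0, n)   # strike prices
T = np.random.uniform(1.0, 2.0, n)     # time to maturity, in years
prices = black_scholes_call(S, K, T, r=0.1, sigma=0.2)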
This graph shows how the same application scales in production on an Intel Xeon processor-based server. You can see that the Intel Distribution for Python delivers much better performance and scales well to large problems. Next, this graph shows how the same application scales on an Intel Xeon Phi processor-based system. Here the distribution delivers even better performance on these highly parallel workloads, which scale well for sufficiently large problems.
Besides our own engineering work on the Intel Distribution for Python, we work with all major Python vendors and the open source community to make these optimizations broadly accessible. We encourage you to take advantage of the distribution's exceptional performance in your own numerical and machine learning projects. Every option to get it is free for academic and commercial use, so don't forget to follow the links to access it. And thanks for watching.
Product and Performance Information
Performance varies by use, configuration and other factors. Learn more at www.Intel.com/PerformanceIndex.