Developer Kits with Intel® Xeon® D-2100 Processor Product Family
Preinstalled Software
Get the most out of your hardware with Open Network Edge Services Software (OpenNESS), the Intel® Distribution of OpenVINO* toolkit, CentOS*, and supporting libraries.
Pretrained Models for Acceleration
Choose from a variety of optimized detection and recognition models to develop deep learning applications.
Training Extensions for Deep Learning
Modify, customize, train, and extend computer vision models for deep learning and inference optimization.
Overview
Develop and deploy solutions on Intel's data center processor architecture, built to meet edge computing demands.
- Accelerate network security, routing, and real-time data compression using an Intel® Xeon® D processor in a low-power SoC.
- Use preinstalled, configured, and validated software.
- Get set up quickly using preloaded samples.
- Accelerate workloads on CPUs using the Intel Distribution of OpenVINO toolkit.
- Manage networking applications and services with the optimized and cohesive OpenNESS framework.
- Converge IoT edge and network service capabilities, merging network workloads with inference, analytics, media, and IoT applications on a common infrastructure to shorten time to business outcomes and reduce total cost of ownership.
Who Needs This Product
System integrators, independent software vendors (ISVs), and IoT developers who build vision-based inferencing applications for edge computing that support:
- Wireless communications networks
- Enterprise IT
- Cloud service providers
Reference Implementations
These prebuilt, validated solutions let developers test and deploy computer vision-enabled applications for industrial, retail, and smart city use cases.
IEI* PUZZLE Developer Kit
Software
The following software is preinstalled on the development kits:
CentOS
This open source Linux distribution is preinstalled as the operating system on the developer kit.
Open Network Edge Services Software (OpenNESS)
This open source, multi-access edge computing (MEC) software toolkit enables highly optimized, high-performance edge platforms to onboard and manage applications and network functions with cloud-like agility across any type of network.
Intel® Distribution of OpenVINO* Toolkit
- Enable convolutional neural network-based deep learning inference on the edge.
- Support heterogeneous execution across various accelerators—CPU, GPU, Intel® Movidius™ Neural Compute Stick (NCS), and Intel Vision Accelerator Design products—using a common API (see the sketch following this list).
- Speed up time to market via a library of functions and preoptimized kernels.
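The bullets above describe the toolkit's common inference API. Below is a minimal Python sketch of how a pretrained model might be loaded and compiled for a chosen accelerator on the kit. It assumes an OpenVINO 2022.x-era Python runtime; the model file name (face-detection-retail-0004.xml), input layout, and device strings are illustrative assumptions, not part of the kit's documentation.

    # Minimal sketch: load a pretrained IR model and run it through OpenVINO's common API.
    # Assumes the OpenVINO 2022.x Python runtime; the model path and device strings below
    # are illustrative assumptions, not taken from the developer kit documentation.
    import numpy as np
    from openvino.runtime import Core

    core = Core()
    print("Available devices:", core.available_devices)   # e.g. ['CPU', 'GPU', 'MYRIAD']

    # The same calls work for any supported accelerator; only the device string changes,
    # e.g. "CPU", "GPU", "MYRIAD" (NCS), or "HETERO:GPU,CPU" for heterogeneous execution.
    model = core.read_model("face-detection-retail-0004.xml")   # hypothetical pretrained model path
    compiled = core.compile_model(model, device_name="CPU")

    # Build a dummy input that matches the model's expected NCHW input shape.
    n, c, h, w = tuple(model.input(0).shape)
    frame = np.random.rand(n, c, h, w).astype(np.float32)

    # Run synchronous inference and read back the first output tensor.
    request = compiled.create_infer_request()
    request.infer({0: frame})
    output = request.get_output_tensor(0).data
    print("Output shape:", output.shape)

In practice, the dummy input would be replaced with a preprocessed camera frame, and the device string would be chosen to match the accelerator attached to the kit.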
Edge Software Packages and Tools
- Download the latest software packages for edge solutions (including computer vision and deep learning applications) for Intel architecture.
- Develop, test, deploy, and maintain solutions at the edge with software packages and tools.
- Optimize your computer vision and deep learning applications for Intel architecture with the Intel Distribution of OpenVINO toolkit.
- Maintain and manage your applications with containerized architecture and regular updates.
- Get started quickly with reference implementations, tutorials, and samples.
Get Help
Your success is our success. Access these support resources when you need assistance.
Support for IEI* Puzzle AIoT Developer Kits
Supermicro Services and Support