Ecosystem Developer Hub
Explore how developers from leading software vendors, system integrators, enterprise users, cloud service providers (CSPs), original equipment manufacturers (OEMs), original design manufacturers (ODMs), and more across multiple industries have used Intel® software, tools, and framework optimizations in real-world scenarios. Learn about their firsthand experiences and solutions, created in collaboration with Intel and designed to streamline your development process, enhance performance, and help you drive seamless digital transformation from the edge to the cloud.
Made by developers and for developers.
AI & Machine Learning Ecosystem
Intel is your ally in securely and responsibly bringing AI everywhere. Top software developers across various industries use the Intel® AI Portfolio and machine learning software to enhance compute performance and streamline productivity, enabling them to quickly build scalable platforms, systems, and applications.
HPC & Developer Tools Ecosystem
Industry-leading software developers use Intel® tools and framework optimizations to build software platforms, systems, and applications, making it easier to accelerate software development from the edge to the cloud.
Ecosystem Developers: Video Testimonials
Industry experts share their experiences using Intel tools and frameworks to develop game-changing strategies.
Ecosystem Developers: AI Model Cards and Use Cases
Multi-use
PyG is a library built on PyTorch* that makes it easier to develop graph neural network (GNN) models for structured data applications, including distributed training on the latest Intel® Xeon® platforms for large-scale datasets (a code sketch follows this list).
Optimization
This open source library integrates the OpenVINO™ toolkit, Intel® Neural Compressor, and Intel® Extension for PyTorch* to optimize and accelerate Hugging Face* models on Intel platforms (a code sketch follows this list).
Chatbot
Dolly is a large language model (LLM) chatbot trained on the Databricks machine learning platform, with capability domains drawn from InstructGPT, including brainstorming, classification, generation, and summarization.
Detection
This machine learning benchmark model measures the energy consumption of machine learning pipelines that use AI acceleration from Intel and optimized scikit-learn* algorithms (a code sketch follows this list).
Multi-use
Accelerate model training, evaluation, and deployment pipelines—including complex models and simulations such as fluid dynamics workloads—on the Oracle Cloud* infrastructure built on Intel Xeon Scalable processors.
NLP
Use the Watson* natural language processing (NLP) library to create and embed models for text-processing applications, optimized by the Intel® oneAPI Deep Neural Network Library and Intel® Extension for TensorFlow*.
Chatbot
Prediction Guard uses Hugging Face* Neural-Chat-7B, an AI model built for digital conversations. The model is fine-tuned on Intel® Gaudi® 2 AI accelerators using Intel® Extension for Transformers*.
Optimization
Vectara* uses a hallucination evaluation model in the LLM Hallucination Benchmark Leaderboard to compare the performance and accuracy of popular LLMs such as Neural-Chat-7B.
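To make the PyG card above more concrete, here is a minimal, illustrative sketch of a two-layer graph convolutional network built with PyG's GCNConv layers. The toy graph, feature sizes, and class count are made-up placeholders, not taken from any Intel or PyG reference example.

# Minimal PyG sketch (illustrative only): a two-layer GCN on a toy graph.
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# Toy graph: 3 nodes, 4 directed edges, 8-dimensional node features (placeholders).
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]], dtype=torch.long)
x = torch.randn(3, 8)
data = Data(x=x, edge_index=edge_index)

class GCN(torch.nn.Module):
    def __init__(self, in_channels, hidden_channels, num_classes):
        super().__init__()
        self.conv1 = GCNConv(in_channels, hidden_channels)
        self.conv2 = GCNConv(hidden_channels, num_classes)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)

model = GCN(in_channels=8, hidden_channels=16, num_classes=2)
out = model(data.x, data.edge_index)  # per-node class logits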
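The open source optimization library described above appears to be Hugging Face Optimum Intel; the sketch below shows one common pattern, exporting a Transformers checkpoint to OpenVINO for CPU inference. The model ID is an arbitrary example, and the snippet assumes the optimum-intel package with its OpenVINO extra is installed.

# Minimal Optimum Intel sketch (illustrative only): export a Hugging Face model
# to OpenVINO and run it through a standard transformers pipeline.
from optimum.intel import OVModelForSequenceClassification
from transformers import AutoTokenizer, pipeline

model_id = "distilbert-base-uncased-finetuned-sst-2-english"  # example model, not an endorsement
tokenizer = AutoTokenizer.from_pretrained(model_id)

# export=True converts the PyTorch checkpoint to OpenVINO IR on the fly.
model = OVModelForSequenceClassification.from_pretrained(model_id, export=True)

classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("Intel hardware makes this pipeline fast."))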
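For the scikit-learn energy-benchmark card, the sketch below illustrates the usual way Intel-accelerated scikit-learn is enabled, via the Intel® Extension for Scikit-learn* patch_sklearn() call. The dataset and estimator are arbitrary stand-ins; the benchmark itself may use different workloads.

# Minimal sketch (illustrative only): enable Intel-optimized scikit-learn, then
# train a classifier as usual.
from sklearnex import patch_sklearn
patch_sklearn()  # must run before importing scikit-learn estimators

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.score(X, y))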
Stay In the Know on All Things CODE
Sign up to receive the latest tech articles, tutorials, developer tools, training opportunities, product updates, and more, hand-curated to help you optimize your code, no matter where you are in your developer journey. Take a chance and subscribe. You can change your mind at any time.