Accelerate XGBoost Gradient Boosting Training and Inference
Overview
Gradient boosting is a popular machine learning technique. It's used for regression and classification tasks in a wide range of real-world applications, such as predicting the compressive strength of geological formations or the frequency and severity of infectious diseases.
Despite its popularity, training and inference for gradient boosting algorithms can become slow and resource-intensive due to large datasets, irregular memory access patterns, inefficient memory usage, and a multitude of other issues.
Enter XGBoost, a scalable, portable, and distributed machine learning package for gradient boosted decision trees. It significantly speeds up model training and improves accuracy for better predictions.
This session shows how XGBoost optimizations for Intel® architecture, coupled with the AI Tools, overcome these hurdles. The video includes a live demonstration.
Get the Software
Download XGBoost optimized for Intel architecture as part of the AI Tools—eight tools and frameworks to accelerate end-to-end data science and analytics pipelines.
Accelerate data science and AI pipelines, from preprocessing through machine learning, and provide interoperability for efficient model development.