XGBoost Model Summary and Parameter Tuning

Author: Wang Maolin, Huazhong University of Science and Technology, Datawhale Member

Content Overview

XGBoost Model and Parameter Tuning Summary

  1. XGBoost Principles

  2. XGBoost Advantages Summary

  3. XGBoost Parameter Details

  4. XGBoost Quick Usage

  5. XGBoost Parameter Tuning Methods

PPT download: reply “210502” to the Datawhale official account to obtain it

XGBoost Model Introduction


1. XGBoost Principles

XGBoost is a scalable machine learning system developed in 2016 by Tianqi Chen and collaborators at the University of Washington. Strictly speaking, XGBoost is not a model but a software library that lets users easily solve classification, regression, and ranking problems. Internally it implements the gradient boosted decision tree (GBDT) model and optimizes many of the algorithms involved, achieving high accuracy while remaining extremely fast.
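For reference (this equation comes from the XGBoost paper by Chen & Guestrin, 2016, not from the original slides), the regularized objective the system minimizes is a training loss plus a complexity penalty on every tree:

\[
\mathcal{L}(\phi) = \sum_i l(\hat{y}_i, y_i) + \sum_k \Omega(f_k),
\qquad
\Omega(f) = \gamma T + \tfrac{1}{2}\lambda \lVert w \rVert^2
\]

where l is the training loss, f_k is the k-th tree, T is a tree's number of leaves, and w its leaf weights; the penalty coefficients γ and λ are exposed as the library's gamma and lambda (reg_lambda) parameters.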

2. XGBoost Advantages Summary

[Slide: summary of XGBoost's advantages; see the PPT above.]

XGBoost Model Parameter Tuning

1. XGBoost Parameter Details
1. General Parameters

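The slides listing these parameters are not reproduced here. As a minimal sketch (the parameter names are from the xgboost library; the values are illustrative, not the article's), the general parameters control which booster is used and how the library runs:

params = {
    "booster": "gbtree",  # which booster to use: gbtree (trees), gblinear, or dart
    "nthread": 4,         # number of parallel threads used for training
    "verbosity": 1,       # logging level: 0 silent, 1 warning, 2 info, 3 debug
}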

2. Learning Objective Parameters

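Again as an illustrative sketch, the learning objective parameters tell XGBoost what kind of problem it is solving and how to evaluate it:

params.update({
    "objective": "binary:logistic",  # task: e.g. reg:squarederror, binary:logistic, multi:softmax
    "eval_metric": "auc",            # validation metric: e.g. rmse, logloss, auc, merror
    "seed": 42,                      # random seed, for reproducible results
})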

3. Package Parameters

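The slide for this subsection is also lost. Assuming "package parameters" refers to the arguments of the training call itself, as opposed to the booster hyperparameters above, a sketch with the native API follows (dtrain and dvalid are assumed to be xgb.DMatrix objects built from your data):

import xgboost as xgb

booster = xgb.train(
    params,                     # the parameter dict assembled above
    dtrain,                     # training set, an xgb.DMatrix (assumed defined)
    num_boost_round=100,        # number of boosting rounds, i.e. trees to build
    evals=[(dvalid, "valid")],  # watchlist evaluated each round (assumed defined)
    early_stopping_rounds=10,   # stop if the metric has not improved for 10 rounds
)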

2. XGBoost Quick Usage

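The original code screenshots are not reproduced; a self-contained equivalent uses the scikit-learn wrapper, with sklearn's breast-cancer dataset standing in for the article's data:

import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Load a toy binary-classification dataset and hold out a test split
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Fit an XGBoost classifier through the scikit-learn interface
model = xgb.XGBClassifier(n_estimators=100, max_depth=5, learning_rate=0.1)
model.fit(X_train, y_train)

# Predict on the held-out split and report accuracy
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))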

3. XGBoost Parameter Tuning Methods (Bayesian Optimization)

Hyperopt is a Python library for serial and parallel optimization over search spaces, which may include real-valued, discrete, and conditional dimensions.

1. Initialize the search space, i.e. the range of values for each parameter

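A sketch of such a search space over common XGBoost hyperparameters (the ranges are illustrative, not the article's):

from hyperopt import hp

space = {
    "max_depth": hp.choice("max_depth", list(range(3, 11))),         # tree depth, 3..10
    "learning_rate": hp.uniform("learning_rate", 0.01, 0.3),         # shrinkage (eta)
    "subsample": hp.uniform("subsample", 0.5, 1.0),                  # row sampling ratio
    "colsample_bytree": hp.uniform("colsample_bytree", 0.5, 1.0),    # column sampling ratio
    "min_child_weight": hp.quniform("min_child_weight", 1, 10, 1),   # min child weight, in steps of 1
}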

2. Define the objective function

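A sketch of the objective: hyperopt minimizes its return value, so we return the negative cross-validated AUC (X_train and y_train are assumed to be defined, e.g. as in the quick-usage example above):

import xgboost as xgb
from hyperopt import STATUS_OK
from sklearn.model_selection import cross_val_score

def objective(params):
    # hp.quniform yields floats, so cast integer-valued parameters back to int
    params["min_child_weight"] = int(params["min_child_weight"])
    model = xgb.XGBClassifier(n_estimators=100, **params)
    # 5-fold cross-validated AUC; negate it because hyperopt minimizes the loss
    score = cross_val_score(model, X_train, y_train, cv=5, scoring="roc_auc").mean()
    return {"loss": -score, "status": STATUS_OK}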

3. Run the hyperopt fmin function

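Finally, fmin ties the space and the objective together; tpe.suggest is hyperopt's Tree-structured Parzen Estimator, the Bayesian optimization algorithm referred to above:

from hyperopt import Trials, fmin, tpe

trials = Trials()  # records every evaluation for later inspection
best = fmin(
    fn=objective,      # the objective defined in step 2
    space=space,       # the search space defined in step 1
    algo=tpe.suggest,  # TPE-based Bayesian optimization
    max_evals=50,      # number of parameter settings to try
    trials=trials,
)
print("best parameters:", best)

Note that for hp.choice dimensions, fmin reports the index of the chosen option rather than the value itself; hyperopt's space_eval(space, best) recovers the actual values.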

About the Author

Wang Maolin is a key contributor to Datawhale and an author of the open-source content for the Datawhale & Tianchi data mining learning competition, which has been read over 100,000 times.

He has taken part in over 30 competitions, finishing runner-up in the DCIC Digital China Innovation and Entrepreneurship Competition and placing in the Top 10 several times in events such as the Global Urban Computing AI Challenge and the Alibaba Cloud German AI Challenge.
