Content Overview: XGBoost Model and Parameter Tuning Summary

- XGBoost Principles
- XGBoost Advantages Summary
- XGBoost Parameter Details
- XGBoost Quick Usage
- XGBoost Parameter Tuning Methods
XGBoost Model Introduction
XGBoost is a scalable machine learning system developed in 2016 by Tianqi Chen at the University of Washington. Strictly speaking, XGBoost is not a single model but a software package that lets users easily solve classification, regression, and ranking problems. Internally it implements the gradient boosting decision tree (GBDT) algorithm with many algorithmic optimizations, achieving high accuracy while remaining extremely fast.
XGBoost Model Parameter Tuning
2. Learning Objective Parameters
3. Package Parameters
Hyperopt is a Python library for serial and parallel hyperparameter optimization over search spaces, which may include real-valued, discrete, and conditional dimensions. A typical workflow:
1. Initialize the range of values needed for the space
2. Define the objective function
3. Run the hyperopt optimization function
About the Author
Wang Maolin is a key contributor to Datawhale and to the open-source content for the Datawhale & Tianchi data mining learning competition, with over 100,000 reads.
He has participated in over 30 competitions, finishing as runner-up in the DCIC Digital China Innovation and Entrepreneurship Competition and placing in the Top 10 multiple times in the Global Urban Computing AI Challenge, the Alibaba Cloud German AI Challenge, and others.
