Regression Tree Ensembles
A regression tree ensemble is a predictive model composed of a weighted combination of multiple regression trees. In general, combining multiple regression trees increases predictive performance. To boost regression trees using LSBoost, use fitrensemble. To bag regression trees or to grow a random forest, use fitrensemble or TreeBagger. To implement quantile regression using a bag of regression trees, use TreeBagger.
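For orientation, the following minimal sketch (not taken from this page) shows these calls side by side using the carsmall example data set that ships with the toolbox; the variable names, number of trees, and query point are illustrative assumptions.

```matlab
load carsmall                      % example data shipped with the toolbox
X = [Horsepower Weight];           % predictor matrix
Y = MPG;                           % response vector

% Boost 100 regression trees with LSBoost.
boostedMdl = fitrensemble(X,Y,"Method","LSBoost","NumLearningCycles",100);

% Bag regression trees (grow a random forest) with fitrensemble ...
baggedMdl = fitrensemble(X,Y,"Method","Bag");

% ... or with TreeBagger, which also supports quantile regression.
rfMdl = TreeBagger(100,X,Y,"Method","regression");

% Predict the response for a new observation with each ensemble.
yBoost = predict(boostedMdl,[150 3000]);
yBag   = predict(rfMdl,[150 3000]);
```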
For classification ensembles, such as boosted or bagged classification trees, random subspace ensembles, or error-correcting output codes (ECOC) models for multiclass classification, see Classification Ensembles.
Apps
| Regression Learner | Train regression models to predict data using supervised machine learning | 
Blocks
| RegressionEnsemble Predict | Predict responses using ensemble of decision trees for regression (Since R2021a) | 
Functions
Objects
Topics
- Ensemble Algorithms
Learn about different algorithms for ensemble learning.
 - Framework for Ensemble Learning
Obtain highly accurate predictions by using many weak learners.
 - Train Regression Ensemble
Train a simple regression ensemble.
 - Test Ensemble Quality
Learn methods to evaluate the predictive quality of an ensemble.
 - Select Predictors for Random Forests
Select split predictors for random forests using the interaction test algorithm.
 - Ensemble Regularization
Automatically choose fewer weak learners for an ensemble in a way that does not diminish predictive performance.
 - Bootstrap Aggregation (Bagging) of Regression Trees Using TreeBagger
Create a TreeBagger ensemble for regression.
 - Use Parallel Processing for Regression TreeBagger Workflow
Speed up computation by running TreeBagger in parallel.
 - Working with Quantile Regression Models
Estimate prediction intervals and create models that are robust to outliers by using quantile regression models. A minimal TreeBagger sketch appears after this list.
 - Detect Outliers Using Quantile Regression
Detect outliers in data using quantile random forest.
 - Conditional Quantile Estimation Using Kernel Smoothing
Estimate conditional quantiles of a response given predictor data using quantile random forest and by estimating the conditional distribution function of the response using kernel smoothing.
 - Tune Random Forest Using Quantile Error and Bayesian Optimization
Tune quantile random forest using Bayesian optimization.
 - Predict Responses Using RegressionEnsemble Predict Block
Train a regression ensemble model with optimal hyperparameters, and then use the RegressionEnsemble Predict block for response prediction.
 - Manually Perform Time Series Forecasting Using Ensembles of Boosted Regression Trees
Manually perform single-step and multiple-step time series forecasting with ensembles of boosted regression trees.
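As a companion to the quantile regression topics above, here is a minimal, illustrative sketch (not from this page) that grows a bag of regression trees with TreeBagger and estimates conditional quantiles with its quantilePredict method to form a rough 90% prediction interval; the data set, number of trees, and query point are assumptions.

```matlab
load carsmall                       % example data shipped with the toolbox
X = [Horsepower Weight];
Y = MPG;

% Grow a bag of 200 regression trees.
Mdl = TreeBagger(200,X,Y,"Method","regression");

% Estimate the 0.05, 0.5, and 0.95 conditional quantiles at a query point.
% The outer two quantiles give an approximate 90% prediction interval.
q = quantilePredict(Mdl,[150 3000],"Quantile",[0.05 0.5 0.95]);
```

The quantile-based workflows linked above (outlier detection and tuning with quantile error) start from the same kind of TreeBagger regression model.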