1. Using XGBoost to construct features
1) xgbmodel.apply(self, X, ntree_limit=0)
Returns the predicted leaf index of every tree for each sample.
X: the input feature matrix.
ntree_limit: limits the number of trees used in the prediction; defaults to 0 (use all trees).
def apply(self, X, ntree_limit=0):
    """Return the predicted leaf every tree for each sample.

    Parameters
    ----------
    X : array_like, shape=[n_samples, n_features]
        Input features matrix.

    ntree_limit : int
        Limit number of trees in the prediction; defaults to 0 (use all trees).

    Returns
    -------
    X_leaves : array_like, shape=[n_samples, n_trees]
        For each datapoint x in X and for each tree, return the index of the
        leaf x ends up in. Leaves are numbered within
        [0; 2**(self.max_depth+1)), possibly with gaps in the numbering.
    """
    test_dmatrix = DMatrix(X, missing=self.missing)
    return self.get_booster().predict(test_dmatrix,
                                      pred_leaf=True,
                                      ntree_limit=ntree_limit)
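As a quick illustration (not from the original post), here is a minimal sketch of calling apply() through the scikit-learn wrapper shown above. The data is made up, and the hyperparameters (n_estimators=10, max_depth=3) are arbitrary choices for the example:

import numpy as np
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.random((100, 5))           # 100 samples, 5 features (toy data)
y = rng.integers(0, 2, 100)        # binary labels (toy data)

model = XGBClassifier(n_estimators=10, max_depth=3)
model.fit(X, y)

leaves = model.apply(X)            # one leaf index per sample per tree
print(leaves.shape)                # (100, 10): [n_samples, n_trees]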
2. The difference between GBDT and GBDT+LR
My understanding is as follows:
GBDT: each new tree fits the residual between the actual value and the predictions so far (i.e. fitting (y - ŷ1) - ŷ2, and so on).
GBDT+LR: the leaf index that each tree of the GBDT assigns to a sample is taken as a (one-hot) feature, and a linear model (LR) is then trained on top of these features, so the combination weights are learned automatically; see the sketch below.
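A minimal sketch of the GBDT+LR idea, assuming toy data and arbitrary hyperparameters (none of this comes from the original post): the leaf indices returned by apply() are one-hot encoded, and a logistic regression is fitted on them to learn a weight for every leaf.

import numpy as np
from xgboost import XGBClassifier
from sklearn.preprocessing import OneHotEncoder
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.random((200, 5))                        # toy data
y = rng.integers(0, 2, 200)                     # toy labels

gbdt = XGBClassifier(n_estimators=20, max_depth=3)
gbdt.fit(X, y)

leaves = gbdt.apply(X)                          # [n_samples, n_trees] leaf indices
encoder = OneHotEncoder()
leaf_features = encoder.fit_transform(leaves)   # one column per (tree, leaf) pair

lr = LogisticRegression(max_iter=1000)
lr.fit(leaf_features, y)                        # LR learns the leaf weights

In practice the encoder and LR should be fitted on a separate split from the one used to train the GBDT, to avoid overfitting the leaf features.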
These are a beginner's notes, written to help my own later study; I will keep revising them as I learn. If anything here is wrong, please correct me.