
Training a Decision Tree Regression Model: A Fast-to-Train Piecewise Regression Model Based on Decision Trees

By default, the split criterion is mean squared error (MSE).


# Train a decision tree regression model; MSE is the default split criterion
from sklearn.tree import DecisionTreeRegressor
from sklearn import datasets

# Note: load_boston was removed in scikit-learn 1.2 (see the alternative sketch below)
boston = datasets.load_boston()
features = boston.data[:, 0:2]   # use only the first two features
target = boston.target

decisiontree = DecisionTreeRegressor(random_state=0)
model = decisiontree.fit(features, target)

# Predict the target for a new observation
observation = [[0.02, 16]]
model.predict(observation)
array([33.])
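The Boston housing dataset was deprecated in scikit-learn 1.0 and removed in 1.2, so load_boston is unavailable in recent versions. Below is a minimal sketch of the same recipe using fetch_california_housing instead; the feature slice and the observation values are illustrative assumptions, not part of the original example.

# Sketch: the same recipe on the California housing dataset (downloads on first use)
from sklearn.datasets import fetch_california_housing
from sklearn.tree import DecisionTreeRegressor

housing = fetch_california_housing()
features = housing.data[:, 0:2]   # first two features (MedInc, HouseAge), mirroring the original
target = housing.target

decisiontree = DecisionTreeRegressor(random_state=0)
model = decisiontree.fit(features, target)

# One hypothetical observation: median income ~3.0, house age ~25 years
print(model.predict([[3.0, 25.0]]))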
Discussion
Decision tree regression works similarly to decision tree classification; however, instead of reducing Gini impurity or entropy, potential splits are by default measured by how much they reduce mean squared error (MSE):
$$\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - \hat{y}_i\bigr)^2$$

where $y_i$ is the true value of the target and $\hat{y}_i$ is the predicted value.
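To connect the formula to the fitted model, the training-set MSE can be computed directly from the formula or with scikit-learn's mean_squared_error helper. This quick check assumes the model, features, and target variables from the example above are still in scope.

# Sketch: training-set MSE, from the formula and via sklearn.metrics
import numpy as np
from sklearn.metrics import mean_squared_error

predictions = model.predict(features)               # y-hat for every training observation
mse_manual = np.mean((target - predictions) ** 2)   # (1/n) * sum of squared errors
mse_sklearn = mean_squared_error(target, predictions)
print(mse_manual, mse_sklearn)                      # the two values should match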

We can use the criterion parameter to select the desired measure of split quality. For example, we can construct a tree whose splits reduce mean absolute error (MAE):

# MAE (mean absolute error) can also be used as the split criterion
decisiontree_mae = DecisionTreeRegressor(criterion="mae", random_state=0)
model_mae = decisiontree_mae.fit(features, target)
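In scikit-learn 1.0 the criterion names were changed ("mse" became "squared_error" and "mae" became "absolute_error"), and the old aliases were removed in 1.2. A minimal equivalent for current versions:

# Sketch: the same MAE-split tree with the current criterion name
from sklearn.tree import DecisionTreeRegressor

decisiontree_mae = DecisionTreeRegressor(criterion="absolute_error", random_state=0)
model_mae = decisiontree_mae.fit(features, target)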