
Error when using XGBoost: TypeError: 'str' object is not callable

The error is raised by the feature-importance step of the following modelfit function:
import xgboost as xgb
import pandas as pd
import matplotlib.pyplot as plt
from sklearn import metrics

def modelfit(alg, dtrain, dtest, predictors, useTrainCV=True, cv_folds=3, early_stopping_rounds=50):
    if useTrainCV:
        # Cross-validate to pick the number of boosting rounds
        xgb_param = alg.get_xgb_params()
        xgtrain = xgb.DMatrix(dtrain[predictors], label=dtrain[target])  # `target` is the label column name, defined at module level
        xgtest = xgb.DMatrix(dtest[predictors].values)
        cvresult = xgb.cv(xgb_param, xgtrain, num_boost_round=alg.get_params()['n_estimators'],
                          nfold=cv_folds, metrics='auc', early_stopping_rounds=early_stopping_rounds)
        print(cvresult.shape[0])
        alg.set_params(n_estimators=cvresult.shape[0])

    # Fit the algorithm on the data
    alg.fit(dtrain[predictors], dtrain['Disbursed'], eval_metric='auc')

    # Predict training set:
    dtrain_predictions = alg.predict(dtrain[predictors])
    dtrain_predprob = alg.predict_proba(dtrain[predictors])[:, 1]

    # Print model report:
    print("\nModel Report")
    print("Accuracy : %.4g" % metrics.accuracy_score(dtrain['Disbursed'].values, dtrain_predictions))
    print("AUC Score (Train): %f" % metrics.roc_auc_score(dtrain['Disbursed'], dtrain_predprob))

    # Plot feature importances taken from the fitted booster
    feat_imp = pd.Series(alg.get_booster().get_fscore()).sort_values(ascending=False)
    feat_imp.plot(kind='bar', title='Feature Importances')
    plt.ylabel('Feature Importance Score')
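
For orientation, here is a hypothetical way to call the function above; the file names, column layout and hyperparameters are assumptions on my part (the original post only shows the function), and target must be defined at module level because modelfit reads it as a global:

import pandas as pd
import xgboost as xgb

# Hypothetical data loading: both frames are assumed to contain the predictor columns,
# and the training frame a binary 'Disbursed' label column.
dtrain = pd.read_csv('train.csv')
dtest = pd.read_csv('test.csv')

target = 'Disbursed'
predictors = [col for col in dtrain.columns if col != target]

xgb1 = xgb.XGBClassifier(
    learning_rate=0.1,
    n_estimators=1000,        # upper bound; xgb.cv inside modelfit trims it down
    max_depth=5,
    objective='binary:logistic',
)
modelfit(xgb1, dtrain, dtest, predictors)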

Solution:

Change alg.booster() to alg.get_booster(). (The feat_imp line in the listing above already shows the corrected call.)
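
As a minimal sketch of why the old call fails: in recent xgboost releases the sklearn wrapper no longer has a booster() method; the name refers to the booster hyperparameter value instead, so calling it raises the TypeError from the title. The toy data and model below are illustrative only:

import numpy as np
import xgboost as xgb

# Toy data, purely illustrative
X = np.random.rand(100, 4)
y = np.random.randint(0, 2, size=100)

clf = xgb.XGBClassifier(n_estimators=10)
clf.fit(X, y)

# clf.booster is now the hyperparameter value (typically the string 'gbtree'),
# so clf.booster() raises: TypeError: 'str' object is not callable
booster = clf.get_booster()      # correct: returns the underlying xgboost.Booster
print(booster.get_fscore())      # per-feature split counts, e.g. {'f0': ..., 'f2': ...}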

Reposted from http://www.cnblogs.com/xiaodongsuibi/p/9303959.html
