
Java Machine Learning Library (Java-ML), Part 10: Model Selection Criteria AIC and BIC


        Most models built for learning tasks are parametric and use a likelihood function as the objective. With enough training data, model accuracy can be pushed ever higher, but only at the cost of increased model complexity, which brings on one of the most common problems in machine learning: overfitting. Model selection is about finding the best balance between model complexity and the model's ability to describe the dataset (i.e., the likelihood).

        Overfitting can be mitigated by adding a penalty term for model complexity. Using code from the Java-ML library, this article introduces two model selection criteria: the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC).

1) AIC is a measure of the goodness of fit of a statistical model. Proposed by the Japanese statistician Hirotugu Akaike in 1974, it is grounded in the concept of entropy and provides a standard for trading off the complexity of an estimated model against how well it fits the data.


AIC is usually defined as:


AIC = 2k - 2ln(L)


where k is the number of model parameters and L is the likelihood function. When choosing the best model from a set of candidates, the model with the smallest AIC is usually preferred.


When two models differ substantially, the difference shows up mainly in the likelihood term. When the likelihoods are not significantly different, the first term of the formula above, i.e., the model complexity penalty, becomes decisive, so the model with fewer parameters is the better choice.


In general, as model complexity increases (k grows), the likelihood L also increases, which drives AIC down; but once k becomes too large, the likelihood grows more slowly and AIC starts to rise again, and an overly complex model is prone to overfitting. The goal is to pick the model with the smallest AIC: AIC rewards goodness of fit (maximum likelihood) but also introduces a penalty term that keeps the number of parameters as small as possible, which helps reduce the risk of overfitting.
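
As a minimal sketch (not taken from Java-ML; the parameter counts and log-likelihood values are made up for illustration), the following Java snippet simply evaluates the AIC formula above for two hypothetical models whose likelihoods are close, so the penalty term decides:

public class AicSketch {

    // AIC = 2k - 2*ln(L), expressed in terms of the log-likelihood ln(L).
    static double aic(int k, double logLikelihood) {
        return 2.0 * k - 2.0 * logLikelihood;
    }

    public static void main(String[] args) {
        // Hypothetical numbers: two fitted models with similar likelihoods.
        double aicSimple = aic(3, -120.5);   // 3 parameters
        double aicComplex = aic(8, -118.9);  // 8 parameters, slightly better fit
        System.out.println("AIC (3 parameters): " + aicSimple);   // 247.0
        System.out.println("AIC (8 parameters): " + aicComplex);  // 253.8
        // The extra parameters do not buy enough likelihood, so the
        // smaller model has the lower AIC and is preferred.
    }
}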


2) The Bayesian Information Criterion (BIC), proposed by Schwarz in 1978, is similar to AIC and is likewise used for model selection. When training a model, adding parameters, i.e., increasing model complexity, increases the likelihood but also invites overfitting. To address this, both AIC and BIC introduce a penalty term tied to the number of model parameters. BIC's penalty is larger than AIC's because it also takes the sample size into account, so when the sample size is large it effectively prevents the model from becoming excessively complex in pursuit of accuracy. BIC is defined as:


BIC = k·ln(n) - 2ln(L)


where k is the number of model parameters, n is the number of samples, and L is the likelihood function. When the dimensionality is high and the training data are relatively scarce, the k·ln(n) penalty term effectively helps avoid the curse of dimensionality.
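
To make the difference between the two penalties concrete, the following sketch (again not part of Java-ML; k, n, and the log-likelihood are hypothetical values standing in for the symbols above) evaluates both criteria for one model. AIC's penalty is a constant 2k, while BIC's k·ln(n) grows with the sample size n:

public class BicSketch {

    // BIC = k*ln(n) - 2*ln(L)
    static double bic(int k, int n, double logLikelihood) {
        return k * Math.log(n) - 2.0 * logLikelihood;
    }

    // AIC = 2k - 2*ln(L)
    static double aic(int k, double logLikelihood) {
        return 2.0 * k - 2.0 * logLikelihood;
    }

    public static void main(String[] args) {
        int k = 8;             // number of model parameters (hypothetical)
        int n = 1000;          // number of training samples (hypothetical)
        double logL = -118.9;  // maximized log-likelihood (hypothetical)
        System.out.println("AIC penalty 2k      = " + (2.0 * k));          // 16.0
        System.out.println("BIC penalty k*ln(n) = " + (k * Math.log(n)));  // about 55.3
        System.out.println("AIC = " + aic(k, logL));
        System.out.println("BIC = " + bic(k, n, logL));
    }
}

With n = 1000 and k = 8, the BIC penalty is roughly 55.3 versus 16 for AIC, which is why BIC favors simpler models more strongly as the dataset grows.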


The following example from the Java-ML library uses these criteria for model selection among clustering results:

/**
 * This file is part of the Java Machine Learning Library
 *
 * The Java Machine Learning Library is free software; you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation; either version 2 of the License, or
 * (at your option) any later version.
 *
 * The Java Machine Learning Library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU General Public License for more details.
 *
 * You should have received a copy of the GNU General Public License
 * along with the Java Machine Learning Library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
 *
 * Copyright (c) 2006-2012, Thomas Abeel
 *
 * Project: http://java-ml.sourceforge.net/
 */
package com.gddx;

import java.io.File;

import net.sf.javaml.clustering.Clusterer;
import net.sf.javaml.clustering.KMeans;
import net.sf.javaml.clustering.evaluation.AICScore;
import net.sf.javaml.clustering.evaluation.BICScore;
import net.sf.javaml.clustering.evaluation.ClusterEvaluation;
import net.sf.javaml.clustering.evaluation.SumOfSquaredErrors;
import net.sf.javaml.core.Dataset;
import net.sf.javaml.tools.data.FileHandler;

/**
 * Shows how to use the different cluster evaluation measures that are
 * implemented in Java-ML.
 *
 * @see net.sf.javaml.clustering.evaluation.*
 *
 * @author Thomas Abeel
 */
public class TutorialClusterEvaluation {

    public static void main(String[] args) throws Exception {
        /* Load a dataset */
        Dataset data = FileHandler.loadDataset(new File("D:\\tmp\\javaml-0.1.7-src\\UCI-small\\iris\\iris.data"), 4, ",");
        /*
         * Create a new instance of the KMeans algorithm that will create 3
         * clusters and create one that will make 4 clusters.
         */
        Clusterer km3 = new KMeans(3);
        Clusterer km4 = new KMeans(4);
        /*
         * Cluster the data, we will create 3 and 4 clusters.
         */
        Dataset[] clusters3 = km3.cluster(data);
        Dataset[] clusters4 = km4.cluster(data);
        /* Evaluate both clusterings with AIC, BIC and the sum of squared errors. */
        ClusterEvaluation aic = new AICScore();
        ClusterEvaluation bic = new BICScore();
        ClusterEvaluation sse = new SumOfSquaredErrors();
        double aicScore3 = aic.score(clusters3);
        double bicScore3 = bic.score(clusters3);
        double sseScore3 = sse.score(clusters3);
        double aicScore4 = aic.score(clusters4);
        double bicScore4 = bic.score(clusters4);
        double sseScore4 = sse.score(clusters4);
        System.out.println("AIC score: " + aicScore3 + "\t" + aicScore4);
        System.out.println("BIC score: " + bicScore3 + "\t" + bicScore4);
        System.out.println("Sum of squared errors: " + sseScore3 + "\t" + sseScore4);
    }
}
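
When this program runs, each criterion prints one score for the 3-cluster result and one for the 4-cluster result. Following the selection rule described above, and assuming Java-ML reports the scores in the standard orientation, the clustering with the smaller AIC (or BIC) is the preferred model; the sum of squared errors is shown alongside as a plain goodness-of-fit measure with no complexity penalty.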

