
Multiclass Classification Strategies and Multinomial Logistic Regression on Spark

Background

A previous article covered the mathematics behind multinomial logistic regression (see that earlier post for the derivation). This article looks at multiclass classification in Spark, covering the following points:

  • Multiclass strategies
    • One-vs-One (OvO)
    • One-vs-Rest (OvR)
    • Many-vs-Many (MvM)
  • Multiclass implementations in Spark

Multiclass strategies

One-vs-One

Suppose a problem has N classes. Pairing the classes two at a time turns the problem into binary classification and yields N(N-1)/2 binary classifiers, so training requires N(N-1)/2 models. At prediction time the sample is fed to all of these models, and the class predicted most often is taken as the final result.

Suppose there are four classes: A, B, C, and D. One-vs-One multiclass classification then looks like the figure below:
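The voting step above can be sketched in a few lines of plain Scala (no Spark needed); the object and method names here are illustrative, not from any library:

```scala
// One-vs-One voting sketch: each of the N(N-1)/2 pairwise classifiers
// emits a winner, and the class with the most votes is the final label.
object OneVsOneVote {
  def majority(votes: Seq[String]): String =
    votes.groupBy(identity).maxBy { case (_, vs) => vs.size }._1

  def main(args: Array[String]): Unit = {
    // Four classes A..D give 4*3/2 = 6 pairwise classifiers; suppose they voted:
    val votes = Seq("A", "A", "B", "A", "C", "D")
    println(majority(votes)) // A has 3 of the 6 votes, so A wins
  }
}
```

In a real system each vote would come from one trained pairwise model; here the votes are hard-coded to show only the aggregation.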

One-vs-Rest

One-vs-Rest treats one class as positive and all remaining classes as negative in each round, so N classes yield N binary classifiers. At prediction time:

  • If exactly one classifier predicts the sample as positive, that classifier's class is the final result
  • If exactly one classifier predicts the sample as negative, the result is decided by that vote's complement in the same way
  • If the number of positive (or negative) predictions is not unique, the class whose classifier reports the highest probability is taken as the final result

One-vs-Rest multiclass classification is illustrated in the figure below:

 
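The decision rule above can be sketched in plain Scala; each classifier is represented only by its positive-class probability, and all names are illustrative:

```scala
// One-vs-Rest decision sketch: scores(i) is classifier i's probability
// that the sample belongs to class i.
object OneVsRestDecision {
  def decide(scores: Seq[Double], threshold: Double = 0.5): Int = {
    val positives = scores.zipWithIndex.filter { case (p, _) => p >= threshold }
    if (positives.size == 1) positives.head._2 // exactly one classifier claims the sample
    else scores.zipWithIndex.maxBy(_._1)._2    // ambiguous: highest confidence wins
  }
}
```

For example, `decide(Seq(0.2, 0.7, 0.1))` picks class 1 because only classifier 1 fires; `decide(Seq(0.6, 0.8, 0.1))` has two positives and falls back to the most confident one, again class 1.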

Many-vs-Many

Several classes are grouped together as positive and several others as negative. Clearly the positive/negative groupings must be designed carefully rather than chosen arbitrarily; Zhou Zhihua's "watermelon book" introduces Error-Correcting Output Codes (ECOC) for this purpose.

ECOC works in two main steps:

  • Encoding: perform M partitions of the N classes; each partition puts some classes on the positive side and the rest on the negative side, defining one binary classification problem. This produces M training sets and M trained classifiers.
  • Decoding: the M classifiers each predict on the test sample, and these predictions form a code. The predicted code is compared against each class's own codeword, and the class at the smallest distance is returned as the final prediction.

 

In panel (a) of the figure above, class C1's codeword across the 5 classifiers is [-1, +1, -1, +1, +1], while the test example's code after the 5 classifiers is [-1, -1, +1, -1, +1]. Comparing the two codes, three positions differ, so the Hamming distance is 3; the test example's Hamming and Euclidean distances to the other classes are computed the same way.

Panel (b) adds a "0" class, i.e. an abstain option, to panel (a). When computing the Hamming distance, the distance between an abstained bit and the test example's bit is counted as 0.5 (in my view this value could be tuned dynamically), while the Euclidean distance simply treats the abstained bit as a 0 value.
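Hamming-distance decoding can be sketched in plain Scala using the two codes from the example above (object and method names are illustrative):

```scala
// ECOC decoding sketch: compare the predicted code against every class's
// codeword and return the class at the smallest Hamming distance.
object EcocDecode {
  def hamming(a: Seq[Int], b: Seq[Int]): Int =
    a.zip(b).count { case (x, y) => x != y }

  def decode(codebook: Map[String, Seq[Int]], predicted: Seq[Int]): String =
    codebook.minBy { case (_, code) => hamming(code, predicted) }._1
}
```

With C1's codeword [-1, +1, -1, +1, +1] and the predicted code [-1, -1, +1, -1, +1], `hamming` returns 3, matching the worked example; `decode` then returns whichever class in the (hypothetical) codebook has the smallest such distance.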

Please credit the source when reposting: http://blog.csdn.net/gamer_gyt
Weibo: http://weibo.com/234654758
Github: https://github.com/thinkgamer

Personal site: http://thinkgamer.github.io

Introducing OneVsRest

OneVsRest efficiently extends a given binary classification algorithm to multiclass problems; it is also known as the "One-vs-All" algorithm. In Spark ML, OneVsRest is an Estimator: it takes a base Classifier and creates one binary classification problem for each of the k classes. The classifier for class i learns to predict whether the label is i or not, separating class i from all other classes. At prediction time all k binary classifiers are evaluated, and the label of the most confident classifier is taken as the prediction.

This corresponds to One-vs-Rest in the "Multiclass strategies" section above.

Multiclass implementations in Spark

Implementation based on LogisticRegression from the ml package

package classifiy

import org.apache.spark.sql.SparkSession
import org.apache.spark.ml.classification.{LogisticRegression, OneVsRest}
import org.apache.spark.ml.evaluation.MulticlassClassificationEvaluator

/**
  * @Program: ranking
  * @Description: Multinomial logistic regression on Spark
  * @Author: Thinkgamer
  * @Create: 2019-01-07 17:48
  **/
object MultiClassLR {
  def main(args: Array[String]): Unit = {
    val input_data = "data/sample_multiclass_classification_data.txt" // args(0)
    val spark = SparkSession.builder.master("local[5]")
      .appName("MulticlassLRWithElasticNetExample")
      .getOrCreate()
    // runBaseLR(spark, input_data)
    runBaseOneVsRest(spark, input_data)
    spark.stop()
  }

  /**
    * @Author: Thinkgamer
    * @Param: spark: SparkSession, input_data: String
    * @Return: Unit
    * @Date: 2019/1/11 18:26
    * @Desc: Train and test with the LR from the Spark ml package
    **/
  def runBaseLR(spark: SparkSession, input_data: String): Unit = {
    // Load the libsvm-format data and split it 50/50 into train and test sets
    val split = spark.read.format("libsvm").load(input_data).randomSplit(Array(1, 1))
    val train_data = split(0)
    val test_data = split(1)
    // Build the model
    val lr = new LogisticRegression().setMaxIter(20).setRegParam(0.3).setElasticNetParam(0.8)
    // Train it
    val model = lr.fit(train_data)
    // Coefficient matrix and intercept vector
    println(s"coefficientMatrix is: \n ${model.coefficientMatrix}")
    println(s"interceptVector is: \n ${model.interceptVector}")
    // Score the test set
    val predictions = model.transform(test_data)
    val test_count = test_data.count().toInt
    predictions.take(test_count).foreach(println)
    val evaluator = new MulticlassClassificationEvaluator() // .setLabelCol("label").setPredictionCol("prediction")
    val accuracy = evaluator.setMetricName("accuracy").evaluate(predictions)
    val weightedPrecision = evaluator.setMetricName("weightedPrecision").evaluate(predictions)
    val weightedRecall = evaluator.setMetricName("weightedRecall").evaluate(predictions)
    val f1 = evaluator.setMetricName("f1").evaluate(predictions)
    println(s"accuracy is $accuracy")
    println(s"weightedPrecision is $weightedPrecision")
    println(s"weightedRecall is $weightedRecall")
    println(s"f1 is $f1")
  }

  /**
    * @Author: Thinkgamer
    * @Param: spark: SparkSession, input_data: String
    * @Return: Unit
    * @Date: 2019/1/11 18:39
    * @Desc: Train and test with LR wrapped in OneVsRest from the Spark ml package
    **/
  def runBaseOneVsRest(spark: SparkSession, input_data: String): Unit = {
    // Load the libsvm-format data and split it 50/50 into train and test sets
    val split = spark.read.format("libsvm").load(input_data).randomSplit(Array(1, 1))
    val train_data = split(0)
    val test_data = split(1)
    // Base binary classifier
    val lr = new LogisticRegression().setMaxIter(10).setRegParam(0.3).setElasticNetParam(0.8)
    // Train one binary LR per class via OneVsRest
    val model = new OneVsRest().setClassifier(lr).fit(train_data)
    // Score the test set
    val predictions = model.transform(test_data)
    val test_count = test_data.count().toInt
    predictions.take(test_count).foreach(println)
    val evaluator = new MulticlassClassificationEvaluator() // .setLabelCol("label").setPredictionCol("prediction")
    val accuracy = evaluator.setMetricName("accuracy").evaluate(predictions)
    val weightedPrecision = evaluator.setMetricName("weightedPrecision").evaluate(predictions)
    val weightedRecall = evaluator.setMetricName("weightedRecall").evaluate(predictions)
    val f1 = evaluator.setMetricName("f1").evaluate(predictions)
    println(s"accuracy is $accuracy")
    println(s"weightedPrecision is $weightedPrecision")
    println(s"weightedRecall is $weightedRecall")
    println(s"f1 is $f1")
  }
}

Run output (runBaseLR):

....
[0.0,(4,[0,1,2,3],[-0.666667,-0.583333,0.186441,0.333333]),[0.142419333934195,-0.3772619583435227,0.06140018515891296],[0.3973163522554911,0.23628803028122905,0.3663956174632798],0.0]
[0.0,(4,[0,1,2,3],[-0.277778,-0.333333,0.322034,0.583333]),[0.21570018327413792,-0.5776524462730686,0.06140018515891296],[0.433024032357093,0.19586792869116354,0.3711080389517435],0.0]
[0.0,(4,[0,1,2,3],[-0.222222,-0.583333,0.355932,0.583333]),[0.21570018327413792,-0.6014274236583748,0.06140018515891296],[0.43502594980545906,0.19215033820182928,0.3728237119927117],0.0]
[0.0,(4,[0,1,2,3],[-0.166667,-0.416667,0.38983,0.5]),[0.19127333120195608,-0.5901059157567916,0.06140018515891296],[0.42808566999671693,0.19596657170058288,0.37594775830270016],0.0]
[0.0,(4,[0,1,2,3],[-0.111111,-0.166667,0.38983,0.416667]),[0.16684647912977424,-0.5550094304699024,0.06140018515891296],[0.4191514438648996,0.20364461396227818,0.3772039421728222],0.0]
[0.0,(4,[0,1,2,3],[-0.0555556,-0.833333,0.355932,0.166667]),[0.09356562978983132,-0.42594457606442027,0.06140018515891296],[0.39014369479647476,0.23206207803368195,0.37779422716984323],0.0]
[0.0,(4,[0,1,2,3],[-0.0555556,-0.166667,0.288136,0.416667]),[0.16684647912977424,-0.48368449831398386,0.06140018515891296],[0.41293451997504027,0.2154562894240759,0.371609190600884],0.0]
[0.0,(4,[0,1,2,3],[-1.32455E-7,-0.5,0.559322,0.0833333]),[0.06913857253127131,-0.5334990630140029,0.06140018515891296],[0.3937538891012765,0.21552748276152253,0.39071862813720093],0.0]
[0.0,(4,[0,1,2,3],[-1.32455E-7,-0.166667,0.322034,0.416667]),[0.16684647912977424,-0.50745947569929,0.06140018515891296],[0.41503545219767224,0.21146468038709856,0.37349986741522917],0.0]
[0.0,(4,[0,1,2,3],[0.166667,-0.416667,0.457627,0.5]),[0.19127333120195608,-0.6376565718955491,0.06140018515891296],[0.43201713946035775,0.18858245891042616,0.3794004016292162],0.0]
[0.0,(4,[0,1,2,3],[0.166667,-0.333333,0.559322,0.666667]),[0.24012732846971716,-0.7791755971528995,0.06140018515891296],[0.4551287523317569,0.1642315669435602,0.3806396807246828],0.0]
[0.0,(4,[0,1,2,3],[0.166667,-0.333333,0.559322,0.75]),[0.264554180541899,-0.8142720824397889,0.06140018515891296],[0.46379029108175673,0.15768608110328425,0.3785236278149591],0.0]
[0.0,(4,[0,1,2,3],[0.166667,-0.0833334,0.525424,0.416667]),[0.16684647912977424,-0.6501107427474174,0.06140018515891296],[0.4270407983033828,0.18865544956500177,0.38430375213161533],0.0]
[0.0,(4,[0,1,2,3],[0.333333,-0.166667,0.423729,0.833333]),[0.2889810326140808,-0.7542679568173082,0.06140018515891296],[0.4653833436900921,0.16395835646095738,0.3706582998489505],0.0]
[0.0,(4,[0,1,2,3],[0.333333,-0.0833334,0.559322,0.916667]),[0.31340817780966,-0.8844654741730755,0.06140018515891296],[0.4809833925879149,0.1451777843996671,0.37383882301241805],0.0]
[0.0,(4,[0,1,2,3],[0.333333,0.0833333,0.59322,0.666667]),[0.24012732846971716,-0.8029505745382057,0.06140018515891296],[0.4568916924800109,0.16099421989016086,0.3821140876298281],0.0]
[0.0,(4,[0,1,2,3],[0.333333,0.0833333,0.59322,1.0]),[0.3378350298818419,-0.9433369368452709,0.06140018515891296],[0.49111217656402084,0.13638756839822455,0.3725002550377546],0.0]
[0.0,(4,[0,1,2,3],[0.833333,-0.166667,0.898305,0.666667]),[0.24012732846971716,-1.0169274751103967,0.06140018515891296],[0.4715146801235871,0.13414152950999628,0.3943437903664167],0.0]
[0.0,(4,[0,1,2,3],[0.888889,-0.5,1.0,0.833333]),[0.2889810326140808,-1.1584460792082392,0.06140018515891296],[0.4922149764363623,0.11575645442862792,0.3920285691350097],0.0]
[0.0,(4,[0,2,3],[0.166667,0.457627,0.833333]),[0.2889810326140808,-0.7780429342026143,0.06140018515891296],[0.4671829945263742,0.16072535926979975,0.3720916462038262],0.0]
[0.0,(4,[0,2,3],[0.388889,0.661017,0.833333]),[0.2889810326140808,-0.9206942012507418,0.06140018515891296],[0.4773834422215063,0.1424006831259297,0.38021587465256407],0.0]
[0.0,(4,[0,2,3],[0.444444,0.59322,0.833333]),[0.2889810326140808,-0.8731435451119842,0.06140018515891296],[0.47409567554725224,0.14830701795974927,0.3775973064929985],0.0]
[1.0,(4,[0,1,2,3],[-0.944444,-0.166667,-0.898305,-0.916667]),[-0.22398491276551954,0.9099937167686063,0.06140018515891296],[0.18388194844652703,0.5715046368698148,0.2446134146836582],1.0]
[1.0,(4,[0,1,2,3],[-0.888889,-0.75,-0.898305,-0.833333]),[-0.19955776756994034,0.874896810322209,0.06140018515891296],[0.19133013026597376,0.5602902452966907,0.24837962443733552],1.0]
[1.0,(4,[0,1,2,3],[-0.833333,-0.0833334,-0.830508,-0.916667]),[-0.22398491276551954,0.8624430606298488,0.06140018515891296],[0.1888951183970725,0.5598225761969752,0.2512823054059524],1.0]
[1.0,(4,[0,1,2,3],[-0.833333,0.333333,-1.0,-0.916667]),[-0.22398491276551954,0.98131935029267,0.06140018515891296],[0.17642755961698536,0.5888754069503642,0.23469703343265047],1.0]
[1.0,(4,[0,1,2,3],[-0.722222,-0.166667,-0.864407,-0.833333]),[-0.19955776756994034,0.8511218329369028,0.06140018515891296],[0.1938823556238831,0.5544247896859419,0.2516928546901751],1.0]
[1.0,(4,[0,1,2,3],[-0.722222,-0.0833334,-0.79661,-0.916667]),[-0.22398491276551954,0.8386680832445428,0.06140018515891296],[0.1914127309671249,0.5539558485407831,0.2546314204920921],1.0]
[1.0,(4,[0,1,2,3],[-0.666667,-0.166667,-0.864407,-0.916667]),[-0.22398491276551954,0.8862187393833003,0.06140018515891296],[0.18638458287433546,0.5656728123922876,0.2479426047333771],1.0]
[1.0,(4,[0,1,2,3],[-0.666667,-0.0833334,-0.830508,-1.0]),[-0.24841176483770142,0.8975395459167381,0.06140018515891296],[0.18153429638527077,0.5710038244297028,0.2474618791850266],1.0]
[1.0,(4,[0,1,2,3],[-0.666667,-0.0833334,-0.830508,-1.0]),[-0.24841176483770142,0.8975395459167381,0.06140018515891296],[0.18153429638527077,0.5710038244297028,0.2474618791850266],1.0]
[1.0,(4,[0,1,2,3],[-0.611111,0.25,-0.898305,-0.833333]),[-0.19955776756994034,0.874896810322209,0.06140018515891296],[0.19133013026597376,0.5602902452966907,0.24837962443733552],1.0]
[1.0,(4,[0,1,2,3],[-0.611111,0.25,-0.79661,-0.583333]),[-0.12627691822999743,0.6982812997779695,0.06140018515891296],[0.2228503238816955,0.5082932631932551,0.26885641292504936],1.0]
[1.0,(4,[0,1,2,3],[-0.555556,0.0833333,-0.762712,-0.666667]),[-0.15070406342557666,0.7096032288390605,0.06140018515891296],[0.21738356621323757,0.5138701832055212,0.26874625058124135],1.0]
[1.0,(4,[0,1,2,3],[-0.555556,0.416667,-0.830508,-0.75]),[-0.1751309154977585,0.7922496688965621,0.06140018515891296],[0.20416952624793167,0.5371789311660708,0.25865154258599743],1.0]
[1.0,(4,[0,1,2,3],[-0.555556,0.5,-0.830508,-0.833333]),[-0.19955776756994034,0.8273461541834515,0.06140018515891296],[0.19644126969897416,0.5485439622454286,0.2550147680555974],1.0]
[1.0,(4,[0,1,2,3],[-0.555556,0.5,-0.79661,-0.916667]),[-0.22398491276551954,0.8386680832445428,0.06140018515891296],[0.1914127309671249,0.5539558485407831,0.2546314204920921],1.0]
[1.0,(4,[0,1,2,3],[-0.555556,0.5,-0.694915,-0.75]),[-0.1751309154977585,0.6971490579871922,0.06140018515891296],[0.2146288394002106,0.5134692689977406,0.27190189160204886],1.0]
[1.0,(4,[0,1,2,3],[-0.388889,0.583333,-0.898305,-0.75]),[-0.1751309154977585,0.8398003250353197,0.06140018515891296],[0.19896456348841018,0.5489777851472337,0.25205765136435615],1.0]
[1.0,(4,[0,1,2,3],[-0.388889,0.583333,-0.762712,-0.75]),[-0.1751309154977585,0.7446997141259497,0.06140018515891296],[0.20939284449512424,0.5253384681035773,0.2652686874012984],1.0]
[1.0,(4,[0,1,2,3],[-0.222222,1.0,-0.830508,-0.75]),[-0.1751309154977585,0.7922496688965621,0.06140018515891296],[0.20416952624793167,0.5371789311660708,0.25865154258599743],1.0]
[1.0,(4,[0,1,2,3],[-0.166667,0.666667,-0.932203,-0.916667]),[-0.22398491276551954,0.9337686941539125,0.06140018515891296],[0.18138780824104053,0.577316667480086,0.24129552427887344],1.0]
[1.0,(4,[0,2,3],[-0.944444,-0.898305,-0.916667]),[-0.22398491276551954,0.9099937167686063,0.06140018515891296],[0.18388194844652703,0.5715046368698148,0.2446134146836582],1.0]
[2.0,(4,[0,1,2,3],[-0.5,-0.416667,-0.0169491,0.0833333]),[0.06913857253127131,-0.1293208704862573,0.06140018515891296],[0.35558280067351816,0.2915754168987629,0.35284178242771896],0.0]
[2.0,(4,[0,1,2,3],[-0.388889,-0.166667,0.186441,0.166667]),[0.09356562978983132,-0.3070689877697442,0.06140018515891296],[0.37904040796475613,0.25391719226536036,0.36704239976988356],0.0]
[2.0,(4,[0,1,2,3],[-0.333333,-0.666667,-0.0508475,-0.166667]),[-0.004142364745690817,-2.5560918566472357E-4,0.06140018515891296],[0.3255597810403734,0.32682761461296417,0.3476126043466625],2.0]
[2.0,(4,[0,1,2,3],[-0.333333,-0.5,0.152542,-0.0833333]),[0.020284692512869188,-0.1780033056482645,0.06140018515891296],[0.34939306186517066,0.2865489973806377,0.36405794075419173],2.0]
[2.0,(4,[0,1,2,3],[-0.277778,-0.583333,-0.0169491,-0.166667]),[-0.004142364745690817,-0.024030867118228966,0.06140018515891296],[0.32807902335251327,0.3216184811284419,0.35030249551904485],2.0]
[2.0,(4,[0,1,2,3],[-0.277778,-0.416667,0.0847457,-4.03573E-8]),[0.044711620692401366,-0.16554973510508966,0.06140018515891296],[0.35370672025255956,0.2866341718375481,0.3596591079098922],2.0]
[2.0,(4,[0,1,2,3],[-0.277778,-0.25,-0.118644,-4.03573E-8]),[0.044711620692401366,-0.022898678467405786,0.06140018515891296],[0.338816143919598,0.31666591200350974,0.3445179440768922],2.0]
[2.0,(4,[0,1,2,3],[-0.277778,-0.166667,0.0508474,-4.03573E-8]),[0.044711620692401366,-0.14177454730933994,0.06140018515891296],[0.3512841109953878,0.29152015946544035,0.3571957295391717],2.0]
[2.0,(4,[0,1,2,3],[-0.277778,-0.166667,0.186441,0.166667]),[0.09356562978983132,-0.3070689877697442,0.06140018515891296],[0.37904040796475613,0.25391719226536036,0.36704239976988356],0.0]
[2.0,(4,[0,1,2,3],[-0.222222,-0.333333,0.0508474,-4.03573E-8]),[0.044711620692401366,-0.14177454730933994,0.06140018515891296],[0.3512841109953878,0.29152015946544035,0.3571957295391717],2.0]
[2.0,(4,[0,1,2,3],[-0.222222,-0.333333,0.186441,-4.03573E-8]),[0.044711620692401366,-0.23687557903959694,0.06140018515891296],[0.36082667720169664,0.27227443905702464,0.3668988837412787],2.0]
[2.0,(4,[0,1,2,3],[-0.222222,-0.166667,0.0847457,-0.0833333]),[0.020284692512869188,-0.13045314046720857,0.06140018515891296],[0.34458452452119065,0.2963678981077479,0.3590475773710615],2.0]
[2.0,(4,[0,1,2,3],[-0.166667,-0.416667,0.0508474,-0.25]),[-0.028569216817872667,-0.03648468728602465,0.06140018515891296],[0.3240171302662436,0.3214625061047474,0.35452036362900907],2.0]
[2.0,(4,[0,1,2,3],[-0.111111,-0.166667,0.0847457,0.166667]),[0.09356562978983132,-0.23574314383523692,0.06140018515891296],[0.3720560314882243,0.26766486428762953,0.36027910422414605],0.0]
[2.0,(4,[0,1,2,3],[-1.32455E-7,-0.333333,0.0169491,-4.03573E-8]),[0.044711620692401366,-0.11799935951359021,0.06140018515891296],[0.34883736720915237,0.2964548223017103,0.35470781048913735],2.0]
[2.0,(4,[0,1,2,3],[0.0555554,-0.25,0.118644,-4.03573E-8]),[0.044711620692401366,-0.1893249229008394,0.06140018515891296],[0.356104875824879,0.2817975031005548,0.3620976210745662],2.0]
[2.0,(4,[0,1,2,3],[0.111111,-0.583333,0.322034,0.166667]),[0.09356562978983132,-0.40216959867911406,0.06140018515891296],[0.38797746914456277,0.23632596041298645,0.3756965704424508],0.0]
[2.0,(4,[0,1,2,3],[0.277778,-0.25,0.220339,-4.03573E-8]),[0.044711620692401366,-0.2606505564249031,0.06140018515891296],[0.3631497363916856,0.2675892268514053,0.3692610367569091],2.0]
[2.0,(4,[0,1,2,3],[0.333333,-0.0833334,0.254237,0.166667]),[0.09356562978983132,-0.35461894254035653,0.06140018515891296],[0.38356307461311073,0.2450150178255154,0.37142190756137383],0.0]
[2.0,(4,[0,1,2,3],[0.388889,-0.333333,0.288136,0.0833333]),[0.06913857253127131,-0.34329784119526296,0.06140018515891296],[0.3767433655267083,0.2494174038961269,0.37383923057716484],0.0]
[2.0,(4,[0,2,3],[-0.111111,0.288136,0.416667]),[0.16684647912977424,-0.48368449831398386,0.06140018515891296],[0.41293451997504027,0.2154562894240759,0.371609190600884],0.0]
[2.0,(4,[0,2,3],[0.5,0.254237,0.0833333]),[0.06913857253127131,-0.31952216244181164,0.06140018515891296],[0.37449596811111574,0.25389487461233784,0.3716091572765465],0.0]
...
accuracy is 0.8615384615384616
weightedPrecision is 0.9017369727047146
weightedRecall is 0.8615384615384616
f1 is 0.8554924320962056
...

Run output (runBaseOneVsRest):

[0.0,(4,[0,1,2,3],[-0.666667,-0.583333,0.186441,0.333333]),0.0]
[0.0,(4,[0,1,2,3],[-0.222222,-0.583333,0.355932,0.583333]),0.0]
[0.0,(4,[0,1,2,3],[-0.111111,-0.166667,0.38983,0.416667]),0.0]
[0.0,(4,[0,1,2,3],[-0.0555556,-0.166667,0.288136,0.416667]),0.0]
[0.0,(4,[0,1,2,3],[0.0555554,-0.333333,0.288136,0.416667]),0.0]
[0.0,(4,[0,1,2,3],[0.0555554,0.166667,0.491525,0.833333]),0.0]
[0.0,(4,[0,1,2,3],[0.111111,-0.583333,0.355932,0.5]),0.0]
[0.0,(4,[0,1,2,3],[0.111111,-0.25,0.559322,0.416667]),0.0]
[0.0,(4,[0,1,2,3],[0.111111,0.0833333,0.694915,1.0]),0.0]
[0.0,(4,[0,1,2,3],[0.166667,-0.416667,0.457627,0.5]),0.0]
[0.0,(4,[0,1,2,3],[0.166667,-0.333333,0.559322,0.666667]),0.0]
[0.0,(4,[0,1,2,3],[0.166667,-0.333333,0.559322,0.75]),0.0]
[0.0,(4,[0,1,2,3],[0.166667,-0.0833334,0.525424,0.416667]),0.0]
[0.0,(4,[0,1,2,3],[0.222222,-0.166667,0.423729,0.583333]),0.0]
[0.0,(4,[0,1,2,3],[0.222222,-0.166667,0.525424,0.416667]),0.0]
[0.0,(4,[0,1,2,3],[0.333333,-0.166667,0.423729,0.833333]),0.0]
[0.0,(4,[0,1,2,3],[0.333333,-0.0833334,0.559322,0.916667]),0.0]
[0.0,(4,[0,1,2,3],[0.333333,0.0833333,0.59322,0.666667]),0.0]
[0.0,(4,[0,1,2,3],[0.444444,-0.0833334,0.38983,0.833333]),0.0]
[0.0,(4,[0,1,2,3],[0.611111,-0.166667,0.627119,0.25]),0.0]
[0.0,(4,[0,1,2,3],[0.611111,0.333333,0.728813,1.0]),0.0]
[0.0,(4,[0,1,2,3],[0.833333,-0.166667,0.898305,0.666667]),0.0]
[0.0,(4,[0,1,2,3],[0.888889,-0.5,1.0,0.833333]),0.0]
[0.0,(4,[0,1,2,3],[0.888889,-0.333333,0.932203,0.583333]),0.0]
[0.0,(4,[0,1,2,3],[0.888889,0.5,0.932203,0.75]),0.0]
[0.0,(4,[0,1,2,3],[1.0,0.5,0.830508,0.583333]),0.0]
[0.0,(4,[0,2,3],[0.611111,0.694915,0.416667]),0.0]
[1.0,(4,[0,1,2,3],[-0.944444,-0.25,-0.864407,-0.916667]),1.0]
[1.0,(4,[0,1,2,3],[-0.833333,-0.0833334,-0.830508,-0.916667]),1.0]
[1.0,(4,[0,1,2,3],[-0.833333,0.166667,-0.864407,-0.833333]),1.0]
[1.0,(4,[0,1,2,3],[-0.722222,-0.166667,-0.864407,-1.0]),1.0]
[1.0,(4,[0,1,2,3],[-0.722222,-0.166667,-0.864407,-0.833333]),1.0]
[1.0,(4,[0,1,2,3],[-0.722222,-0.0833334,-0.79661,-0.916667]),1.0]
[1.0,(4,[0,1,2,3],[-0.722222,0.166667,-0.79661,-0.916667]),1.0]
[1.0,(4,[0,1,2,3],[-0.722222,0.166667,-0.694915,-0.916667]),1.0]
[1.0,(4,[0,1,2,3],[-0.666667,-0.0833334,-0.830508,-1.0]),1.0]
[1.0,(4,[0,1,2,3],[-0.611111,-0.166667,-0.79661,-0.916667]),1.0]
[1.0,(4,[0,1,2,3],[-0.611111,0.0833333,-0.864407,-0.916667]),1.0]
[1.0,(4,[0,1,2,3],[-0.611111,0.333333,-0.864407,-0.916667]),1.0]
[1.0,(4,[0,1,2,3],[-0.555556,0.166667,-0.830508,-0.916667]),1.0]
[1.0,(4,[0,1,2,3],[-0.555556,0.5,-0.830508,-0.833333]),1.0]
[1.0,(4,[0,1,2,3],[-0.555556,0.5,-0.694915,-0.75]),1.0]
[1.0,(4,[0,1,2,3],[-0.5,0.166667,-0.864407,-0.916667]),1.0]
[1.0,(4,[0,1,2,3],[-0.5,0.25,-0.830508,-0.916667]),1.0]
[1.0,(4,[0,1,2,3],[-0.444444,0.416667,-0.830508,-0.916667]),1.0]
[1.0,(4,[0,1,2,3],[-0.388889,0.166667,-0.830508,-0.75]),1.0]
[1.0,(4,[0,1,2,3],[-0.388889,0.583333,-0.898305,-0.75]),1.0]
[1.0,(4,[0,1,2,3],[-0.222222,0.5,-0.762712,-0.833333]),1.0]
[1.0,(4,[0,1,2,3],[-0.222222,1.0,-0.830508,-0.75]),1.0]
[1.0,(4,[0,1,2,3],[-0.166667,0.666667,-0.932203,-0.916667]),1.0]
[1.0,(4,[0,2,3],[-0.777778,-0.79661,-0.916667]),1.0]
[2.0,(4,[0,1,2,3],[-0.611111,-1.0,-0.152542,-0.25]),1.0]
[2.0,(4,[0,1,2,3],[-0.611111,-0.75,-0.220339,-0.25]),1.0]
[2.0,(4,[0,1,2,3],[-0.555556,-0.583333,-0.322034,-0.166667]),1.0]
[2.0,(4,[0,1,2,3],[-0.333333,-0.75,0.0169491,-4.03573E-8]),2.0]
[2.0,(4,[0,1,2,3],[-0.333333,-0.5,0.152542,-0.0833333]),2.0]
[2.0,(4,[0,1,2,3],[-0.277778,-0.583333,-0.0169491,-0.166667]),2.0]
[2.0,(4,[0,1,2,3],[-0.277778,-0.416667,0.0847457,-4.03573E-8]),2.0]
[2.0,(4,[0,1,2,3],[-0.277778,-0.166667,0.0508474,-4.03573E-8]),2.0]
[2.0,(4,[0,1,2,3],[-0.222222,-0.5,-0.152542,-0.25]),1.0]
[2.0,(4,[0,1,2,3],[-0.222222,-0.333333,0.0508474,-4.03573E-8]),2.0]
[2.0,(4,[0,1,2,3],[-0.222222,-0.25,0.0847457,-4.03573E-8]),2.0]
[2.0,(4,[0,1,2,3],[-0.166667,-0.416667,0.0508474,-0.25]),2.0]
[2.0,(4,[0,1,2,3],[-0.0555556,-0.833333,0.0169491,-0.25]),2.0]
[2.0,(4,[0,1,2,3],[-0.0555556,-0.416667,0.38983,0.25]),0.0]
[2.0,(4,[0,1,2,3],[-1.32455E-7,-0.333333,0.0169491,-4.03573E-8]),2.0]
[2.0,(4,[0,1,2,3],[-1.32455E-7,-0.333333,0.254237,-0.0833333]),2.0]
[2.0,(4,[0,1,2,3],[-1.32455E-7,-0.25,0.254237,0.0833333]),0.0]
[2.0,(4,[0,1,2,3],[0.111111,-0.583333,0.322034,0.166667]),0.0]
[2.0,(4,[0,1,2,3],[0.277778,-0.25,0.220339,-4.03573E-8]),0.0]
[2.0,(4,[0,1,2,3],[0.333333,-0.166667,0.355932,0.333333]),0.0]
[2.0,(4,[0,1,2,3],[0.333333,-0.0833334,0.152542,0.0833333]),0.0]
[2.0,(4,[0,1,2,3],[0.333333,-0.0833334,0.254237,0.166667]),0.0]
[2.0,(4,[0,1,2,3],[0.444444,-0.0833334,0.322034,0.166667]),0.0]
[2.0,(4,[0,2,3],[-0.111111,0.288136,0.416667]),0.0]
[2.0,(4,[0,2,3],[0.166667,0.186441,0.166667]),0.0]
[2.0,(4,[0,2,3],[0.5,0.254237,0.0833333]),0.0]
...
accuracy is 0.8051948051948052
weightedPrecision is 0.8539693389317449
weightedRecall is 0.8051948051948052
f1 is 0.7797931797931799

