Spark 3.0.1 with Scala 2.12: foreachPartition fails with: error: value foreach is not a member of Object
Code:
df.foreachPartition(partition=>partition.foreach(println))
Error:
value foreach is not a member of Object
df.foreachPartition(partition=>partition.foreach(println))
This message alone does not tell us much, because it is not the real underlying error. To isolate it, simplify the lambda body:
df.foreachPartition(partition=>println(1))
Error:
ambiguous reference to overloaded definition,
both method foreachPartition in class Dataset of type (func: org.apache.spark.api.java.function.ForeachPartitionFunction[org.apache.spark.sql.Row])Unit
and method foreachPartition in class Dataset of type (f: Iterator[org.apache.spark.sql.Row] => Unit)Unit
match argument types (Object => Unit)
df.foreachPartition(partition=>println(1))
这里说模糊引用了重载定义的方法foreachPartition,那我们就显示的指定foreachPartition里面的函数即可
将代码改为:
import org.apache.spark.sql.Row

df.foreachPartition((partition: Iterator[Row]) => {
  partition.foreach(println)
})
It now compiles and runs successfully.
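
For reference, here is a minimal self-contained sketch of the fix. The local SparkSession and the small sample DataFrame built with toDF are assumptions added purely for illustration; any DataFrame reproduces the issue. It also shows an alternative workaround of dropping to df.rdd.foreachPartition, whose foreachPartition has only the single Iterator-based signature, so the overload ambiguity never arises:

import org.apache.spark.sql.{Row, SparkSession}

object ForeachPartitionExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("foreachPartition-overload")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical sample DataFrame, only for demonstration.
    val df = Seq((1, "a"), (2, "b"), (3, "c")).toDF("id", "value")

    // Annotating the parameter as Iterator[Row] selects the Scala overload
    // (f: Iterator[Row] => Unit) instead of the Java ForeachPartitionFunction one.
    df.foreachPartition((partition: Iterator[Row]) => {
      partition.foreach(println)
    })

    // Alternative: use the RDD API, whose foreachPartition is not overloaded,
    // so the parameter type is inferred without ambiguity.
    df.rdd.foreachPartition(partition => partition.foreach(println))

    spark.stop()
  }
}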