
Python and PySpark: the PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON environment variables


I installed PySpark recently and it installed correctly, but when I run the following simple program in Python, I get an error.

>>> from pyspark import SparkContext

>>> sc = SparkContext()

>>> data = range(1, 1000)

>>> rdd = sc.parallelize(data)

>>> rdd.collect()

While running the last line, I get an error whose key part seems to be:

[Stage 0:> (0 + 0) / 4]18/01/15 14:36:32 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1)

org.apache.spark.api.python.PythonException: Traceback (most recent call last):

File "/usr/local/lib/python3.5/dist-packages/pyspark/python/lib/pyspark.zip/pyspark/worker.py", line 123, in main

("%d.%d" % sys.version_info[:2], version))

Exception: Python in worker has different version 2.7 than that in driver 3.5, PySpark cannot run with different minor versions. Please check environment variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are correctly set.
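
The exception names the fix: the executors are launching Python 2.7 workers while the driver runs Python 3.5, so PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON must both point at the same interpreter. A minimal sketch, assuming Python 3 lives at /usr/bin/python3 on both the driver and the worker machines (a hypothetical path; adjust for your setup), with the variables set before the SparkContext is created:

import os

# Assumed interpreter path: both variables must name the same Python
# binary on the driver and on every worker.
os.environ["PYSPARK_PYTHON"] = "/usr/bin/python3"
os.environ["PYSPARK_DRIVER_PYTHON"] = "/usr/bin/python3"

from pyspark import SparkContext

sc = SparkContext()
rdd = sc.parallelize(range(1, 1000))
rdd.collect()  # succeeds once worker and driver Python versions match

Equivalently, the two variables can be exported in the shell before launching pyspark, or set in conf/spark-env.sh so the setting applies to the whole cluster rather than a single session.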
