After writing a MapReduce job with Python's MrJob, running it produced the error shown below. Investigation showed that Hadoop had only just been started and was still in safe mode, which is why the MapReduce job failed. After waiting a moment for Hadoop to finish starting up normally, the same job ran fine.

If you still get the same error, you can leave safe mode manually: bin/hadoop dfsadmin -safemode leave.

Common dfsadmin -safemode arguments: leave (leave safe mode), enter (enter safe mode), get (report whether safe mode is currently on), wait (block until safe mode ends).
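The `get` status can also be checked from Python before submitting a job. A minimal sketch, assuming the hadoop/hdfs binaries are on $PATH (the helper names are mine, not an mrjob or Hadoop API; on Hadoop 2.x, `hdfs dfsadmin` is the preferred spelling of the deprecated `hadoop dfsadmin`):

```python
import subprocess

def parse_safemode(output):
    # `dfsadmin -safemode get` prints e.g. "Safe mode is ON" or "Safe mode is OFF"
    return output.strip().endswith("ON")

def in_safe_mode():
    # stdout=PIPE / universal_newlines=True instead of capture_output/text,
    # so this also works on the Python 3.6 seen in the traceback below.
    proc = subprocess.run(
        ["hdfs", "dfsadmin", "-safemode", "get"],
        stdout=subprocess.PIPE, universal_newlines=True, check=True,
    )
    return parse_safemode(proc.stdout)
```

A launcher script could poll `in_safe_mode()` and only submit the job once it returns False.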
# python /home/test/hadoop/httpflow.py -r hadoop --jobconf mapreduce.job.priority=VERY_HIGH -o hdfs://data/hadoop/output/httpflow hdfs:///data/hadoop/data/tmp20180802
No configs found; falling back on auto-configuration
No configs specified for hadoop runner
Looking for hadoop binary in $PATH...
Found hadoop binary: /usr/local/soft/hadoop-2.6.5/bin/hadoop
Using Hadoop version 2.6.5
Looking for Hadoop streaming jar in /usr/local/soft/hadoop-2.6.5...
Found Hadoop streaming jar: /usr/local/soft/hadoop-2.6.5/share/hadoop/tools/lib/hadoop-streaming-2.6.5.jar
Creating temp directory /tmp/httpflow.root.20180802.023651.138363
STDERR: mkdir: Cannot create directory /user/root/tmp/mrjob/httpflow.root.20180802.023651.138363/files. Name node is in safe mode.
Traceback (most recent call last):
  File "/root/.virtualenvs/env3/lib/python3.6/site-packages/mrjob/fs/hadoop.py", line 286, in mkdir
    self.invoke_hadoop(args, ok_stderr=[_HADOOP_FILE_EXISTS_RE])
  File "/root/.virtualenvs/env3/lib/python3.6/site-packages/mrjob/fs/hadoop.py", line 173, in invoke_hadoop
    raise CalledProcessError(proc.returncode, args)
subprocess.CalledProcessError: Command '['/usr/local/soft/hadoop-2.6.5/bin/hadoop', 'fs', '-mkdir', '-p', 'hdfs:///user/root/tmp/mrjob/httpflow.root.20180802.023651.138363/files/']' returned non-zero exit status 1.

During handling of the above exception, another exception occurred:
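Because mrjob surfaces this safe-mode failure as a plain subprocess.CalledProcessError (raised in invoke_hadoop in the traceback above), a launcher can also wait and retry instead of failing outright. A sketch under the assumption that you wrap job submission in a run_job callable of your own (this is not an mrjob API):

```python
import subprocess
import time

def run_with_safemode_retry(run_job, retries=5, delay=30, sleep=time.sleep):
    """Call run_job(); on CalledProcessError (e.g. `hadoop fs -mkdir`
    refused while the NameNode is in safe mode), wait and try again."""
    for attempt in range(retries):
        try:
            return run_job()
        except subprocess.CalledProcessError:
            if attempt == retries - 1:
                raise  # still failing after all retries: give up
            sleep(delay)
```

With the default 5 tries and 30-second delay this tolerates a couple of minutes of safe mode; a freshly restarted NameNode normally leaves safe mode on its own once enough blocks are reported.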