
Hadoop local MapReduce run fails with: Caused by: java.io.FileNotFoundException

Reposted from: https://blog.csdn.net/qq_41826265/article/details/108336319?utm_medium=distribute.pc_relevant.none-task-blog-baidujs_title-0&spm=1001.2101.3001.4242

java.lang.Exception: org.apache.hadoop.mapreduce.task.reduce.Shuffle$ShuffleError: error in shuffle in localfetcher#1
	at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
	at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: org.apache.hadoop.mapreduce.task.reduce.Shuffle$ShuffleError: error in shuffle in localfetcher#1
	at org.apache.hadoop.mapreduce.task.reduce.Shuffle.run(Shuffle.java:134)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:376)
	at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.FileNotFoundException: E:/bigdata/hdfs_temp/mapred/local/localRunner/Ruby%20Wang/jobcache/job_local1561914882_0001/attempt_local1561914882_0001_m_000000_0/output/file.out.index
	at org.apache.hadoop.fs.RawLocalFileSystem.open(RawLocalFileSystem.java:200)
	at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:767)
	at org.apache.hadoop.io.SecureIOUtils.openFSDataInputStream(SecureIOUtils.java:156)
	at org.apache.hadoop.mapred.SpillRecord.<init>(SpillRecord.java:71)
	at org.apache.hadoop.mapred.SpillRecord.<init>(SpillRecord.java:62)
	at org.apache.hadoop.mapred.SpillRecord.<init>(SpillRecord.java:57)
	at org.apache.hadoop.mapreduce.task.reduce.LocalFetcher.copyMapOutput(LocalFetcher.java:124)
	at org.apache.hadoop.mapreduce.task.reduce.LocalFetcher.doCopy(LocalFetcher.java:102)
	at org.apache.hadoop.mapreduce.task.reduce.LocalFetcher.run(LocalFetcher.java:85)

The real cause of this failure is visible in the last "Caused by" of the stack trace:

Caused by: java.io.FileNotFoundException: E:/bigdata/hdfs_temp/mapred/local/localRunner/Ruby%20Wang/jobcache/job_local1561914882_0001/attempt_local1561914882_0001_m_000000_0/output/file.out.index 

It is a file-not-found exception. According to the analysis in the referenced post, there are two possible causes: either the path is wrong, or the file really does not exist. Since this particular file is created automatically by the job itself, it must be a path problem.

As is well known, Hadoop paths must not contain spaces, yet this automatically generated path does: the space in the Windows user name "Ruby Wang" ends up as Ruby%20Wang in the jobcache path. The path therefore has to be changed.
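A quick way to confirm that the space really comes from the OS account name is to print the user that Hadoop resolves. This is only a diagnostic sketch; `user.name` is a standard JVM property and `UserGroupInformation` is Hadoop's own user API:

```java
import org.apache.hadoop.security.UserGroupInformation;

public class WhoAmI {
    public static void main(String[] args) throws Exception {
        // LocalJobRunner builds its jobcache path from the current Hadoop user,
        // which on Windows defaults to the OS account name ("Ruby Wang" here).
        System.out.println("JVM user.name     = " + System.getProperty("user.name"));
        System.out.println("Hadoop short user = "
                + UserGroupInformation.getCurrentUser().getShortUserName());
    }
}
```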

Reference solution, from the post "Hadoop local run fails: Caused by: java.io.FileNotFoundException: D:/tmp/hadoop-win2010/mapred/local/".

The concrete steps are shown as screenshots in the original post, which are not reproduced here.
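As a substitute, the sketch below shows one way to keep the space out of that path from the driver side. It is only an illustration, assuming local mode with simple authentication; `HADOOP_USER_NAME`, `hadoop.tmp.dir` and `mapreduce.cluster.local.dir` are existing Hadoop settings, while the user name and directory values are placeholders:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class LocalJobDriver {
    public static void main(String[] args) throws Exception {
        // Under simple authentication, HADOOP_USER_NAME overrides the OS account name,
        // so "Ruby Wang" (-> Ruby%20Wang) no longer appears in the localRunner jobcache
        // path. It must be set before the first Hadoop class resolves the login user.
        System.setProperty("HADOOP_USER_NAME", "hadoopuser"); // placeholder, no spaces

        Configuration conf = new Configuration();
        // Keep all local working directories on paths without spaces (placeholder values).
        conf.set("hadoop.tmp.dir", "E:/bigdata/hdfs_temp");
        conf.set("mapreduce.cluster.local.dir", "E:/bigdata/hdfs_temp/mapred/local");

        Job job = Job.getInstance(conf, "local-mr-job");
        // ... configure the mapper, reducer, input and output paths as in the original job ...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Alternatively, switching to a Windows account whose name has no space, or setting the HADOOP_USER_NAME environment variable before launching the IDE, should achieve the same effect without touching the driver code.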
