First, the error:
    14/06/27 23:37:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    14/06/27 23:37:32 ERROR security.UserGroupInformation: PriviledgedActionException as:Administrator cause:java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Administrator\mapred\staging\Administrator-519341271\.staging to 0700
    Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Administrator\mapred\staging\Administrator-519341271\.staging to 0700
        at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:682)
        at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:655)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
        at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
        at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
        at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
        at com.hadoop.learn.test.WordCountTest.main(WordCountTest.java:85)
If this is the first MapReduce program you have ever run, this error is baffling. After a lot of searching and asking around, I learned that it only happens when the job is submitted from Windows (my cluster runs in Linux virtual machines); the same job submitted from Linux runs fine. The reason is visible in the stack trace: JobClient stages job files on the local file system and tries to set the staging directory's permissions to 0700, the permission call fails on Windows, and FileUtil.checkReturnValue turns that failure into the IOException above. I tried quite a few workarounds; the one below is the simplest and the one that actually worked for me, so I am sharing it here.
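As a side note (my own illustration, not from the original post): when Hadoop 1.x cannot load its native library, FileUtil.setPermission falls back to the java.io.File permission API, and on Windows/NTFS some of those calls simply return false. The small probe below (the class name is made up) exercises the same kind of calls; on Windows you will typically see false for at least setReadable(false), which is exactly the return value checkReturnValue rejects:

    import java.io.File;
    import java.io.IOException;

    public class PermissionProbe {
        public static void main(String[] args) throws IOException {
            File f = File.createTempFile("perm-probe", ".tmp");
            try {
                // Setting 0700 first revokes read/write/execute for group and
                // others; these are the boolean results that
                // FileUtil.checkReturnValue() inspects in Hadoop 1.x.
                System.out.println("setReadable(false):   " + f.setReadable(false, false));
                System.out.println("setWritable(false):   " + f.setWritable(false, false));
                System.out.println("setExecutable(false): " + f.setExecutable(false, false));
            } finally {
                f.setWritable(true, false); // so the delete below works on Windows
                f.delete();
            }
        }
    }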
Note: all of the edits below are made in the local Hadoop directory that Eclipse points at, i.e. the folder you created on Windows (containing a copy of the Hadoop files installed on Linux) and configured as the Hadoop installation directory under Window > Preferences > Hadoop Map/Reduce.
The fix is to edit hadoop-1.0.2/src/core/org/apache/hadoop/fs/FileUtil.java and comment out the body of checkReturnValue:
    ......
    private static void checkReturnValue(boolean rv, File p,
                                         FsPermission permission
                                         ) throws IOException {
        /**
        if (!rv) {
            throw new IOException("Failed to set permissions of path: " + p +
                                  " to " +
                                  String.format("%04o", permission.toShort()));
        }
        **/
    }
    ......
Recompile and repackage hadoop-core-1.0.2.jar, then use it to replace the hadoop-core-1.0.2.jar in the hadoop-1.0.2 root directory.
A pre-patched copy, hadoop-core-1.0.2-modified.jar, is provided here; just replace the original hadoop-core-1.0.2.jar with it.
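If you are unsure whether Eclipse is actually loading the patched jar, a quick check (my own addition, not part of the original fix) is to print where the JVM loaded FileUtil from, for example at the top of your driver's main method:

    // Prints the jar (or class folder) that FileUtil was loaded from, so you
    // can confirm the modified hadoop-core jar is the one on the classpath.
    System.out.println(org.apache.hadoop.fs.FileUtil.class
            .getProtectionDomain().getCodeSource().getLocation());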
After replacing the jar, refresh the project and make sure the jar dependencies are set up correctly; now WordCountTest runs.
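The source of WordCountTest is not shown in this post; for reference, a minimal word-count driver along these lines might look like the sketch below (the package name matches the stack trace, but the class body, the mapper/reducer names, and the argument-based input/output paths are my assumptions, not the author's actual code):

    package com.hadoop.learn.test;

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCountTest {

        // Emits (word, 1) for every whitespace-separated token in a line.
        public static class TokenizerMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                for (String token : value.toString().split("\\s+")) {
                    if (!token.isEmpty()) {
                        word.set(token);
                        context.write(word, ONE);
                    }
                }
            }
        }

        // Sums the counts for each word; also usable as a combiner.
        public static class IntSumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values,
                                  Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = new Job(conf, "word count test"); // Hadoop 1.x Job constructor
            job.setJarByClass(WordCountTest.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            // Input and output paths are placeholders taken from the command line.
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Run it with an input path and an output path as arguments; the output path must not already exist on HDFS.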
Once the job succeeds, refresh the HDFS view in Eclipse and you can see that the ouput_tms directory has been created.
The console prints the job's progress and counter output, exactly the same as when running the jar under Linux.
Done. My first hand-typed MapReduce program finally runs. Onward!