Scenario:
On one of the six nodes in the project cluster, a hard disk reached the end of its life. The disk had been configured as RAID 0 with no backup, so after it was replaced with a new one, starting the DataNode ran into the problem described below.
1. Before replacing the disk, stop the DataNode and stop the applications running on this node;
2. Shut down the machine and replace the disk;
3. Restart the machine and the DataNode; the following problem appears.
3.1
WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Invalid dfs.datanode.data.dir /data :
EPERM: Operation not permitted
	at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmodImpl(Native Method)
	at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod(NativeIO.java:230)
	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:729)
	at org.apache.hadoop.fs.ChecksumFileSystem$1.apply(ChecksumFileSystem.java:505)
	at org.apache.hadoop.fs.ChecksumFileSystem$FsOperation.run(ChecksumFileSystem.java:486)
	at org.apache.hadoop.fs.Ch
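
The stack trace shows the DataNode failing on a native chmod of /data (NativeIO$POSIX.chmod), which usually means the freshly mounted disk is owned by root rather than the user that runs the DataNode. Below is a minimal diagnostic sketch, not taken from the original post: it assumes the data directory is /data and that it is run as the DataNode's own user (for example hdfs), and simply reproduces the ownership/permission check that fails in the log.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.attribute.PosixFileAttributes;
import java.nio.file.attribute.PosixFilePermissions;

public class DataDirCheck {
    public static void main(String[] args) throws IOException {
        // Assumed data dir; matches dfs.datanode.data.dir in the log above.
        Path dataDir = Paths.get(args.length > 0 ? args[0] : "/data");
        PosixFileAttributes attrs =
                Files.readAttributes(dataDir, PosixFileAttributes.class);
        System.out.println("owner: " + attrs.owner().getName());
        System.out.println("group: " + attrs.group().getName());
        System.out.println("perms: " + PosixFilePermissions.toString(attrs.permissions()));
        // At startup the DataNode chmods this directory (NativeIO$POSIX.chmod in the
        // stack trace). If the JVM user is not the owner, chmod fails with EPERM,
        // which is exactly what a new disk mounted with root ownership produces.
    }
}

If the printed owner is root instead of the DataNode user, restoring ownership of /data to that user before restarting the DataNode is the usual next step.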