hadoop fs -ls /
Plain create: hadoop fs -mkdir /xiaolin
Recursive create (also creates missing parent directories): hadoop fs -mkdir -p /xiaolin/xiaoyin
touch xuan.txt
hadoop fs -moveFromLocal xuan.txt /xiaolin
hadoop fs -copyFromLocal xuan.txt /
hadoop fs -put xuan.txt /
The -f flag overwrites the destination if it already exists:
hadoop fs -put -f xuan.txt /
hadoop fs -copyToLocal /xiaolin ./
hadoop fs -get /xiaolin ./
hadoop fs -mkdir /xiaona
hadoop fs -cp /xiaolin/xuan.txt /xiaona/
vim zero.txt
hadoop fs -appendToFile zero.txt /xiaolin/xuan.txt
hadoop fs -cat /xiaolin/xuan.txt
hadoop fs -chmod 777 /xiaolin/xuan.txt
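`hadoop fs -chmod` takes the same octal modes as POSIX chmod, so the meaning of `777` can be checked locally. A small sketch with a hypothetical local file (`demo.txt`), using GNU coreutils `stat`:

```shell
# Each octal digit covers owner/group/other; 7 = read(4)+write(2)+execute(1),
# so 777 grants full access to everyone -- same semantics on HDFS.
touch demo.txt            # demo.txt is a hypothetical local file
chmod 777 demo.txt
stat -c '%a' demo.txt     # prints the octal mode: 777
```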
hadoop fs -mv /xiaolin/xuan.txt /xiaolin/xiaoyin
hadoop fs -getmerge /xiaolin/* ./
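`-getmerge` concatenates every file matching the HDFS glob into a single local file. The same effect with plain local files (hypothetical names `part-0`, `part-1`):

```shell
# Local equivalent of: hadoop fs -getmerge /xiaolin/* merged.txt
printf 'line from a\n' > part-0
printf 'line from b\n' > part-1
cat part-0 part-1 > merged.txt   # merged.txt now holds both inputs in order
wc -l < merged.txt               # prints 2
```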
hadoop fs -rm /xiaolin/xiaoyin
hadoop fs -setrep 5 /xiaolin/xuan.txt
Usage: hdfs dfs -du [-s] [-h] URI [URI …]
The -s option displays an aggregate summary of file lengths rather than the individual files.
The -h option formats file sizes in a "human-readable" way (e.g. 64.0m instead of 67108864).
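The 64.0m vs 67108864 conversion is the usual binary-prefix scaling; GNU coreutils `numfmt` performs the same conversion locally:

```shell
# 67108864 bytes = 64 * 1024 * 1024, i.e. 64M in IEC (binary) units --
# the same scaling hdfs dfs -du -h applies.
numfmt --to=iec 67108864   # prints 64M
```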
Usage: hdfs dfs -touchz URI [URI …]
Usage: hdfs dfs -stat [format] URI [URI …]
Format specifiers: file size in bytes (%b), filename (%n), block size (%o), replication count (%r), modification time (%y, %Y)
Usage: hdfs dfs -tail [-f] URI
Usage: hdfs dfs -count [-q] [-h] <paths>
Usage: hdfs dfs -cat <srcpath> | grep <filter pattern>
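The `-cat | grep` pattern streams file contents and keeps only matching lines. The same pipeline with a hypothetical local file (`log.txt`) standing in for an HDFS path:

```shell
# Local equivalent of: hdfs dfs -cat /path/log.txt | grep ERROR
printf 'INFO start\nERROR disk full\nINFO done\n' > log.txt
cat log.txt | grep ERROR    # prints only: ERROR disk full
```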