Hadoop Series: Basic Operations on the HDFS File System

Published: 2018-05-25 09:23 | Author: 骑着龙的羊 | Source: 51Testing软件测试网 (reprint)


  The `hadoop fs` shell supports most of the familiar Linux file-management operations (reading files, creating files, moving files, deleting files, listing files, and so on).
  1. help: get usage information for each command
  [cloudera@quickstart ~]$ hadoop fs -help
  Usage: hadoop fs [generic options]
          [-appendToFile <localsrc> ... <dst>]
          [-cat [-ignoreCrc] <src> ...]
          [-checksum <src> ...]
          [-chgrp [-R] GROUP PATH...]
          [-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
          [-chown [-R] [OWNER][:[GROUP]] PATH...]
          [-copyFromLocal [-f] [-p] [-l] <localsrc> ... <dst>]
          [-copyToLocal [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
          [-count [-q] [-h] [-v] [-x] <path> ...]
          [-cp [-f] [-p | -p[topax]] <src> ... <dst>]
          [-createSnapshot <snapshotDir> [<snapshotName>]]
          [-deleteSnapshot <snapshotDir> <snapshotName>]
          [-df [-h] [<path> ...]]
          [-du [-s] [-h] [-x] <path> ...]
          [-expunge]
          [-find <path> ... <expression> ...]
          [-get [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
          [-getfacl [-R] <path>]
          [-getfattr [-R] {-n name | -d} [-e en] <path>]
          [-getmerge [-nl] <src> <localdst>]
          [-help [cmd ...]]
          [-ls [-C] [-d] [-h] [-q] [-R] [-t] [-S] [-r] [-u] [<path> ...]]
          [-mkdir [-p] <path> ...]
          [-moveFromLocal <localsrc> ... <dst>]
          [-moveToLocal <src> <localdst>]
          [-mv <src> ... <dst>]
          [-put [-f] [-p] [-l] <localsrc> ... <dst>]
          [-renameSnapshot <snapshotDir> <oldName> <newName>]
          [-rm [-f] [-r|-R] [-skipTrash] <src> ...]
          [-rmdir [--ignore-fail-on-non-empty] <dir> ...]
          [-setfacl [-R] [{-b|-k} {-m|-x <acl_spec>} <path>]|[--set <acl_spec> <path>]]
          [-setfattr {-n name [-v value] | -x name} <path>]
          [-setrep [-R] [-w] <rep> <path> ...]
          [-stat [format] <path> ...]
          [-tail [-f] <file>]
          [-test -[defsz] <path>]
          [-text [-ignoreCrc] <src> ...]
          [-touchz <path> ...]
          [-usage [cmd ...]]
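  Passing a command name to -help (or -usage, both shown in the listing above) narrows the output to that one command. A quick sketch:

```shell
# Detailed help for a single command
hadoop fs -help ls

# One-line syntax summary only
hadoop fs -usage mkdir
```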
  2. copyFromLocal: copy a local file into HDFS. The URI prefix "hdfs://quickstart.cloudera:8020" can be omitted, in which case the configured default file system is used.
  [cloudera@quickstart Downloads]$ hadoop fs -copyFromLocal file1.txt hdfs://quickstart.cloudera:8020/tmp
  [cloudera@quickstart Downloads]$ hadoop fs -copyFromLocal file2.txt /tmp
  3. copyToLocal: copy a file from HDFS to the local file system
  [cloudera@quickstart Downloads]$ hadoop fs -copyToLocal hdfs://quickstart.cloudera:8020/tmp/file1.txt file1.txt.copy
  4. ls: list the files in a directory. Columns: 1) permissions, 2) replication factor, 3) owner, 4) group, 5) size in bytes, 6) last-modified date and time, 7) file or directory path.
  [cloudera@quickstart Downloads]$ hadoop fs -ls /tmp
  Found 7 items
  drwxrwxrwx   - hdfs     supergroup          0 2018-04-26 15:40 /tmp/.cloudera_health_monitoring_canary_files
  -rw-r--r--   1 cloudera supergroup         12 2018-04-26 15:35 /tmp/file1.txt
  -rw-r--r--   1 cloudera supergroup         29 2018-04-26 15:36 /tmp/file2.txt
  drwxrwxrwt   - mapred   mapred              0 2018-04-13 21:30 /tmp/hadoop-yarn
  drwx--x--x   - hbase    supergroup          0 2018-04-12 15:36 /tmp/hbase-staging
  drwx-wx-wx   - hive     supergroup          0 2018-04-13 19:02 /tmp/hive
  drwxrwxrwt   - mapred   hadoop              0 2018-04-12 17:05 /tmp/logs
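  The -h and -R flags from the usage listing above make ls output easier to read for large trees; a sketch:

```shell
# Human-readable file sizes
hadoop fs -ls -h /tmp

# Recursive listing of a subtree
hadoop fs -ls -R /tmp/hadoop-yarn
```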
  5. cat: print a file's contents
  [cloudera@quickstart Downloads]$ hadoop fs -cat /tmp/file1.txt
  hello world
  6. mkdir: create a directory
  [cloudera@quickstart Downloads]$ hadoop fs -mkdir /tmp/test
  [cloudera@quickstart Downloads]$ hadoop fs -ls /tmp
  Found 8 items
  drwxrwxrwx   - hdfs     supergroup          0 2018-04-26 15:54 /tmp/.cloudera_health_monitoring_canary_files
  -rw-r--r--   1 cloudera supergroup         12 2018-04-26 15:35 /tmp/file1.txt
  -rw-r--r--   1 cloudera supergroup         29 2018-04-26 15:36 /tmp/file2.txt
  drwxrwxrwt   - mapred   mapred              0 2018-04-13 21:30 /tmp/hadoop-yarn
  drwx--x--x   - hbase    supergroup          0 2018-04-12 15:36 /tmp/hbase-staging
  drwx-wx-wx   - hive     supergroup          0 2018-04-13 19:02 /tmp/hive
  drwxrwxrwt   - mapred   hadoop              0 2018-04-12 17:05 /tmp/logs
  drwxr-xr-x   - cloudera supergroup          0 2018-04-26 15:54 /tmp/test
  7. rm: delete a file or directory
  [cloudera@quickstart Downloads]$ hadoop fs -rm /tmp/file1.txt      # delete a file
  18/04/26 15:55:44 INFO fs.TrashPolicyDefault: Moved: 'hdfs://quickstart.cloudera:8020/tmp/file1.txt' to trash at: hdfs://quickstart.cloudera:8020/user/cloudera/.Trash/Current/tmp/file1.txt
  [cloudera@quickstart Downloads]$ hadoop fs -rm -r /tmp/test        # delete a directory recursively
  18/04/26 15:56:01 INFO fs.TrashPolicyDefault: Moved: 'hdfs://quickstart.cloudera:8020/tmp/test' to trash at: hdfs://quickstart.cloudera:8020/user/cloudera/.Trash/Current/tmp/test
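  As the INFO lines show, deleted files are moved to the per-user trash (/user/cloudera/.Trash) rather than removed immediately when trash is enabled. The -skipTrash flag deletes permanently, and expunge empties the trash. A sketch with a hypothetical path (/tmp/old.txt); use -skipTrash with care, since there is no recovery:

```shell
# Bypass the trash and delete permanently (hypothetical file)
hadoop fs -rm -skipTrash /tmp/old.txt

# Empty the current user's trash
hadoop fs -expunge
```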
  8. put: same as copyFromLocal
  [cloudera@quickstart Downloads]$ hadoop fs -put file1.txt /tmp
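  By default put (like copyFromLocal) refuses to overwrite an existing destination file; the -f flag from the usage listing forces the overwrite. A sketch:

```shell
# Overwrite /tmp/file1.txt if it already exists
hadoop fs -put -f file1.txt /tmp
```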
  9. get: same as copyToLocal
  [cloudera@quickstart Downloads]$ hadoop fs -get hdfs://quickstart.cloudera:8020/tmp/file1.txt get1.txt
  [cloudera@quickstart Downloads]$ ls
  1901.gz  1902.gz  all  compute_max_degree.sh  file1.txt  file1.txt.copy  file2.txt  get1.txt
  10. mv: move or rename a file
  [cloudera@quickstart Downloads]$ hadoop fs -mv /tmp/file1.txt /tmp/file1_new.txt
  [cloudera@quickstart Downloads]$ hadoop fs -ls /tmp
  Found 8 items
  drwxrwxrwx   - hdfs     supergroup          0 2018-04-26 16:08 /tmp/.cloudera_health_monitoring_canary_files
  -rw-r--r--   1 cloudera supergroup         12 2018-04-26 16:05 /tmp/file1_new.txt
  -rw-r--r--   1 cloudera supergroup         29 2018-04-26 15:36 /tmp/file2.txt
  drwxrwxrwt   - mapred   mapred              0 2018-04-13 21:30 /tmp/hadoop-yarn
  drwx--x--x   - hbase    supergroup          0 2018-04-12 15:36 /tmp/hbase-staging
  drwx-wx-wx   - hive     supergroup          0 2018-04-13 19:02 /tmp/hive
  drwxrwxrwt   - mapred   hadoop              0 2018-04-12 17:05 /tmp/logs
  drwxr-xr-x   - cloudera supergroup          0 2018-04-26 16:04 /tmp/test
  11. du: show file sizes
  [cloudera@quickstart Downloads]$ hadoop fs -du /tmp/file2.txt
  29  29  /tmp/file2.txt
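  The two numbers are the file length and the total space consumed by all replicas; they are equal here because the replication factor is 1. The -s and -h flags summarize a directory in human-readable units; a sketch:

```shell
# Summarized, human-readable usage for a whole directory tree
hadoop fs -du -s -h /tmp
```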
  12. touchz: create a zero-length file
  [cloudera@quickstart Downloads]$ hadoop fs -touchz /tmp/file3.txt
  [cloudera@quickstart Downloads]$ hadoop fs -du /tmp/file3.txt
  0  0  /tmp/file3.txt
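  The related test command (the -test -[defsz] line in the usage listing) reports on a path through its exit code (-e exists, -d directory, -z zero length), which is handy in scripts. A sketch:

```shell
# Exit code 0 when the file exists and has zero length
hadoop fs -test -z /tmp/file3.txt && echo "file3.txt is empty"
```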
  13. chmod: change file permissions
  [cloudera@quickstart Downloads]$ hadoop fs -ls /tmp
  Found 1 items
  -rw-r--r--   1 cloudera supergroup          0 2018-04-26 16:12 /tmp/file3.txt
  [cloudera@quickstart Downloads]$ hadoop fs -chmod +x /tmp/file3.txt
  [cloudera@quickstart Downloads]$ hadoop fs -ls /tmp
  Found 1 items
  -rwxr-xr-x   1 cloudera supergroup          0 2018-04-26 16:12 /tmp/file3.txt
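  Besides symbolic modes like +x, chmod accepts octal modes (the OCTALMODE form in the usage listing), and -R applies a change recursively; a sketch:

```shell
# Octal form: set rw-r--r--
hadoop fs -chmod 644 /tmp/file3.txt

# Recursive change on a directory
hadoop fs -chmod -R 755 /tmp/test
```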
  14. chown: change a file's owner and group. Only the HDFS superuser may do this, which is why the first attempt below fails and the second runs as the hdfs user.
  [cloudera@quickstart Downloads]$ hadoop fs -chown -R hbase:supergroup /tmp/file3.txt
  chown: changing ownership of '/tmp/file3.txt': Non-super user cannot change owner
  [cloudera@quickstart Downloads]$ sudo -u hdfs hadoop fs -chown -R hbase:supergroup /tmp/file3.txt
  [cloudera@quickstart Downloads]$ hadoop fs -ls /tmp
  Found 9 items
  drwxrwxrwx   - hdfs     supergroup          0 2018-04-26 16:19 /tmp/.cloudera_health_monitoring_canary_files
  -rw-r--r--   1 cloudera supergroup         12 2018-04-26 16:05 /tmp/file1_new.txt
  -rw-r--r--   1 cloudera supergroup         29 2018-04-26 15:36 /tmp/file2.txt
  -rwxr-xr-x   1 hbase    supergroup          0 2018-04-26 16:12 /tmp/file3.txt
  drwxrwxrwt   - mapred   mapred              0 2018-04-13 21:30 /tmp/hadoop-yarn
  drwx--x--x   - hbase    supergroup          0 2018-04-12 15:36 /tmp/hbase-staging
  drwx-wx-wx   - hive     supergroup          0 2018-04-13 19:02 /tmp/hive
  drwxrwxrwt   - mapred   hadoop              0 2018-04-12 17:05 /tmp/logs
  drwxr-xr-x   - cloudera supergroup          0 2018-04-26 16:04 /tmp/test

