Hadoop Series (1): Basic Operations on the HDFS File System

[Date: 2018-04-27] Source: 博客园精华区

HDFS supports all of the common Linux-style file operations (reading files, creating files, moving files, deleting files, listing files, and so on) through the hadoop fs command.

1. help: get usage help for every command

[cloudera@quickstart ~]$ hadoop fs -help
Usage: hadoop fs [generic options]
        [-appendToFile <localsrc> ... <dst>]
        [-cat [-ignoreCrc] <src> ...]
        [-checksum <src> ...]
        [-chgrp [-R] GROUP PATH...]
        [-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
        [-chown [-R] [OWNER][:[GROUP]] PATH...]
        [-copyFromLocal [-f] [-p] [-l] <localsrc> ... <dst>]
        [-copyToLocal [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
        [-count [-q] [-h] [-v] [-x] <path> ...]
        [-cp [-f] [-p | -p[topax]] <src> ... <dst>]
        [-createSnapshot <snapshotDir> [<snapshotName>]]
        [-deleteSnapshot <snapshotDir> <snapshotName>]
        [-df [-h] [<path> ...]]
        [-du [-s] [-h] [-x] <path> ...]
        [-expunge]
        [-find <path> ... <expression> ...]
        [-get [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
        [-getfacl [-R] <path>]
        [-getfattr [-R] {-n name | -d} [-e en] <path>]
        [-getmerge [-nl] <src> <localdst>]
        [-help [cmd ...]]
        [-ls [-C] [-d] [-h] [-q] [-R] [-t] [-S] [-r] [-u] [<path> ...]]
        [-mkdir [-p] <path> ...]
        [-moveFromLocal <localsrc> ... <dst>]
        [-moveToLocal <src> <localdst>]
        [-mv <src> ... <dst>]
        [-put [-f] [-p] [-l] <localsrc> ... <dst>]
        [-renameSnapshot <snapshotDir> <oldName> <newName>]
        [-rm [-f] [-r|-R] [-skipTrash] <src> ...]
        [-rmdir [--ignore-fail-on-non-empty] <dir> ...]
        [-setfacl [-R] [{-b|-k} {-m|-x <acl_spec>} <path>]|[--set <acl_spec> <path>]]
        [-setfattr {-n name [-v value] | -x name} <path>]
        [-setrep [-R] [-w] <rep> <path> ...]
        [-stat [format] <path> ...]
        [-tail [-f] <file>]
        [-test -[defsz] <path>]
        [-text [-ignoreCrc] <src> ...]
        [-touchz <path> ...]
        [-usage [cmd ...]]

2. copyFromLocal: copy a local file into HDFS; the hdfs://quickstart.cloudera:8020 prefix can be omitted

[cloudera@quickstart Downloads]$ hadoop fs -copyFromLocal file1.txt hdfs://quickstart.cloudera:8020/tmp
[cloudera@quickstart Downloads]$ hadoop fs -copyFromLocal file2.txt /tmp
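When driving these commands from a script, it is safer to pass the arguments as a list than to build a shell string. A minimal Python sketch (a hypothetical helper; it only assembles the argument list for something like subprocess.run, it does not execute anything here):

```python
# Assemble a `hadoop fs -copyFromLocal` invocation as an argument list.
# The hdfs:// URI prefix is optional when the destination is on the
# cluster's default file system, so a plain path like /tmp works too.
def copy_from_local_cmd(local_src, hdfs_dst):
    return ["hadoop", "fs", "-copyFromLocal", local_src, hdfs_dst]

cmd = copy_from_local_cmd("file1.txt", "/tmp")
print(" ".join(cmd))
# hadoop fs -copyFromLocal file1.txt /tmp
```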

3. copyToLocal: copy a file from HDFS to the local file system

[cloudera@quickstart Downloads]$ hadoop fs -copyToLocal hdfs://quickstart.cloudera:8020/tmp/file1.txt file1.txt.copy

4. ls: list the files in a directory (column 1: permissions; column 2: replication factor; column 3: owner; column 4: group; column 5: size in bytes; column 6: last modification date and time; column 7: file or directory path)

[cloudera@quickstart Downloads]$ hadoop fs -ls /tmp
Found 7 items
drwxrwxrwx   - hdfs     supergroup          0 2018-04-26 15:40 /tmp/.cloudera_health_monitoring_canary_files
-rw-r--r--   1 cloudera supergroup         12 2018-04-26 15:35 /tmp/file1.txt
-rw-r--r--   1 cloudera supergroup         29 2018-04-26 15:36 /tmp/file2.txt
drwxrwxrwt   - mapred   mapred              0 2018-04-13 21:30 /tmp/hadoop-yarn
drwx--x--x   - hbase    supergroup          0 2018-04-12 15:36 /tmp/hbase-staging
drwx-wx-wx   - hive     supergroup          0 2018-04-13 19:02 /tmp/hive
drwxrwxrwt   - mapred   hadoop              0 2018-04-12 17:05 /tmp/logs
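To make the column layout described in step 4 concrete, here is a small Python sketch (a hypothetical helper, not part of Hadoop) that splits one line of hadoop fs -ls output into those fields:

```python
# Parse one line of `hadoop fs -ls` output into its columns:
# permissions, replication, owner, group, size, modification date/time, path.
def parse_ls_line(line):
    # The path is the 8th whitespace-separated field; split at most 7 times
    # so a path containing spaces would stay intact.
    perms, repl, owner, group, size, date, time, path = line.split(None, 7)
    return {
        "permissions": perms,
        "replication": None if repl == "-" else int(repl),  # "-" for directories
        "owner": owner,
        "group": group,
        "size": int(size),
        "modified": f"{date} {time}",
        "path": path,
    }

entry = parse_ls_line(
    "-rw-r--r--   1 cloudera supergroup         12 2018-04-26 15:35 /tmp/file1.txt"
)
print(entry["owner"], entry["size"], entry["path"])
# cloudera 12 /tmp/file1.txt
```

Note that directories show "-" in the replication column, since replication only applies to file blocks.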

5. cat: print the contents of a file

[cloudera@quickstart Downloads]$ hadoop fs -cat /tmp/file1.txt
hello world

6. mkdir: create a directory

[cloudera@quickstart Downloads]$ hadoop fs -mkdir /tmp/test
[cloudera@quickstart Downloads]$ hadoop fs -ls /tmp
Found 8 items
drwxrwxrwx   - hdfs     supergroup          0 2018-04-26 15:54 /tmp/.cloudera_health_monitoring_canary_files
-rw-r--r--   1 cloudera supergroup         12 2018-04-26 15:35 /tmp/file1.txt
-rw-r--r--   1 cloudera supergroup         29 2018-04-26 15:36 /tmp/file2.txt
drwxrwxrwt   - mapred   mapred              0 2018-04-13 21:30 /tmp/hadoop-yarn
drwx--x--x   - hbase    supergroup          0 2018-04-12 15:36 /tmp/hbase-staging
drwx-wx-wx   - hive     supergroup          0 2018-04-13 19:02 /tmp/hive
drwxrwxrwt   - mapred   hadoop              0 2018-04-12 17:05 /tmp/logs
drwxr-xr-x   - cloudera supergroup          0 2018-04-26 15:54 /tmp/test

7. rm: delete a file or directory

[cloudera@quickstart Downloads]$ hadoop fs -rm /tmp/file1.txt      (delete a file)
18/04/26 15:55:44 INFO fs.TrashPolicyDefault: Moved: 'hdfs://quickstart.cloudera:8020/tmp/file1.txt' to trash at: hdfs://quickstart.cloudera:8020/user/cloudera/.Trash/Current/tmp/file1.txt
[cloudera@quickstart Downloads]$ hadoop fs -rm -r /tmp/test        (delete a directory)
18/04/26 15:56:01 INFO fs.TrashPolicyDefault: Moved: 'hdfs://quickstart.cloudera:8020/tmp/test' to trash at: hdfs://quickstart.cloudera:8020/user/cloudera/.Trash/Current/tmp/test
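As the log lines show, -rm does not delete immediately: the file is moved into the invoking user's trash directory (pass -skipTrash to bypass this, as the help output in step 1 shows). A hypothetical Python helper reproducing the mapping visible in the TrashPolicyDefault output:

```python
# Map an HDFS path to where `hadoop fs -rm` moves it, following the
# pattern in the log lines above:
# /user/<user>/.Trash/Current + the original absolute path.
def trash_location(path, user):
    return f"/user/{user}/.Trash/Current{path}"

print(trash_location("/tmp/file1.txt", "cloudera"))
# /user/cloudera/.Trash/Current/tmp/file1.txt
```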

8. put: same as copyFromLocal

[cloudera@quickstart Downloads]$ hadoop fs -put file1.txt /tmp

9. get: same as copyToLocal

[cloudera@quickstart Downloads]$ hadoop fs -get hdfs://quickstart.cloudera:8020/tmp/file1.txt get1.txt
[cloudera@quickstart Downloads]$ ls
1901.gz  1902.gz  all  compute_max_degree.sh  file1.txt  file1.txt.copy  file2.txt  get1.txt

10. mv: move or rename a file

[cloudera@quickstart Downloads]$ hadoop fs -mv /tmp/file1.txt /tmp/file1_new.txt
[cloudera@quickstart Downloads]$ hadoop fs -ls /tmp
Found 8 items
drwxrwxrwx   - hdfs     supergroup          0 2018-04-26 16:08 /tmp/.cloudera_health_monitoring_canary_files
-rw-r--r--   1 cloudera supergroup         12 2018-04-26 16:05 /tmp/file1_new.txt
-rw-r--r--   1 cloudera supergroup         29 2018-04-26 15:36 /tmp/file2.txt
drwxrwxrwt   - mapred   mapred              0 2018-04-13 21:30 /tmp/hadoop-yarn
drwx--x--x   - hbase    supergroup          0 2018-04-12 15:36 /tmp/hbase-staging
drwx-wx-wx   - hive     supergroup          0 2018-04-13 19:02 /tmp/hive
drwxrwxrwt   - mapred   hadoop              0 2018-04-12 17:05 /tmp/logs
drwxr-xr-x   - cloudera supergroup          0 2018-04-26 16:04 /tmp/test

11. du: show file sizes (the first column is the file length, the second the space consumed by all replicas; they are equal here because the replication factor is 1)

[cloudera@quickstart Downloads]$ hadoop fs -du /tmp/file2.txt
29  29  /tmp/file2.txt
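A hypothetical parser for this two-column output, in the same spirit as the ls parser above:

```python
# Parse a line of `hadoop fs -du` output into the file length,
# the space consumed across all replicas, and the path.
def parse_du_line(line):
    length, consumed, path = line.split(None, 2)
    return int(length), int(consumed), path

print(parse_du_line("29  29  /tmp/file2.txt"))
# (29, 29, '/tmp/file2.txt')
```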

12. touchz: create an empty (zero-byte) file

[cloudera@quickstart Downloads]$ hadoop fs -touchz /tmp/file3.txt
[cloudera@quickstart Downloads]$ hadoop fs -du /tmp/file3.txt
0  0  /tmp/file3.txt

13. chmod: change file permissions

[cloudera@quickstart Downloads]$ hadoop fs -ls /tmp
Found 1 items
-rw-r--r--   1 cloudera supergroup          0 2018-04-26 16:12 /tmp/file3.txt
[cloudera@quickstart Downloads]$ hadoop fs -chmod +x /tmp/file3.txt
[cloudera@quickstart Downloads]$ hadoop fs -ls /tmp
Found 1 items
-rwxr-xr-x   1 cloudera supergroup          0 2018-04-26 16:12 /tmp/file3.txt
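chmod accepts either an octal mode (e.g. 755) or a symbolic mode like +x. To see why +x turned rw-r--r-- into rwxr-xr-x, here is a hypothetical sketch that applies a bare +x (execute for user, group, and other) to a 9-character permission string:

```python
# Apply `+x` to an rwx-style permission string: set the execute bit
# for user, group, and other, as `hadoop fs -chmod +x` did above.
def add_execute(perms):
    chars = list(perms)
    for i in (2, 5, 8):  # execute-bit positions for user, group, other
        if chars[i] == "-":
            chars[i] = "x"
    return "".join(chars)

print(add_execute("rw-r--r--"))
# rwxr-xr-x
```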

14. chown: change the owner of a file (only the HDFS superuser may do this, hence the sudo -u hdfs below)

[cloudera@quickstart Downloads]$ hadoop fs -chown -R hbase:supergroup /tmp/file3.txt
chown: changing ownership of '/tmp/file3.txt': Non-super user cannot change owner
[cloudera@quickstart Downloads]$ sudo -u hdfs hadoop fs -chown -R hbase:supergroup /tmp/file3.txt
[cloudera@quickstart Downloads]$ hadoop fs -ls /tmp
Found 9 items
drwxrwxrwx   - hdfs     supergroup          0 2018-04-26 16:19 /tmp/.cloudera_health_monitoring_canary_files
-rw-r--r--   1 cloudera supergroup         12 2018-04-26 16:05 /tmp/file1_new.txt
-rw-r--r--   1 cloudera supergroup         29 2018-04-26 15:36 /tmp/file2.txt
-rwxr-xr-x   1 hbase    supergroup          0 2018-04-26 16:12 /tmp/file3.txt
drwxrwxrwt   - mapred   mapred              0 2018-04-13 21:30 /tmp/hadoop-yarn
drwx--x--x   - hbase    supergroup          0 2018-04-12 15:36 /tmp/hbase-staging
drwx-wx-wx   - hive     supergroup          0 2018-04-13 19:02 /tmp/hive
drwxrwxrwt   - mapred   hadoop              0 2018-04-12 17:05 /tmp/logs
drwxr-xr-x   - cloudera supergroup          0 2018-04-26 16:04 /tmp/test



