
Copy file from HDFS to local

Use the following command:

hadoop fs [-copyToLocal [-f] [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]

For example, the following command copies /hdfs-file.txt from HDFS to /tmp/hdfs-file.txt on the local file system:

hadoop fs -copyToLocal /hdfs-file.txt /tmp/hdfs-file.txt
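If the destination file already exists, the command fails unless you pass -f to overwrite it. A small sketch (the paths are illustrative; adjust them to files that actually exist in your cluster):

```shell
# Overwrite the local copy if it already exists (-f), then verify it locally.
hadoop fs -copyToLocal -f /hdfs-file.txt /tmp/hdfs-file.txt
ls -l /tmp/hdfs-file.txt
```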


If you forget the syntax of any HDFS command, you can list all of them with the following command:

hadoop fs

The output lists all of the commands (not all of them are implemented yet):

hadoop fs
Usage: hadoop fs [generic options]
         [-appendToFile <localsrc> ... <dst>]
         [-cat [-ignoreCrc] <src> ...]
         [-checksum <src> ...]
         [-chgrp [-R] GROUP PATH...]
         [-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
         [-chown [-R] [OWNER][:[GROUP]] PATH...]
         [-copyFromLocal [-f] [-p] [-l] [-d] [-t <thread count>] <localsrc> ... <dst>]
         [-copyToLocal [-f] [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
         [-count [-q] [-h] [-v] [-t [<storage type>]] [-u] [-x] [-e] <path> ...]
         [-cp [-f] [-p | -p[topax]] [-d] <src> ... <dst>]
         [-createSnapshot <snapshotDir> [<snapshotName>]]
         [-deleteSnapshot <snapshotDir> <snapshotName>]
         [-df [-h] [<path> ...]]
         [-du [-s] [-h] [-v] [-x] <path> ...]
         [-find <path> ... <expression> ...]
         [-get [-f] [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
         [-getfacl [-R] <path>]
         [-getfattr [-R] {-n name | -d} [-e en] <path>]
         [-getmerge [-nl] [-skip-empty-file] <src> <localdst>]
         [-help [cmd ...]]
         [-ls [-C] [-d] [-h] [-q] [-R] [-t] [-S] [-r] [-u] [-e] [<path> ...]]
         [-mkdir [-p] <path> ...]
         [-moveFromLocal <localsrc> ... <dst>]
         [-moveToLocal <src> <localdst>]
         [-mv <src> ... <dst>]
         [-put [-f] [-p] [-l] [-d] <localsrc> ... <dst>]
         [-renameSnapshot <snapshotDir> <oldName> <newName>]
         [-rm [-f] [-r|-R] [-skipTrash] [-safely] <src> ...]
         [-rmdir [--ignore-fail-on-non-empty] <dir> ...]
         [-setfacl [-R] [{-b|-k} {-m|-x <acl_spec>} <path>]|[--set <acl_spec> <path>]]
         [-setfattr {-n name [-v value] | -x name} <path>]
         [-setrep [-R] [-w] <rep> <path> ...]
         [-stat [format] <path> ...]
         [-tail [-f] <file>]
         [-test -[defsz] <path>]
         [-text [-ignoreCrc] <src> ...]
         [-touchz <path> ...]
         [-truncate [-w] <length> <path> ...]
         [-usage [cmd ...]]
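To illustrate a few of the everyday commands from the list above (the paths and file names are illustrative, not from the original article):

```shell
# Create a directory in HDFS, creating parent directories as needed (-p)
hadoop fs -mkdir -p /user/demo

# Upload a local file into that directory
hadoop fs -put ./local-file.txt /user/demo/

# List the directory with human-readable file sizes (-h)
hadoop fs -ls -h /user/demo

# Delete the file, bypassing the trash (-skipTrash)
hadoop fs -rm -skipTrash /user/demo/local-file.txt
```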

Generic options supported are:
-conf <configuration file>        specify an application configuration file
-D <property=value>               define a value for a given property
-fs <file:///|hdfs://namenode:port> specify default filesystem URL to use, overrides 'fs.defaultFS' property from configurations.
-jt <local|resourcemanager:port>  specify a ResourceManager
-files <file1,...>                specify a comma-separated list of files to be copied to the map reduce cluster
-libjars <jar1,...>               specify a comma-separated list of jar files to be included in the classpath
-archives <archive1,...>          specify a comma-separated list of archives to be unarchived on the compute machines

The general command line syntax is:
command [genericOptions] [commandOptions]
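As the syntax shows, generic options come before the command options. For example, the -fs and -D options can point the client at a specific filesystem for a single invocation (the NameNode hostname and port below are illustrative):

```shell
# Use a specific NameNode as the default filesystem for this one command
hadoop fs -fs hdfs://namenode-host:8020 -ls /

# Equivalent, overriding the fs.defaultFS property directly with -D
hadoop fs -D fs.defaultFS=hdfs://namenode-host:8020 -ls /
```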

If you want to view the detailed syntax for any command, you can try the following command:

hadoop fs -help [command]

For example, running ‘hadoop fs -help copyToLocal’ generates the following output:

hadoop fs -help copyToLocal
-copyToLocal [-f] [-p] [-ignoreCrc] [-crc] <src> ... <localdst> :
   Identical to the -get command.
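Because -copyToLocal is identical to -get, the earlier example can equivalently be written with -get (paths are illustrative):

```shell
# Same effect as: hadoop fs -copyToLocal /hdfs-file.txt /tmp/hdfs-file.txt
hadoop fs -get /hdfs-file.txt /tmp/hdfs-file.txt
```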

Last modified by Raymond, 2 years ago.
