Check HDFS folder size in Shell / Hadoop


Hadoop provides a number of command-line utilities for common administrative tasks. This code snippet shows how to check the size of a file or folder in HDFS.

The hdfs dfs -du command reports disk usage in HDFS; the -h option prints sizes in human-readable units (e.g. M or G) instead of raw bytes.


hdfs dfs -du -h ${hdfs_path}

Code snippet

hdfs dfs -du -h /path/to/your/folder
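A couple of common variations on the same command can also be useful. The sketch below uses the same placeholder path as above; the second pipeline assumes the raw byte counts that -du prints when -h is omitted:

```shell
# Print one summary line for the whole folder instead of one line per child.
hdfs dfs -du -s -h /path/to/your/folder

# List the largest immediate children first: use raw byte counts (no -h)
# so a plain numeric sort works, then keep the top 5 entries.
hdfs dfs -du /path/to/your/folder | sort -nr | head -5
```

On recent Hadoop versions, -du prints two numbers per line before the path: the file size, and the total disk space consumed including replication.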
Last modified by Raymond 2 years ago.