Ingest Data into Hadoop HDFS through Jupyter Notebook

The Jupyter Notebook service can be started on most operating systems. On a system where the Hadoop client is available, you can easily ingest data into HDFS (Hadoop Distributed File System) using the HDFS CLI directly from a notebook cell.

*The Python 3 kernel is used in the following examples.
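
Before running the examples, you can verify that the Hadoop client is available on the machine hosting the notebook. In Jupyter, a leading ! runs the rest of the line as a shell command:

# Confirm the Hadoop CLI is on the PATH and print its version
!hadoop version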

List files in HDFS

The following command lists the files and directories in the HDFS root directory.

!hadoop fs -ls /

Output

The command prints the permissions, owner, size, and modification time of each entry under the HDFS root directory.

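Because the kernel is Python, you can also capture a command's output into a Python variable for further processing. This is standard IPython behavior rather than anything Hadoop-specific:

# Capture the listing as a list of strings, one item per output line
listing = !hadoop fs -ls /
for line in listing:
    print(line)
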
Ingest file into HDFS

The HDFS copyFromLocal command copies a file from the local file system into HDFS. You can also reference Python variables in these shell commands by prefixing their names with $.

# Path of the local file to ingest into the HDFS root folder
local_file_path = "/home/tangr/jupyter-notebooks/csharp-example.ipynb"
!hadoop fs -copyFromLocal $local_file_path /

Output

After the command completes, the local file named csharp-example.ipynb is available in the HDFS root folder as /csharp-example.ipynb.
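
By default, copyFromLocal fails if the destination file already exists; the -f flag overwrites it. The following minimal sketch re-runs the ingestion with -f and then lists the target path to confirm the file landed:

# Overwrite the destination if it already exists
!hadoop fs -copyFromLocal -f $local_file_path /

# Confirm the file now exists in the HDFS root folder
!hadoop fs -ls /csharp-example.ipynb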

Other HDFS commands

You can run any other HDFS command in a Jupyter notebook in the same way. For example, you can download an HDFS file into local storage and then parse or read it using native Python functions, as sketched below.
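
As a sketch of that workflow, the following cell downloads the notebook file ingested earlier using copyToLocal and parses it with Python's built-in json module (an .ipynb file is plain JSON). The /tmp destination path is only an assumption for illustration:

import json

# Download the HDFS file into local storage (assumes /tmp is writable)
!hadoop fs -copyToLocal /csharp-example.ipynb /tmp/csharp-example.ipynb

# Parse the downloaded notebook with native Python functions
with open("/tmp/csharp-example.ipynb") as f:
    notebook = json.load(f)
print(list(notebook.keys()))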
