Resolve Hadoop RemoteException - Name node is in safe mode


While the NameNode is in safe mode, the HDFS cluster is read-only: files cannot be created, modified, or deleted. Once block replication maintenance completes, the NameNode leaves safe mode automatically.
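Before running write operations, you can check whether the NameNode is still in safe mode (a minimal sketch using the standard `dfsadmin` subcommands):

```shell
# Query the current safe mode status of the NameNode.
# Prints "Safe mode is ON" or "Safe mode is OFF".
hdfs dfsadmin -safemode get
```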

If you try to delete files while the NameNode is in safe mode, the following exception may be raised:

org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot delete /user/hadoop/sqoop_test/blogs. Name node is in safe mode.

The above exception occurred because I was using Sqoop to load files into HDFS, and it attempted to delete the existing target files while the NameNode was still in safe mode.
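For reference, a Sqoop import along the following lines can trigger this error: the connection string, credentials, and table name here are hypothetical, but `--delete-target-dir` is the standard option that removes the existing HDFS directory before loading, which is a write operation and therefore fails while the NameNode is in safe mode.

```shell
# Hypothetical Sqoop import (JDBC URL, user, and table are placeholders).
# --delete-target-dir deletes the existing HDFS target directory first,
# which raises SafeModeException if the NameNode is in safe mode.
sqoop import \
  --connect jdbc:mysql://localhost:3306/testdb \
  --username hadoop -P \
  --table blogs \
  --target-dir /user/hadoop/sqoop_test/blogs \
  --delete-target-dir
```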

We can also leave safe mode manually using the following command:

hadoop@hdp-master:/hadoop> hdfs dfsadmin -safemode leave
Safe mode is OFF
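Forcing the NameNode out of safe mode before replication maintenance finishes can leave blocks under-replicated, so in scripts it is often safer to wait for the NameNode to leave safe mode on its own:

```shell
# Block until the NameNode leaves safe mode by itself, then proceed.
hdfs dfsadmin -safemode wait

# Safe mode can also be entered manually, e.g. before maintenance work:
hdfs dfsadmin -safemode enter
```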

Last modified by Raymond 5 years ago.
