
Hadoop Daemon Log Files Location


About Hadoop daemon services

To start a Hadoop service, we use the scripts under the sbin folder of the Hadoop home directory.

For example, the following two commands start the Hadoop HDFS services (NameNode and DataNode) on Windows and UNIX-like systems respectively.

%HADOOP_HOME%\sbin\start-dfs.cmd
$HADOOP_HOME/sbin/start-dfs.sh

Sometimes these daemon services fail to start. To investigate the root cause, we need to look into the log folder.
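When a daemon dies during startup, its log file usually ends with the fatal exception. The steps above can be sketched as a quick check on a UNIX-like system; the fallback path and the hadoop-&lt;user&gt;-namenode-&lt;host&gt;.log naming pattern are assumptions based on the standard Hadoop layout, so adjust them to your installation:

```shell
# Sketch: scan the NameNode log for startup errors on a UNIX-like system.
# /opt/hadoop is only an illustrative fallback for HADOOP_HOME.
HADOOP_HOME="${HADOOP_HOME:-/opt/hadoop}"
LOG_DIR="${HADOOP_LOG_DIR:-$HADOOP_HOME/logs}"

# Each daemon writes a file named hadoop-<user>-<daemon>-<host>.log.
for f in "$LOG_DIR"/hadoop-*-namenode-*.log; do
  [ -f "$f" ] || continue          # skip if no log exists yet
  tail -n 50 "$f"                  # last lines often show the failure
  grep -E 'ERROR|FATAL' "$f" | tail -n 20
done
```

The same idea applies to the DataNode: substitute `datanode` for `namenode` in the glob pattern.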

HADOOP_LOG_DIR

This environment variable specifies the directory where Hadoop daemon log files are written.

The default values are:

  • Windows:
    @rem Where log files are stored.  %HADOOP_HOME%/logs by default.
    @rem set HADOOP_LOG_DIR=%HADOOP_LOG_DIR%\%USERNAME%

The logs are located in folder %HADOOP_HOME%\logs.

  • Linux:
    # Where (primarily) daemon log files are stored.
    # ${HADOOP_HOME}/logs by default.
    # Java property: hadoop.log.dir
    # export HADOOP_LOG_DIR=${HADOOP_HOME}/logs

    The logs are located in folder ${HADOOP_HOME}/logs.
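To keep daemon logs outside the installation folder (for example, so they survive an upgrade), the variable can be overridden in hadoop-env.sh before the daemons start. A minimal sketch; the /var/log/hadoop path is an illustrative assumption, not a Hadoop default, and must be writable by the user running the daemons:

```shell
# In ${HADOOP_HOME}/etc/hadoop/hadoop-env.sh
# /var/log/hadoop is an example location; any directory writable by
# the daemon user works.
export HADOOP_LOG_DIR=/var/log/hadoop
```

After editing the file, restart the daemons (for example with stop-dfs.sh and start-dfs.sh) so the new location takes effect.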
