Hadoop Daemon Log Files Location


About Hadoop daemon services

To start a Hadoop service, we use the scripts under the sbin folder of the Hadoop home folder.

For example, the following two commands start the Hadoop HDFS services (namenode and datanode) on Windows and UNIX-like systems respectively.

%HADOOP_HOME%\sbin\start-dfs.cmd
$HADOOP_HOME/sbin/start-dfs.sh
Sometimes these daemon services fail to start. To investigate the root cause, we need to look into the log folder.
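On UNIX-like systems, a quick way to start that investigation is to find and tail the newest namenode log file. The sketch below assumes the default log location under $HADOOP_HOME/logs and the standard daemon log naming pattern hadoop-&lt;user&gt;-&lt;daemon&gt;-&lt;hostname&gt;.log; the /opt/hadoop fallback path is only a placeholder, so adjust it to your installation.

```shell
# Placeholder fallback path -- point HADOOP_HOME at your real installation.
HADOOP_HOME="${HADOOP_HOME:-/opt/hadoop}"
# HADOOP_LOG_DIR overrides the default log location (see next section).
LOG_DIR="${HADOOP_LOG_DIR:-$HADOOP_HOME/logs}"

# Pick the most recently modified namenode log, if one exists.
newest=$(ls -t "$LOG_DIR"/hadoop-*-namenode-*.log 2>/dev/null | head -n 1)
if [ -n "$newest" ]; then
  # Show the last 50 lines, which usually contain the startup failure.
  tail -n 50 "$newest"
else
  echo "No namenode log found in $LOG_DIR"
fi
```

The same pattern works for the datanode: replace `namenode` in the glob with `datanode`.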

HADOOP_LOG_DIR

This environment variable specifies the directory where Hadoop daemon log files are stored.

By default, the values are:

  • Windows:
    @rem Where log files are stored.  %HADOOP_HOME%/logs by default.
    @rem set HADOOP_LOG_DIR=%HADOOP_LOG_DIR%\%USERNAME%

    The logs are located in %HADOOP_HOME%\logs.

  • Linux:
    # Where (primarily) daemon log files are stored.
    # ${HADOOP_HOME}/logs by default.
    # Java property: hadoop.log.dir
    # export HADOOP_LOG_DIR=${HADOOP_HOME}/logs

    The logs are located in ${HADOOP_HOME}/logs.
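To store the logs somewhere else, you can set HADOOP_LOG_DIR before starting the daemons, either in your shell or by uncommenting the export line in hadoop-env.sh. A minimal sketch, assuming a custom log path of /var/log/hadoop (adjust to your environment):

```shell
# In $HADOOP_HOME/etc/hadoop/hadoop-env.sh (or exported in the shell
# before running start-dfs.sh), redirect daemon logs to a custom folder.
# /var/log/hadoop is an example path, not a Hadoop default.
export HADOOP_LOG_DIR=/var/log/hadoop
```

On Windows, the equivalent is `set HADOOP_LOG_DIR=...` in hadoop-env.cmd. Make sure the user running the daemons has write permission on the chosen directory.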
