This page summarizes the default ports used by Hadoop services. It is useful when configuring firewall rules and network access for a cluster.

Hadoop 3.1.0

HDFS

| Service | Servers | Default Ports Used | Protocol | Configuration Parameter | Comments |
|---------|---------|--------------------|----------|-------------------------|----------|
| WebUI for NameNode | Master (incl. back-up NameNodes) | 9870/9871 | http/https | dfs.namenode.http-address / dfs.namenode.https-address | The address and the base port where the dfs namenode web ui will listen on; the namenode secure http server address and port. |
| Metadata service (NameNode) | Master (incl. back-up NameNodes) | | IPC | fs.defaultFS | The name of the default file system, for example hdfs://hdp-master:19000. |
| Data Node | All slave nodes | 9864/9865 | http/https | dfs.datanode.http.address / dfs.datanode.https.address | The datanode http/https server address and port. |
| Data Node | All slave nodes | 9866 | | dfs.datanode.address | The datanode server address and port for data transfer. |
| Data Node | All slave nodes | 9867 | IPC | dfs.datanode.ipc.address | The datanode ipc server address and port (for metadata operations). |
| Secondary NameNode | Secondary NameNode and any backup NameNodes | 9868/9869 | http/https | dfs.namenode.secondary.http-address / dfs.namenode.secondary.https-address | The secondary namenode http/https server address and port. |
| JournalNode | JournalNode hosts | 8485 | IPC | dfs.journalnode.rpc-address | The JournalNode RPC server address and port. |
| JournalNode | JournalNode hosts | 8480/8481 | http/https | dfs.journalnode.http-address / dfs.journalnode.https-address | The address and port the JournalNode http/https server listens on. |
| Aliasmap Server | NameNode | 50200 | IPC | dfs.provided.aliasmap.inmemory.dnrpc-address | The address where the aliasmap server will be running. |
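Any of these defaults can be overridden in the corresponding configuration file. Below is a minimal sketch, assuming a NameNode host named hdp-master (the same placeholder used in the fs.defaultFS example above; 19000 is that example's port, not the stock default):

```xml
<!-- core-site.xml: point clients and daemons at the NameNode RPC endpoint.
     hdp-master:19000 is the example value from the table above. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://hdp-master:19000</value>
  </property>
</configuration>
```

```xml
<!-- hdfs-site.xml: bind the NameNode web UI explicitly (9870 is the default port). -->
<configuration>
  <property>
    <name>dfs.namenode.http-address</name>
    <value>hdp-master:9870</value>
  </property>
</configuration>
```

The stock default for dfs.namenode.http-address is 0.0.0.0:9870, i.e. the web UI listens on all interfaces; binding to a specific host name as above restricts it to that interface.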

MapReduce

| Service | Servers | Default Ports Used | Protocol | Configuration Parameter | Comments |
|---------|---------|--------------------|----------|-------------------------|----------|
| MapReduce Job History | JobHistory Server | 10020 | IPC | mapreduce.jobhistory.address | MapReduce JobHistory Server IPC host:port. |
| MapReduce Job History UI | JobHistory Server | 19888/19890 | http/https | mapreduce.jobhistory.webapp.address / mapreduce.jobhistory.webapp.https.address | MapReduce JobHistory Server Web UI URL (http/https). |
| History server admin | JobHistory Server | 10033 | http | mapreduce.jobhistory.admin.address | The address of the History server admin interface. |
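The JobHistory Server endpoints are configured in mapred-site.xml. A minimal sketch, assuming a placeholder host name history-host (the ports shown are the defaults from the table above):

```xml
<!-- mapred-site.xml: JobHistory Server IPC and web UI endpoints.
     history-host is a placeholder; use the node that runs the
     JobHistory Server in your cluster. -->
<configuration>
  <property>
    <name>mapreduce.jobhistory.address</name>
    <value>history-host:10020</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.webapp.address</name>
    <value>history-host:19888</value>
  </property>
</configuration>
```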


YARN

| Service | Servers | Default Ports Used | Protocol | Configuration Parameter | Comments |
|---------|---------|--------------------|----------|-------------------------|----------|
| Resource Manager | Master | 8032 | IPC | yarn.resourcemanager.address | The address of the applications manager interface in the RM. |
| Resource Manager | Master | 8030 | IPC | yarn.resourcemanager.scheduler.address | The address of the scheduler interface. |
| Resource Manager WebUI | Master | 8088/8090 | http/https | yarn.resourcemanager.webapp.address / yarn.resourcemanager.webapp.https.address | The http/https address of the RM web application. |
| Resource Manager | Master | 8031 | IPC | yarn.resourcemanager.resource-tracker.address | The address of the resource tracker interface (NodeManagers register and send heartbeats here). |
| Resource Manager | Master | 8033 | IPC | yarn.resourcemanager.admin.address | The address of the RM admin interface. |
| Node Manager | All slave nodes | 0 | IPC | yarn.nodemanager.address | The address of the container manager in the NM; port 0 means an ephemeral port is picked at startup. |
| Node Manager | All slave nodes | 8040 | IPC | yarn.nodemanager.localizer.address | Address where the localizer IPC is. |
| Node Manager | All slave nodes | 8048 | IPC | yarn.nodemanager.collector-service.address | Address where the collector service IPC is. |
| Node Manager WebUI | All slave nodes | 8042/8044 | http/https | yarn.nodemanager.webapp.address / yarn.nodemanager.webapp.https.address | The http/https address of the NM web application. |
| Timeline Server | | 10200 | IPC | yarn.timeline-service.address | This is the default address for the timeline server to start the RPC server. |
| Timeline Server WebUI | | 8188/8190 | http/https | yarn.timeline-service.webapp.address / yarn.timeline-service.webapp.https.address | The http/https address of the timeline service web application. |
| Shared Cache Manager | | 8047 | IPC | yarn.sharedcache.admin.address | The address of the admin interface in the SCM (shared cache manager). |
| Shared Cache Manager WebUI | | 8788 | http | yarn.sharedcache.webapp.address | The address of the web application in the SCM (shared cache manager). |
| Shared Cache Manager | | 8046 | IPC | yarn.sharedcache.uploader.server.address | The address of the node manager interface in the SCM (shared cache manager). |
| Shared Cache Manager | | 8045 | IPC | yarn.sharedcache.client-server.address | The address of the client interface in the SCM (shared cache manager). |
| Node Manager | All slave nodes | 8049 | IPC | yarn.nodemanager.amrmproxy.address | The address of the AMRMProxyService listener. |
| Router WebUI | | 8089/8091 | http/https | yarn.router.webapp.address / yarn.router.webapp.https.address | The http/https address of the Router web application. |
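Most of the RM endpoints above do not need to be set one by one: in yarn-default.xml each one derives from yarn.resourcemanager.hostname (for example, yarn.resourcemanager.address defaults to ${yarn.resourcemanager.hostname}:8032). A minimal sketch, assuming a placeholder host name rm-host:

```xml
<!-- yarn-site.xml: setting the RM host name once lets every RM RPC and
     web endpoint fall back to its default port on that host. -->
<configuration>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>rm-host</value> <!-- placeholder; use your RM host -->
  </property>
  <!-- Override an individual endpoint only when it must differ: -->
  <property>
    <name>yarn.resourcemanager.webapp.address</name>
    <value>rm-host:8088</value>
  </property>
</configuration>
```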






References

The following links provide information about all the default configurations for Hadoop v3.1.0:

- core-default.xml: https://hadoop.apache.org/docs/r3.1.0/hadoop-project-dist/hadoop-common/core-default.xml
- hdfs-default.xml: https://hadoop.apache.org/docs/r3.1.0/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml
- mapred-default.xml: https://hadoop.apache.org/docs/r3.1.0/hadoop-mapreduce-client/hadoop-mapreduce-client-core/mapred-default.xml
- yarn-default.xml: https://hadoop.apache.org/docs/r3.1.0/hadoop-yarn/hadoop-yarn-common/yarn-default.xml