Install Hadoop 3.2.0 on Windows 10 using Windows Subsystem for Linux (WSL)

Comments

Raymond #238 (5 years ago)

Have you tried the solution I mentioned in the post? I got the same issue when it was first installed, but after running the following commands it worked. Also make sure you stop and restart the Hadoop daemons.

sudo apt-get install ssh

sudo service ssh restart
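To stop and restart the Hadoop daemons, something like this should work from the Hadoop installation directory (adjust the path if yours differs):

# stop the HDFS daemons, then start them again
sbin/stop-dfs.sh
sbin/start-dfs.sh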

I'm not an expert in networking, and I don't think the firewall rule below will definitely help, since this is all local traffic. There must be some other reason you cannot ssh to localhost. For example, is port 22 used by another program? Can you also try the IPv4 address for localhost instead of the IPv6 one?
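For example, something like this (run inside WSL) would show what is listening on port 22 and try an IPv4 connection explicitly:

# check which process, if any, is listening on port 22
sudo ss -tlnp | grep ':22'
# force ssh to use IPv4 instead of the ::1 IPv6 address
ssh -4 localhost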

Can you try adding a firewall rule to allow TCP traffic to ssh port 22 (a command-line equivalent is sketched after the list)?

  • Protocol type: TCP
  • Local port: 22
  • Remote port: All Ports
  • Scope: make sure all your local IP addresses are added.
  • Profiles: Private. I'm choosing this one because I will only connect to my WSL instance when on a private network.
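If it is easier, roughly the same rule can be created from an elevated Windows command prompt; the rule name here is just an example:

netsh advfirewall firewall add rule name="Allow SSH to WSL" dir=in action=allow protocol=TCP localport=22 profile=private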
Quoting joe (5 years ago):

I installed ssh and restarted it. Now 'ssh localhost' just says 'Connection closed by ::1 port 22.'

Anonymous #85 (5 years ago)

ssh localhost

Connection closed by 127.0.0.1 port 22


http://localhost:9870/dfshealth.html#tab-overview not working


Raymond #237 (5 years ago)
You are welcome! I’m glad it helped.
Quoting Rudy Layedra (5 years ago):

This was a great and easy to follow post. THANK YOU!

Raymond #236 (5 years ago)
Hmm, sudo is actually not required if you install everything in your own folder, though the better approach is to install into other folders as the root user (then sudo is required).
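For example, downloading and unpacking the Hadoop binary into your home directory should not need sudo at all; the URL below points at the Apache archive just as an illustration:

cd ~
# download and extract Hadoop 3.2.0 under your own home folder (no sudo needed)
wget https://archive.apache.org/dist/hadoop/common/hadoop-3.2.0/hadoop-3.2.0.tar.gz
tar -xzf hadoop-3.2.0.tar.gz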
Quoting Mohammad (5 years ago):

I get Permission Denied when trying to get the Hadoop binary. After researching, I found that I need to use sudo in front of the command, so I need to use

sudo wget http://mirrors.....


Thanks for the great article!


Anonymous #82 (5 years ago)

In my case, the command: 

sbin/start-dfs.sh 

is executed without errors, but the NameNode does not start and therefore is not responding on http://localhost:9870.

Executing the jps command, I can see that the running processes are:

1) SecondaryNameNode

2) DataNode

3) Jps

The NameNode process is missing from the returned list.
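In case it helps, this is roughly how I'm checking the NameNode log for errors (assuming the default logs directory under the Hadoop installation):

# show the end of the NameNode log, where the startup failure is usually reported
tail -n 100 logs/hadoop-*-namenode-*.log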


Any idea what could be going wrong?

I followed all the instructions in this guide to configure my WSL environment.


Thanks 
