Install Hadoop 3.3.0 on Windows 10 Step by Step Guide
- References
- Required tools
- Step 1 - Download Hadoop binary package
- Select download mirror link
- Download the package
- Step 2 - Unpack the package
- Step 3 - Install Hadoop native IO binary
- Step 4 - (Optional) Java JDK installation
- Step 5 - Configure environment variables
- Configure JAVA_HOME environment variable
- Configure HADOOP_HOME environment variable
- Configure PATH environment variable
- Step 6 - Configure Hadoop
- Configure core site
- Configure HDFS
- Configure MapReduce and YARN site
- Step 7 - Initialise HDFS & bug fix
- Step 8 - Start HDFS daemons
- Step 9 - Start YARN daemons
- Step 10 - Verify Java processes
- Step 11 - Shutdown YARN & HDFS daemons
This detailed step-by-step guide shows you how to install the latest Hadoop v3.3.0 on Windows 10. It leverages the Hadoop 3.3.0 winutils tool; WSL (Windows Subsystem for Linux) is not required. This version was released on July 14, 2020 and is the first release of the Apache Hadoop 3.3 line. There are significant changes compared with Hadoop 3.2.0, such as Java 11 runtime support, a protobuf upgrade to 3.7.1, scheduling of opportunistic containers, non-volatile SCM support in HDFS cache directives, etc.
Please follow all the instructions carefully. Once you complete the steps, you will have a shiny pseudo-distributed single node Hadoop to work with.
The yellow elephant logo is a registered trademark of Apache Hadoop; the blue window logo is a registered trademark of Microsoft.
References
Refer to the following articles if you prefer to install other versions of Hadoop, configure a multi-node cluster, or use WSL.
- Install Hadoop 3.3.0 on Windows 10 using WSL (Windows Subsystem for Linux is required)
- Install Hadoop 3.0.0 on Windows (Single Node)
- Configure Hadoop 3.1.0 in a Multi Node Cluster
- Install Hadoop 3.2.0 on Windows 10 using Windows Subsystem for Linux (WSL)
Required tools
Before you start, make sure you have the following tools enabled in Windows 10.
Tool | Comments
---- | --------
PowerShell | We will use this tool to download the package. In my system, the PowerShell version (from $PSVersionTable) is 5.1.19041.1, Desktop edition.
Git Bash or 7-Zip | We will use Git Bash or 7-Zip to unpack the Hadoop binary package. You can install either tool, or any other tool that can extract *.tar.gz files on Windows.
Command Prompt | We will use it to start Hadoop daemons and run some commands as part of the installation process.
Java JDK | A JDK is required to run Hadoop, as the framework is built using Java. In my system, the JDK version is jdk1.8.0_161. Check the supported JDK versions at https://cwiki.apache.org/confluence/display/HADOOP/Hadoop+Java+Versions. From Hadoop 3.3.0, the Java 11 runtime is supported.
Now we will start the installation process.
Step 1 - Download Hadoop binary package
Select download mirror link
Go to the download page of the official website:
Apache Download Mirrors - Hadoop 3.3.0
Then choose one of the mirror links. The page lists the mirrors closest to you based on your location. I am choosing the following mirror link:
http://apache.mirror.amaze.com.au/hadoop/common/hadoop-3.3.0/hadoop-3.3.0.tar.gz
Download the package
Open PowerShell and then run the following command lines one by one:
$dest_dir = "F:\big-data"
$url = "http://apache.mirror.amaze.com.au/hadoop/common/hadoop-3.3.0/hadoop-3.3.0.tar.gz"
$client = New-Object System.Net.WebClient
$client.DownloadFile($url, $dest_dir + "\hadoop-3.3.0.tar.gz")
It may take a few minutes to download.
Once the download completes, you can verify it:
PS F:\big-data> cd $dest_dir
PS F:\big-data> ls

    Directory: F:\big-data

Mode                LastWriteTime        Length Name
----                -------------        ------ ----
-a----        2/08/2020   1:55 AM     500749234 hadoop-3.3.0.tar.gz
You can also directly download the package through your web browser and save it to the destination directory.
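Optionally, verify the integrity of the downloaded file. As a minimal sketch (assuming you also download the official hadoop-3.3.0.tar.gz.sha512 checksum file from the Apache download page), you can compute the hash with the built-in certutil tool and compare it against the published value:

certutil -hashfile F:\big-data\hadoop-3.3.0.tar.gz SHA512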
Step 2 - Unpack the package
Now we need to unpack the downloaded package using a GUI tool (like 7-Zip) or the command line. I will use Git Bash to unpack it.
Open Git Bash and change the directory to the destination folder:
cd $dest_dir
And then run the following command to unzip:
tar -xvzf hadoop-3.3.0.tar.gz
The command will take quite a few minutes, as there are numerous files included and the latest version introduced many new features.
After the unzip command completes, a new folder named hadoop-3.3.0 is created under the destination folder.
You may see tar warnings ending with 'tar: Exiting with failure status due to previous errors'. Please ignore them for now.
Step 3 - Install Hadoop native IO binary
Hadoop on Linux includes optional native IO support. However, native IO is mandatory on Windows; without it you will not be able to get your installation working. The Windows native IO libraries are not included as part of the Apache Hadoop release, thus we need to build and install them.
I have published pre-built native IO binaries in this repository:
https://github.com/kontext-tech/winutils
Download all the files from the following location and save them to the bin folder under your Hadoop folder. For my environment, the full path is F:\big-data\hadoop-3.3.0\bin. Remember to change it to your own path accordingly.
https://github.com/kontext-tech/winutils/tree/master/hadoop-3.3.0/bin
Alternatively, you can run the following commands in the previous PowerShell window to download:
$client.DownloadFile("https://github.com/kontext-tech/winutils/tree/master/hadoop-3.3.0/bin/hadoop.dll",$dest_dir+"\hadoop-3.3.0\bin\"+"hadoop.dll")
$client.DownloadFile("https://github.com/kontext-tech/winutils/tree/master/hadoop-3.3.0/bin/hadoop.exp",$dest_dir+"\hadoop-3.3.0\bin\"+"hadoop.exp")
$client.DownloadFile("https://github.com/kontext-tech/winutils/tree/master/hadoop-3.3.0/bin/hadoop.lib",$dest_dir+"\hadoop-3.3.0\bin\"+"hadoop.lib")
$client.DownloadFile("https://github.com/kontext-tech/winutils/tree/master/hadoop-3.3.0/bin/hadoop.pdb",$dest_dir+"\hadoop-3.3.0\bin\"+"hadoop.pdb")
$client.DownloadFile("https://github.com/kontext-tech/winutils/tree/master/hadoop-3.3.0/bin/libwinutils.lib",$dest_dir+"\hadoop-3.3.0\bin\"+"libwinutils.lib")
$client.DownloadFile("https://github.com/kontext-tech/winutils/tree/master/hadoop-3.3.0/bin/winutils.exe",$dest_dir+"\hadoop-3.3.0\bin\"+"winutils.exe")
$client.DownloadFile("https://github.com/kontext-tech/winutils/tree/master/hadoop-3.3.0/bin/winutils.pdb",$dest_dir+"\hadoop-3.3.0\bin\"+"winutils.pdb")
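Note that the /tree/ URLs above point to GitHub's HTML pages; if the downloaded files turn out to be HTML rather than binaries, you can try the raw file endpoints instead. This is a hedged sketch, assuming the standard raw.githubusercontent.com layout of that repository and reusing the $client and $dest_dir variables from Step 1:

# Assumed raw endpoints; adjust if the repository layout differs.
$base = "https://raw.githubusercontent.com/kontext-tech/winutils/master/hadoop-3.3.0/bin"
$files = "hadoop.dll","hadoop.exp","hadoop.lib","hadoop.pdb","libwinutils.lib","winutils.exe","winutils.pdb"
# Download each file into the Hadoop bin folder.
foreach ($f in $files) { $client.DownloadFile("$base/$f", "$dest_dir\hadoop-3.3.0\bin\$f") }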
After this, the bin folder looks like the following:
Step 4 - (Optional) Java JDK installation
Java JDK is required to run Hadoop. If you have not installed Java JDK, please install it.
You can install JDK 8 from the following page:
https://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html
Once you complete the installation, please run the following command in PowerShell or Git Bash to verify:
$ java -version
java version "1.8.0_161"
Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)
If you get an error like 'cannot find java command or executable', don't worry; we will resolve it in the following step.
Step 5 - Configure environment variables
Now that we've downloaded and unpacked all the artefacts, we need to configure two important environment variables.
Configure JAVA_HOME environment variable
As mentioned earlier, Hadoop requires Java, so we need to configure the JAVA_HOME environment variable (it is not strictly mandatory, but I recommend it).
First, we need to find out the location of the Java JDK. In my system, the path is D:\Java\jdk1.8.0_161.
Your location can be different depending on where you installed your JDK.
And then run the following command in the previous PowerShell window:
SETX JAVA_HOME "D:\Java\jdk1.8.0_161"
Remember to quote the path, especially if you have spaces in your JDK path.
The output looks like the following:
Configure HADOOP_HOME environment variable
Similarly, we need to create a new environment variable for HADOOP_HOME using the following command. The path should be your extracted Hadoop folder; for my environment it is F:\big-data\hadoop-3.3.0.
If you used PowerShell to download and if the window is still open, you can simply run the following command:
SETX HADOOP_HOME "$dest_dir\hadoop-3.3.0"
The output looks like the following screenshot:
Alternatively, you can specify the full path:
SETX HADOOP_HOME "F:\big-data\hadoop-3.3.0"
Now you can also verify the two environment variables in the system:
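For example (note that SETX writes the variables to the registry for future sessions and does not affect the current one, so open a new PowerShell window first):

echo $env:JAVA_HOME
echo $env:HADOOP_HOME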
Configure PATH environment variable
Once we finish setting up the above two environment variables, we need to add the bin folders to the PATH environment variable.
If the PATH environment variable already exists in your system, you can manually add the following two paths to it:
- %JAVA_HOME%\bin
- %HADOOP_HOME%\bin
Alternatively, you can run the following command to add them:
setx PATH "$env:PATH;$env:JAVA_HOME\bin;$env:HADOOP_HOME\bin"
Be aware that setx truncates values longer than 1024 characters, so double-check the result if your PATH is long. If you don't have other user variables set up in the system, you can also directly add a Path environment variable that references the others to keep it short:
Close the PowerShell window, open a new one, and type winutils.exe to verify that the above steps completed successfully:
You should also be able to run the following command:
hadoop -version
java version "1.8.0_161"
Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)
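As the output suggests, hadoop -version simply forwards -version to Java, which is why it prints the JDK details. To print the Hadoop release itself, run hadoop version (without the dash); the first line of its output should read Hadoop 3.3.0:

hadoop version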
Step 6 - Configure Hadoop
Now we are ready for the most important part: the Hadoop configuration, which involves the core, HDFS, MapReduce and YARN sites.
Configure core site
Edit file core-site.xml in %HADOOP_HOME%\etc\hadoop folder. For my environment, the actual path is F:\big-data\hadoop-3.3.0\etc\hadoop.
Replace the configuration element with the following:
<configuration>
   <property>
     <name>fs.default.name</name>
     <value>hdfs://0.0.0.0:19000</value>
   </property>
</configuration>
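As a side note, fs.default.name is the deprecated Hadoop 1.x name of this property; Hadoop 3 still honours it (with a deprecation warning), but the current equivalent is fs.defaultFS:

<property>
   <name>fs.defaultFS</name>
   <value>hdfs://0.0.0.0:19000</value>
</property>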
Configure HDFS
Edit file hdfs-site.xml in %HADOOP_HOME%\etc\hadoop folder.
Before editing, please create two folders in your system: one for the namenode directory and another for the data directory. For my system, I created the following two sub-folders (see the commands after the list for a quick way to create them):
- F:\big-data\data\dfs\namespace_logs_330
- F:\big-data\data\dfs\data_330
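A quick way to create both folders in one go from PowerShell (using the example paths above; adjust them to your environment):

mkdir F:\big-data\data\dfs\namespace_logs_330
mkdir F:\big-data\data\dfs\data_330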
Replace the configuration element with the following (remember to replace the highlighted paths accordingly):
<configuration>
   <property>
     <name>dfs.replication</name>
     <value>1</value>
   </property>
   <property>
     <name>dfs.namenode.name.dir</name>
     <value>file:///F:/big-data/data/dfs/namespace_logs_330</value>
   </property>
   <property>
     <name>dfs.datanode.data.dir</name>
     <value>file:///F:/big-data/data/dfs/data_330</value>
   </property>
</configuration>
In Hadoop 3, the property names are slightly different from previous versions. Refer to the official documentation (hdfs-default.xml) to learn more about the configuration properties.
Configure MapReduce and YARN site
Edit file mapred-site.xml in %HADOOP_HOME%\etc\hadoop folder.
Replace the configuration element with the following:
<configuration>
   <property>
     <name>mapreduce.framework.name</name>
     <value>yarn</value>
   </property>
   <property>
     <name>mapreduce.application.classpath</name>
     <value>%HADOOP_HOME%/share/hadoop/mapreduce/*,%HADOOP_HOME%/share/hadoop/mapreduce/lib/*,%HADOOP_HOME%/share/hadoop/common/*,%HADOOP_HOME%/share/hadoop/common/lib/*,%HADOOP_HOME%/share/hadoop/yarn/*,%HADOOP_HOME%/share/hadoop/yarn/lib/*,%HADOOP_HOME%/share/hadoop/hdfs/*,%HADOOP_HOME%/share/hadoop/hdfs/lib/*</value>
   </property>
</configuration>
Edit file yarn-site.xml in %HADOOP_HOME%\etc\hadoop folder and replace the configuration element with the following:
<configuration>
   <property>
     <name>yarn.nodemanager.aux-services</name>
     <value>mapreduce_shuffle</value>
   </property>
   <property>
     <name>yarn.nodemanager.env-whitelist</name>
     <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
   </property>
</configuration>
Step 7 - Initialise HDFS & bug fix
Run the following command in Command Prompt:
hdfs namenode -format
The following is an example when it is formatted successfully:
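You can additionally check the result on disk: a successful format creates a current sub-folder holding a VERSION file and the initial fsimage under the namenode directory configured earlier (example path; adjust to yours):

dir F:\big-data\data\dfs\namespace_logs_330\current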
Step 8 - Start HDFS daemons
Run the following command to start HDFS daemons in Command Prompt:
%HADOOP_HOME%\sbin\start-dfs.cmd

Two Command Prompt windows will open: one for the datanode and another for the namenode, as the following screenshot shows:
Verify HDFS web portal UI through this link: http://localhost:9870/dfshealth.html#tab-overview.
You can also navigate to a data node UI:
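You can also verify HDFS from the command line; with this single-node setup, the report should show one live datanode:

hdfs dfsadmin -report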
Step 9 - Start YARN daemons
Run the following command in an elevated Command Prompt window (Run as administrator) to start the YARN daemons:

%HADOOP_HOME%\sbin\start-yarn.cmd

Similarly, two Command Prompt windows will open: one for the resource manager and another for the node manager, as the following screenshot shows:

Alternatively, you can follow this comment, which doesn't require Administrator permission and uses a local Windows account:

https://kontext.tech/article/377/latest-hadoop-321-installation-on-windows-10-step-by-step-guide#comment314
You can verify YARN resource manager UI when all services are started successfully.
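From the command line, you can also list the registered node managers; a single-node setup should report one running node:

yarn node -list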
Step 10 - Verify Java processes
Run the following command to verify all running processes:
jps
The output looks like the following screenshot:
We can see the process ID of each Java process for HDFS and YARN.
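With a healthy pseudo-distributed setup, there should be one entry for each daemon started above (the process IDs below are placeholders; yours will differ):

12345 NameNode
23456 DataNode
34567 ResourceManager
45678 NodeManager
56789 Jps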
Step 11 - Shutdown YARN & HDFS daemons
You don't need to keep the services running all the time. Once you finish testing, you can stop them by running the following commands one by one:
%HADOOP_HOME%\sbin\stop-yarn.cmd
%HADOOP_HOME%\sbin\stop-dfs.cmd
Let me know if you encounter any issues. Enjoy your latest Hadoop on Windows 10!
Comments
Can you provide your system information here? The one I compiled is for x64 systems.
I'm facing the same issue. In case you want it, my PowerShell version is:
Name Value
---- -----
PSVersion 5.1.19041.1023
PSEdition Desktop
PSCompatibleVersions {1.0, 2.0, 3.0, 4.0...}
BuildVersion 10.0.19041.1023
CLRVersion 4.0.30319.42000
WSManStackVersion 3.0
PSRemotingProtocolVersion 2.3
SerializationVersion 1.1.0.1
I need your system information:
OS Name: Microsoft Windows 10 Pro
OS Version: 10.0.19043 N/A Build 19043
OS Manufacturer: Microsoft Corporation
OS Configuration: Standalone Workstation
OS Build Type: Multiprocessor Free
...
System Type: x64-based PC
Processor(s): 1 Processor(s) Installed.
[01]: Intel64 Family 6 Model 94 Stepping 3 GenuineIntel ~2601 Mhz
You can get it from the systeminfo command.
Hi,
Thanks a lot for this detailed installation guide!
I am getting this error in the yarn nodemanager and I am not sure how to fix it:
ERROR util.SysInfoWindows: ExitCodeException exitCode=1:
PdhAddCounter \Network Interface(*)\Bytes Received/Sec failed with
0xc0000bb8.
Error in GetDiskAndNetwork. Err:1
There is a suggested solution here but I do not know how to apply it:
https://github.com/apache/hadoop/pull/458
Any help would be appreciated
Hi, for the winutils I built for Hadoop 3.3.0, I didn't face any issue like that.
Can you try if the ones I published work for you?
If you want to build winutils by yourself, you need to complete these steps:
- You can follow the commit in the PR to apply changes to your local Hadoop 3.3.0 repo: YARN-8246 winutils - fix failure to retrieve disk and network perf co… by pgoron · Pull Request #458 · apache/hadoop (github.com)
- Refer to Compile and Build Hadoop 3.2.1 on Windows 10 Guide for how to build Hadoop on Windows 10. You need to change some of the dependencies (tools/frameworks) to the Hadoop 3.3.0 ones based on the BUILDING.txt requirements of the 3.3.0 release.
Hi, did you find a solution? I have the same issue.
Naseemuddin has confirmed that the special build I did for non-English Windows 10 systems works for him:
Re: Install Hadoop 3.3.0 on Windows 10 Step by Step Guide - Kontext
winutils/hadoop-3.3.0-YARN-8246/bin at master · kontext-tech/winutils · GitHub
Hi Raymond,
I have tried to follow your advice and build Hadoop according to your instructions. All I changed for 3.3.0 was to use Protocol Buffers 3.7.1 and, of course, to check out branch rel/release-3.3.0. Unfortunately, I am getting this error in the build process:
Can you help me with this?
Hi Naseemuddin,
I need to see the detailed error before I can suggest anything.
Can you run your Maven command with the -X option and paste the detailed error messages here? The detailed error message shows before the summary section.
Hi Raymond,
The last message before the summary is this:
[DEBUG] Executing command line: [bash, C:\hdp\hadoop\hadoop-project/../dev-support/bin/dist-copynativelibs, --version=3.3.0, --builddir=C:\hdp\hadoop\hadoop-project-dist\target, --artifactid=hadoop-project-dist, --isalbundle=false, --isallib=, --openssllib=, --opensslbinbundle=false, --openssllibbundle=false, --snappybinbundle=false, --snappylib=, --snappylibbundle=false, --zstdbinbundle=false, --zstdlib=, --zstdlibbundle=false]
/bin/bash: C:hdphadoophadoop-project/../dev-support/bin/dist-copynativelibs: No such file or directory
Do you think this is the main error? Why is the directory/file missing here?
Appreciate your help!
From the logs you provided, you will encounter at least the following errors:
- Your JAVA_HOME path has a space in it. I recommend installing the JDK in a path without spaces, for example C:\Java.
- C:\Users\NaseemuddinKhan\.m2\repository: the Maven home path is too long; please create a symbolic link as I mentioned in the other article about building Hadoop, otherwise some Java libraries will end up with very long file paths.
- The Windows SDK version is lower than expected.
For some reason, the log you provided still doesn't include the actual error messages.
Can you double-confirm that you are using Command Prompt to run the Maven build command?
Can you please ensure you follow exactly the steps I mentioned in the build article?
https://kontext.tech/column/hadoop/378/compile-and-build-hadoop-321-on-windows-10-guide
And also you only need to paste the detailed error messages before the summary:
Here is the debug information, but it doesn't seem to be the error message:
[DEBUG] Executing command line: [bash, C:\hdp\hadoop\hadoop-project/../dev-support/bin/dist-copynativelibs, --version=3.3.0, --builddir=C:\hdp\hadoop\hadoop-project-dist\target, --artifactid=hadoop-project-dist, --isalbundle=false, --isallib=, --openssllib=, --opensslbinbundle=false, --openssllibbundle=false, --snappybinbundle=false, --snappylib=, --snappylibbundle=false, --zstdbinbundle=false, --zstdlib=, --zstdlibbundle=false]
/bin/bash: C:hdphadoophadoop-project/../dev-support/bin/dist-copynativelibs: No such file or directory
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache Hadoop Main 3.3.0:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 1.828 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [ 6.380 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 3.688 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 3.993 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 1.303 s]
[INFO] Apache Hadoop Project Dist POM ..................... FAILURE [ 1.884 s]
[INFO] Apache Hadoop Maven Plugins ........................ SKIPPED
I think I was able to correct the path to Windows SDK 8.1. I opened the file C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\Tools\VCVarsQueryRegistry.bat and adjusted the function GetWindowsSdkDirHelper:
:GetWindowsSdkDirHelper
@for /F "tokens=1,2*" %%i in ('reg query "%1\SOFTWARE\Microsoft\Microsoft SDKs\Windows\v8.1" /v "InstallationFolder"') DO (
@if "%%i"=="InstallationFolder" (
@SET "WindowsSdkDir=%%k"
)
)
@if "%WindowsSdkDir%"=="" exit /B 1
@exit /B 0
I have just changed v7.1A to v8.1 here. In the build log it now shows:
[DEBUG] env: VSINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 10.0\
[DEBUG] env: WDIR=c:\
[DEBUG] env: WINDIR=C:\Windows
[DEBUG] env: WINDOWSSDKDIR=C:\Program Files (x86)\Windows Kits\8.1\
[DEBUG] Executing command line: [bash, C:\hdp\hadoop\hadoop-project/../dev-support/bin/dist-copynativelibs, --version=3.3.0, --builddir=C:\hdp\hadoop\hadoop-project-dist\target, --artifactid=hadoop-project-dist, --isalbundle=false, --isallib=, --openssllib=, --opensslbinbundle=false, --openssllibbundle=false, --snappybinbundle=false, --snappylib=, --snappylibbundle=false, --zstdbinbundle=false, --zstdlib=, --zstdlibbundle=false]
/bin/bash: C:hdphadoophadoop-project/../dev-support/bin/dist-copynativelibs: No such file or directory
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache Hadoop Main 3.3.0:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 1.270 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [ 3.086 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 1.513 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 1.956 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.534 s]
[INFO] Apache Hadoop Project Dist POM ..................... FAILURE [ 0.812 s]
[INFO] Apache Hadoop Maven Plugins ........................ SKIPPED
[INFO] Apache Hadoop MiniKDC .............................. SKIPPED
[INFO] Apache Hadoop Auth ................................. SKIPPED
Is WINDOWSSDKDIR here correct now? This is the content of that path:
It did not change much else...
Hello,
I've tried my best; I could only get the Hadoop 3.3.0 build to the HDFS Native Client step so far (note: winutils was built in the steps before this):
[INFO] Reactor Summary for Apache Hadoop Main 3.3.0:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [  2.203 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  3.281 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  2.297 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  3.375 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  1.219 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  3.360 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  4.453 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  1.546 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  7.265 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  3.532 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [04:23 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  9.375 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [  8.283 s]
[INFO] Apache Hadoop Registry ............................. SUCCESS [ 11.093 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.515 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 52.032 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [01:46 min]
[INFO] Apache Hadoop HDFS Native Client ................... FAILURE [01:03 min]
This is also the reason I have not published an article about building Hadoop 3.3.0 on Windows 10. The build article I published previously is for Hadoop 3.2.1, for which I could complete a successful build. I've found that it is becoming more and more difficult to build Hadoop successfully on Windows.
Thus I would not recommend spending much more time on this unless it is really important. Building it on Linux or Windows Subsystem for Linux will be much easier. Alternatively, you can build Hadoop 3.2.1 instead.
For the step where you failed, I didn't get any error:
[DEBUG] env: =C:=C:\hdp\hadoop [DEBUG] env: =EXITCODE=00000000 [DEBUG] env: ALLUSERSPROFILE=C:\ProgramData [DEBUG] env: APPDATA=C:\Users\kontext\AppData\Roaming [DEBUG] env: CLASSWORLDS_JAR="C:\maven\apache-maven-3.6.3\bin\..\boot\plexus-classworlds-2.6.0.jar" [DEBUG] env: CLASSWORLDS_LAUNCHER=org.codehaus.plexus.classworlds.launcher.Launcher [DEBUG] env: CMAKE_GENERATOR=Visual Studio 14 2015 [DEBUG] env: CMAKE_GENERATOR_PLATFORM=x64 [DEBUG] env: CMAKE_GENERATOR_TOOLSET=v140 [DEBUG] env: COMMONPROGRAMFILES=C:\Program Files\Common Files [DEBUG] env: COMMONPROGRAMFILES(X86)=C:\Program Files (x86)\Common Files [DEBUG] env: COMMONPROGRAMW6432=C:\Program Files\Common Files [DEBUG] env: COMPUTERNAME=RAYMOND-VM [DEBUG] env: COMSPEC=C:\WINDOWS\system32\cmd.exe [DEBUG] env: DEVENVDIR=C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\ [DEBUG] env: DRIVERDATA=C:\Windows\System32\Drivers\DriverData [DEBUG] env: ERROR_CODE=0 [DEBUG] env: EXEC_DIR=C:\hdp\hadoop [DEBUG] env: FRAMEWORK40VERSION=v4.0 [DEBUG] env: FRAMEWORKDIR=C:\WINDOWS\Microsoft.NET\Framework\ [DEBUG] env: FRAMEWORKDIR32=C:\WINDOWS\Microsoft.NET\Framework\ [DEBUG] env: FRAMEWORKVERSION=v4.0.30319 [DEBUG] env: FRAMEWORKVERSION32=v4.0.30319 [DEBUG] env: GIT_HOME=C:\Program Files\Git [DEBUG] env: HOMEDRIVE=C: [DEBUG] env: HOMEPATH=\Users\kontext [DEBUG] env: INCLUDE=C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\INCLUDE;C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\ATLMFC\INCLUDE;C:\Program Files (x86)\Windows Kits\10\include\10.0.10240.0\ucrt;C:\Program Files (x86)\Windows Kits\NETFXSDK\4.6.1\include\um;C:\Program Files (x86)\Windows Kits\8.1\include\\shared;C:\Program Files (x86)\Windows Kits\8.1\include\\um;C:\Program Files (x86)\Windows Kits\8.1\include\\winrt; [DEBUG] env: JAVACMD=C:\Program Files\Java\jdk1.8.0_261\bin\java.exe [DEBUG] env: JAVA_HOME=C:\Program Files\Java\jdk1.8.0_261 [DEBUG] env: JVMCONFIG=\.mvn\jvm.config [DEBUG] env: LIB=C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\LIB;C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\ATLMFC\LIB;C:\Program Files (x86)\Windows Kits\10\lib\10.0.10240.0\ucrt\x86;C:\Program Files (x86)\Windows Kits\NETFXSDK\4.6.1\lib\um\x86;C:\Program Files (x86)\Windows Kits\8.1\lib\winv6.3\um\x86; [DEBUG] env: LIBPATH=C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319;C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\LIB;C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\ATLMFC\LIB;C:\Program Files (x86)\Windows Kits\8.1\References\CommonConfiguration\Neutral;\Microsoft.VCLibs\14.0\References\CommonConfiguration\neutral; [DEBUG] env: LOCALAPPDATA=C:\Users\kontext\AppData\Local [DEBUG] env: LOGONSERVER=\\RAYMOND-VM [DEBUG] env: MAVEN_CMD_LINE_ARGS=package -Pdist -DskipTests -Dtar -Dmaven.javadoc.skip=true -X [DEBUG] env: MAVEN_HOME=C:\maven\apache-maven-3.6.3\bin\.. 
[DEBUG] env: MAVEN_PROJECTBASEDIR=C:\hdp\hadoop [DEBUG] env: MSVS=C:\Program Files (x86)\Microsoft Visual Studio 10.0 [DEBUG] env: NETFXSDKDIR=C:\Program Files (x86)\Windows Kits\NETFXSDK\4.6.1\ [DEBUG] env: NUMBER_OF_PROCESSORS=2 [DEBUG] env: ONEDRIVE=C:\Users\kontext\OneDrive [DEBUG] env: OS=Windows_NT [DEBUG] env: PATH=C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\CommonExtensions\Microsoft\TestWindow;C:\Program Files (x86)\MSBuild\14.0\bin;C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\;C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\BIN;C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\Tools;C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319;C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\VCPackages;C:\Program Files (x86)\HTML Help Workshop;C:\Program Files (x86)\Microsoft Visual Studio 14.0\Team Tools\Performance Tools;C:\Program Files (x86)\Windows Kits\8.1\bin\x86;C:\Program Files (x86)\Microsoft SDKs\Windows\v10.0A\bin\NETFX 4.6.1 Tools\;C:\Python\Python38\Scripts\;C:\Python\Python38\;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\Program Files\Git\cmd;C:\WINDOWS\System32\OpenSSH\;C:\Program Files (x86)\Windows Kits\8.1\Windows Performance Toolkit\;C:\Program Files\Microsoft SQL Server\110\Tools\Binn\;C:\Program Files\CMake\bin;C:\Users\kontext\AppData\Local\Microsoft\WindowsApps;C:\Program Files\Java\jdk1.8.0_261\bin;C:\Program Files\Git\bin;C:\Program Files\Git\usr\bin;C:\maven\apache-maven-3.6.3\bin;C:\dev\vcpkg\installed\x64-windows\tools\protobuf; [DEBUG] env: PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC;.PY;.PYW [DEBUG] env: PLATFORM=x64 [DEBUG] env: PROCESSOR_ARCHITECTURE=AMD64 [DEBUG] env: PROCESSOR_IDENTIFIER=Intel64 Family 6 Model 94 Stepping 3, GenuineIntel [DEBUG] env: PROCESSOR_LEVEL=6 [DEBUG] env: PROCESSOR_REVISION=5e03 [DEBUG] env: PROGRAMDATA=C:\ProgramData [DEBUG] env: PROGRAMFILES=C:\Program Files [DEBUG] env: PROGRAMFILES(X86)=C:\Program Files (x86) [DEBUG] env: PROGRAMW6432=C:\Program Files [DEBUG] env: PROMPT=$P$G [DEBUG] env: PSMODULEPATH=C:\WINDOWS\system32\WindowsPowerShell\v1.0\Modules\ [DEBUG] env: PUBLIC=C:\Users\Public [DEBUG] env: SESSIONNAME=Console [DEBUG] env: SYSTEMDRIVE=C: [DEBUG] env: SYSTEMROOT=C:\WINDOWS [DEBUG] env: TEMP=C:\Users\kontext\AppData\Local\Temp [DEBUG] env: TMP=C:\Users\kontext\AppData\Local\Temp [DEBUG] env: UCRTVERSION=10.0.10240.0 [DEBUG] env: UNIVERSALCRTSDKDIR=C:\Program Files (x86)\Windows Kits\10\ [DEBUG] env: USERDOMAIN=RAYMOND-VM [DEBUG] env: USERDOMAIN_ROAMINGPROFILE=RAYMOND-VM [DEBUG] env: USERNAME=kontext [DEBUG] env: USERPROFILE=C:\Users\kontext [DEBUG] env: VCINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\ [DEBUG] env: VCVARSPLAT=amd64 [DEBUG] env: VISUALSTUDIOVERSION=14.0 [DEBUG] env: VS100COMNTOOLS=C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\Tools\ [DEBUG] env: VS110COMNTOOLS=C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\Tools\ [DEBUG] env: VS120COMNTOOLS=C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\Tools\ [DEBUG] env: VS140COMNTOOLS=C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\Tools\ [DEBUG] env: VSINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 14.0\ [DEBUG] env: WDIR=C:\ [DEBUG] env: WINDIR=C:\WINDOWS [DEBUG] env: WINDOWSLIBPATH=C:\Program Files (x86)\Windows Kits\8.1\References\CommonConfiguration\Neutral [DEBUG] env: WINDOWSSDKDIR=C:\Program Files (x86)\Windows Kits\8.1\ [DEBUG] env: 
WINDOWSSDKLIBVERSION=winv6.3\ [DEBUG] env: WINDOWSSDKVERSION=\ [DEBUG] env: WINDOWSSDK_EXECUTABLEPATH_X64=C:\Program Files (x86)\Microsoft SDKs\Windows\v10.0A\bin\NETFX 4.6.1 Tools\x64\ [DEBUG] env: WINDOWSSDK_EXECUTABLEPATH_X86=C:\Program Files (x86)\Microsoft SDKs\Windows\v10.0A\bin\NETFX 4.6.1 Tools\ [DEBUG] Executing command line: [bash, C:\hdp\hadoop\hadoop-project/../dev-support/bin/dist-copynativelibs, --version=3.3.0, --builddir=C:\hdp\hadoop\hadoop-project-dist\target, --artifactid=hadoop-project-dist, --isalbundle=false, --isallib=, --openssllib=, --opensslbinbundle=false, --openssllibbundle=false, --snappybinbundle=false, --snappylib=, --snappylibbundle=false, --zstdbinbundle=false, --zstdlib=, --zstdlibbundle=false] [INFO] [INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-project-dist --- [DEBUG] Configuring mojo org.apache.maven.plugins:maven-antrun-plugin:1.7:run from plugin realm ClassRealm[plugin>org.apache.maven.plugins:maven-antrun-plugin:1.7-2093631164, parent: sun.misc.Launcher$AppClassLoader@4e25154f] [DEBUG] Configuring mojo 'org.apache.maven.plugins:maven-antrun-plugin:1.7:run' with basic configurator --> [DEBUG] (f) exportAntProperties = false [DEBUG] (f) failOnError = true [DEBUG] (f) localRepository = id: local url: file:///C:/m-repo/ layout: default snapshots: [enabled => true, update => always] releases: [enabled => true, update => always] [DEBUG] (f) pluginArtifacts = [org.apache.maven.plugins:maven-antrun-plugin:maven-plugin:1.7:, org.apache.maven:maven-plugin-api:jar:2.0.11:compile, org.apache.maven:maven-project:jar:2.0.11:compile, org.apache.maven:maven-settings:jar:2.0.11:compile, org.apache.maven:maven-profile:jar:2.0.11:compile, org.apache.maven:maven-model:jar:2.0.11:compile, org.apache.maven:maven-artifact-manager:jar:2.0.11:compile, org.apache.maven:maven-repository-metadata:jar:2.0.11:compile, org.apache.maven:maven-plugin-registry:jar:2.0.11:compile, org.codehaus.plexus:plexus-interpolation:jar:1.1:compile, org.codehaus.plexus:plexus-container-default:jar:1.0-alpha-9-stable-1:compile, junit:junit:jar:3.8.1:compile, classworlds:classworlds:jar:1.1-alpha-2:compile, org.apache.maven:maven-artifact:jar:2.0.11:compile, org.codehaus.plexus:plexus-utils:jar:2.0.5:compile, org.apache.ant:ant:jar:1.8.2:compile, org.apache.ant:ant-launcher:jar:1.8.2:compile] [DEBUG] (f) project = MavenProject: org.apache.hadoop:hadoop-project-dist:3.3.0 @ C:\hdp\hadoop\hadoop-project-dist\pom.xml [DEBUG] (f) skip = false [DEBUG] (f) target = <target if="tar"><echo file="C:\hdp\hadoop\hadoop-project-dist\target/dist-maketar.sh">cd "C:\hdp\hadoop\hadoop-project-dist\target" tar cf - hadoop-project-dist-3.3.0 | gzip > hadoop-project-dist-3.3.0.tar.gz</echo> <exec failonerror="true" dir="C:\hdp\hadoop\hadoop-project-dist\target" executable="bash"><arg line="./dist-maketar.sh"/> </exec> </target>
I feel like there might be something different between your bash and mine. For example, based on the following error message in your log, your bash is removing '\' from the path, which leads to the problem. Unfortunately, I don't know what the root cause could be:
/bin/bash: C:hdphadoophadoop-project/../dev-support/bin/dist-copynativelibs: No such file or directory
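You can reproduce that behaviour directly in Git Bash: an unquoted backslash escapes the character that follows it, so the Windows path separators disappear:

$ echo C:\hdp\hadoop
C:hdphadoop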
My bash version is:
bash --version
GNU bash, version 4.4.23(1)-release (x86_64-pc-msys)
Copyright (C) 2016 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>

This is free software; you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.
Hi Raymond, you have gotten farther than I have; I run into failures before winutils is created. Could you create winutils for me with one little change in the code, described here:
https://github.com/apache/hadoop/pull/458
That is all I need, and why I am trying to do this ... without success.
I would really appreciate it!
I've checked out the Hadoop 3.3.0 source code and manually applied the PR you mentioned, updating the systeminfo.c file to replace PdhAddCounter with PdhAddEnglishCounter.
The built winutils tools are published here:
winutils/hadoop-3.3.0-YARN-8246/bin at master · kontext-tech/winutils · GitHub
Please give it a try to see if it works.
Hi Raymond,
thanks a lot for your help!
- I have removed Java and reinstalled it to C:\Java
- I had created the symbolic link for Maven but forgot to add it into settings.xml. Now I have added it and rerun the command. I am not sure if the symbolic link is working correctly; at least it works with the cd command:
See here for my settings.xml:
However, in the logs (below) it still appears as the full path. What could be wrong?
- I have installed Windows SDK 8.1 and you can find the path C:\Program Files (x86)\Windows Kits\8.1\Windows Performance Toolkit\ in some places in the logs (below). However, the problem seems to be that WINDOWSSDKDIR is pointing to the wrong path (C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\). This variable is not listed in the system variables like the others. Do you know how to change it? I have tried this but without success: https://stackoverflow.com/questions/3599079/windowssdkdir-is-not-set-correctly-in-visual-studio-2010
- I am using the Visual Studio x64 Win64 Command Prompt, as instructed in your guide:
- I am quite sure that I followed the steps in the build article exactly; I just forgot the settings.xml. The only difference is that I am using Protocol Buffers 3.7.1 for Hadoop 3.3.0.
- The final debug message before the summary states that C:\hdp\hadoop\hadoop-project/../dev-support/bin/dist-copynativelibs is missing. It is not there in the GitHub repository. Is it required?
[DEBUG] Executing command line: [bash, C:\hdp\hadoop\hadoop-project/../dev-support/bin/dist-copynativelibs, --version=3.3.0, --builddir=C:\hdp\hadoop\hadoop-project-dist\target, --artifactid=hadoop-project-dist, --isalbundle=false, --isallib=, --openssllib=, --opensslbinbundle=false, --openssllibbundle=false, --snappybinbundle=false, --snappylib=, --snappylibbundle=false, --zstdbinbundle=false, --zstdlib=, --zstdlibbundle=false]
/bin/bash: C:hdphadoophadoop-project/../dev-support/bin/dist-copynativelibs: No such file or directory
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache Hadoop Main 3.3.0:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 1.138 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [ 2.419 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 1.093 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 1.407 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.410 s]
[INFO] Apache Hadoop Project Dist POM ..................... FAILURE [ 0.704 s]
[INFO] Apache Hadoop Maven Plugins ........................ SKIPPED
- Here is the complete log
[DEBUG] Configuring mojo 'org.codehaus.mojo:exec-maven-plugin:1.3.1:exec' with basic configurator -->
[DEBUG] (f) arguments = [C:\hdp\hadoop\hadoop-project/../dev-support/bin/dist-copynativelibs, --version=3.3.0, --builddir=C:\hdp\hadoop\hadoop-project-dist\target, --artifactid=hadoop-project-dist, --isalbundle=false, --isallib=, --openssllib=, --opensslbinbundle=false, --openssllibbundle=false, --snappybinbundle=false, --snappylib=, --snappylibbundle=false, --zstdbinbundle=false, --zstdlib=, --zstdlibbundle=false]
[DEBUG] (f) basedir = C:\hdp\hadoop\hadoop-project-dist
[DEBUG] (f) classpathScope = runtime
[DEBUG] (f) executable = bash
[DEBUG] (f) failWithEmptyArgument = true
[DEBUG] (f) failWithNullKeyOrValueInEnvironment = true
[DEBUG] (f) longClasspath = false
[DEBUG] (f) project = MavenProject: org.apache.hadoop:hadoop-project-dist:3.3.0 @ C:\hdp\hadoop\hadoop-project-dist\pom.xml
[DEBUG] (f) skip = false
[DEBUG] (f) workingDirectory = C:\hdp\hadoop\hadoop-project-dist\target
[DEBUG] (f) session = org.apache.maven.execution.MavenSession@5edf2821
[DEBUG] -- end configuration --
[DEBUG] env: =C:=C:\hdp\hadoop
[DEBUG] env: =EXITCODE=00000000
[DEBUG] env: ALLUSERSPROFILE=C:\ProgramData
[DEBUG] env: APPDATA=C:\Users\NaseemuddinKhan\AppData\Roaming
[DEBUG] env: CLASSWORLDS_JAR="C:\apache-maven-3.6.3\bin\..\boot\plexus-classworlds-2.6.0.jar"
[DEBUG] env: CLASSWORLDS_LAUNCHER=org.codehaus.plexus.classworlds.launcher.Launcher
[DEBUG] env: COMMANDPROMPTTYPE=Native
[DEBUG] env: COMMONPROGRAMFILES=C:\Program Files\Common Files
[DEBUG] env: COMMONPROGRAMFILES(X86)=C:\Program Files (x86)\Common Files
[DEBUG] env: COMMONPROGRAMW6432=C:\Program Files\Common Files
[DEBUG] env: COMPUTERNAME=ADV075
[DEBUG] env: COMSPEC=C:\Windows\system32\cmd.exe
[DEBUG] env: DRIVERDATA=C:\Windows\System32\Drivers\DriverData
[DEBUG] env: ERROR_CODE=0
[DEBUG] env: EXEC_DIR=C:\hdp\hadoop
[DEBUG] env: FRAMEWORK35VERSION=v3.5
[DEBUG] env: FRAMEWORKDIR=C:\Windows\Microsoft.NET\Framework64
[DEBUG] env: FRAMEWORKDIR64=C:\Windows\Microsoft.NET\Framework64
[DEBUG] env: FRAMEWORKVERSION=v4.0.30319
[DEBUG] env: FRAMEWORKVERSION64=v4.0.30319
[DEBUG] env: GIT_HOME=C:\Program Files\Git
[DEBUG] env: HADOOP_HOME=C:\hadoop-3.3.0
[DEBUG] env: HIVE_HOME=C:\apache-hive-3.1.2-bin
[DEBUG] env: HOMEDRIVE=C:
[DEBUG] env: HOMEPATH=\Users\NaseemuddinKhan
[DEBUG] env: INCLUDE=c:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\INCLUDE;c:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\ATLMFC\INCLUDE;C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\include;
[DEBUG] env: JAVACMD=C:\Java\jdk1.8.0_281\bin\java.exe
[DEBUG] env: JAVA_HOME=C:\Java\jdk1.8.0_281
[DEBUG] env: JVMCONFIG=\.mvn\jvm.config
[DEBUG] env: LIB=c:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\LIB\amd64;c:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\ATLMFC\LIB\amd64;C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\lib\x64;
[DEBUG] env: LIBPATH=C:\Windows\Microsoft.NET\Framework64\v4.0.30319;C:\Windows\Microsoft.NET\Framework64\v3.5;c:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\LIB\amd64;c:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\ATLMFC\LIB\amd64;
[DEBUG] env: LOCALAPPDATA=C:\Users\NaseemuddinKhan\AppData\Local
[DEBUG] env: LOGONSERVER=\\ADV075
[DEBUG] env: MAVEN_CMD_LINE_ARGS=package -Pdist -DskipTests -Dtar -Dmaven.javadoc.skip=true -X
[DEBUG] env: MAVEN_HOME=C:\apache-maven-3.6.3\bin\..
[DEBUG] env: MAVEN_PROJECTBASEDIR=C:\hdp\hadoop
[DEBUG] env: MSVS=C:\Program Files (x86)\Microsoft Visual Studio 10.0
[DEBUG] env: NUMBER_OF_PROCESSORS=8
[DEBUG] env: ONEDRIVE=C:\Users\NaseemuddinKhan\OneDrive - ADVISORI FTC GmbH
[DEBUG] env: ONEDRIVECOMMERCIAL=C:\Users\NaseemuddinKhan\OneDrive - ADVISORI FTC GmbH
[DEBUG] env: OS=Windows_NT
[DEBUG] env: PATH=c:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\BIN\amd64;C:\Windows\Microsoft.NET\Framework64\v4.0.30319;C:\Windows\Microsoft.NET\Framework64\v3.5;c:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\VCPackages;C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE;C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\Tools;C:\Program Files (x86)\HTML Help Workshop;C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin\NETFX 4.0 Tools\x64;C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin\x64;C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin;C:\Program Files (x86)\Common Files\Oracle\Java\javapath;C:\Program Files\Python39\Scripts\;C:\Program Files\Python39\;C:\oraclexe\app\oracle\product\11.2.0\server\bin;C:\oraclexe\app\oracle\product\11.2.0\server\bin;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Windows\System32\OpenSSH\;C:\Program Files\Intel\WiFi\bin\;C:\Program Files\Common Files\Intel\WirelessCommon\;C:\Program Files\PuTTY\;C:\Program Files\Docker\Docker\resources\bin;C:\ProgramData\DockerDesktop\version-bin;c:\Program Files (x86)\Intel\Intel(R) Management Engine Components\DAL;c:\Program Files\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files\Python39;C:\Java\jdk1.8.0_281\bin;C:\spark-3.0.1-bin-hadoop3.2\bin;C:\hadoop-3.3.0\bin;C:\hadoop-3.3.0\sbin;C:\apache-hive-3.1.2-bin\bin;C:\Program Files\Git\cmd;C:\Program Files\Git\bin;C:\Program Files\Git\usr\bin;C:\apache-maven-3.6.3\bin;C:\protoc-2.5.0-win32;C:\Program Files\CMake\bin;C:\Program Files (x86)\Windows Kits\8.1\Windows Performance Toolkit\;C:\Users\NaseemuddinKhan\AppData\Local\Microsoft\WindowsApps;C:\Program Files (x86)\Graphviz2.38\bin
[DEBUG] env: PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC;.PY;.PYW
[DEBUG] env: PLATFORM=X64
[DEBUG] env: PROCESSOR_ARCHITECTURE=AMD64
[DEBUG] env: PROCESSOR_IDENTIFIER=Intel64 Family 6 Model 142 Stepping 10, GenuineIntel
[DEBUG] env: PROCESSOR_LEVEL=6
[DEBUG] env: PROCESSOR_REVISION=8e0a
[DEBUG] env: PROGRAMDATA=C:\ProgramData
[DEBUG] env: PROGRAMFILES=C:\Program Files
[DEBUG] env: PROGRAMFILES(X86)=C:\Program Files (x86)
[DEBUG] env: PROGRAMW6432=C:\Program Files
[DEBUG] env: PROMPT=$P$G
[DEBUG] env: PSMODULEPATH=C:\Program Files\WindowsPowerShell\Modules;C:\Windows\system32\WindowsPowerShell\v1.0\Modules
[DEBUG] env: PUBLIC=C:\Users\Public
[DEBUG] env: SPARK_HOME=C:\spark-3.0.1-bin-hadoop3.2
[DEBUG] env: SPARK_LOCAL_IP=127.0.0.1
[DEBUG] env: SYSTEMDRIVE=C:
[DEBUG] env: SYSTEMROOT=C:\Windows
[DEBUG] env: TEMP=C:\Users\NASEEM~1\AppData\Local\Temp
[DEBUG] env: TMP=C:\Users\NASEEM~1\AppData\Local\Temp
[DEBUG] env: USERDOMAIN=AzureAD
[DEBUG] env: USERDOMAIN_ROAMINGPROFILE=AzureAD
[DEBUG] env: USERNAME=NaseemuddinKhan
[DEBUG] env: USERPROFILE=C:\Users\NaseemuddinKhan
[DEBUG] env: VBOX_MSI_INSTALL_PATH=C:\Program Files\Oracle\VirtualBox\
[DEBUG] env: VCINSTALLDIR=c:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\
[DEBUG] env: VCVARSPLAT=amd64
[DEBUG] env: VS100COMNTOOLS=C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\Tools\
[DEBUG] env: VSINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 10.0\
[DEBUG] env: WDIR=C:\
[DEBUG] env: WINDIR=C:\Windows
[DEBUG] env: WINDOWSSDKDIR=C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\
[DEBUG] Executing command line: [bash, C:\hdp\hadoop\hadoop-project/../dev-support/bin/dist-copynativelibs, --version=3.3.0, --builddir=C:\hdp\hadoop\hadoop-project-dist\target, --artifactid=hadoop-project-dist, --isalbundle=false, --isallib=, --openssllib=, --opensslbinbundle=false, --openssllibbundle=false, --snappybinbundle=false, --snappylib=, --snappylibbundle=false, --zstdbinbundle=false, --zstdlib=, --zstdlibbundle=false]
/bin/bash: C:hdphadoophadoop-project/../dev-support/bin/dist-copynativelibs: No such file or directory
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache Hadoop Main 3.3.0:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 1.138 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [ 2.419 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 1.093 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 1.407 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.410 s]
[INFO] Apache Hadoop Project Dist POM ..................... FAILURE [ 0.704 s]
Thanks a lot for your reply, but this is a bit too complicated for me. What disadvantages am I going to have if this error remains?
The namenode is one of the most important daemon services in Hadoop. If it cannot start successfully, you won't be able to perform any other actions, as the other services rely on it.
Hi,
Whenever I try to copy a file from my PC to Hadoop, it throws an error:
hadoop fs -put D:/test/file1 /test/input
-put: Can not create a Path from an empty string
I also tried -copyFromLocal:
hadoop fs -copyFromLocal D:/test/file1 /test/input
-copyFromLocal: Can not create a Path from an empty string
Same error.
Please help
Hi Bikash,
Can you please confirm whether you are using Hadoop 3.3.0 installed from this guide?
I've tried the following commands in my Hadoop 3.3.0 installation on Windows 10 and they all work well:
F:\big-data>hadoop fs -put F:\big-data\test.csv /test/csv.1

F:\big-data>hadoop fs -copyFromLocal F:\big-data\test.csv /test/csv.2

F:\big-data>hadoop fs -ls /test
Found 3 items
-rw-r--r--   1 *** supergroup         29 2021-03-06 09:42 /test/csv.1
-rw-r--r--   1 *** supergroup         29 2021-03-06 09:42 /test/csv.2
-rw-r--r--   1 *** supergroup         29 2020-08-22 22:46 /test/test.csv
Please make sure you've followed all the steps exactly.
Thanks for posting clear instructions. It failed at starting the DFS, and the first line didn't look too encouraging!
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
2021-02-12 13:06:34,694 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = DESKTOP-3UP7473/192.168.93.209
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.3.0
STARTUP_MSG: classpath = D:\big-data\hadoop-3.3.0\etc\hadoop;D:\big-data\hadoop-3.3.0\share\hadoop\common;D:\big-data\hadoop-3.3.0\share\hadoop\common\lib\accessors-smart-1.2.jar;...;D:\big-data\hadoop-3.3.0\share\hadoop\mapreduce\hadoop-mapreduce-examples-3.3.0.jar (full list of bundled jar paths omitted for brevity)
STARTUP_MSG: build = https://gitbox.apache.org/repos/asf/hadoop.git -r aa96f1871bfd858f9bac59cf2a81ec470da649af; compiled by 'brahma' on 2020-07-06T18:44Z
STARTUP_MSG: java = 1.8.0_281
************************************************************/
2021-02-12 13:06:34,768 INFO namenode.NameNode: createNameNode []
2021-02-12 13:06:34,860 INFO impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2021-02-12 13:06:34,963 INFO impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2021-02-12 13:06:34,963 INFO impl.MetricsSystemImpl: NameNode metrics system started
2021-02-12 13:06:35,138 INFO namenode.NameNodeUtils: fs.defaultFS is hdfs://0.0.0.0:19000
2021-02-12 13:06:35,139 INFO namenode.NameNode: Clients should use 0.0.0.0:19000 to access this namenode/service.
2021-02-12 13:06:35,162 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2021-02-12 13:06:35,242 INFO util.JvmPauseMonitor: Starting JVM pause monitor
2021-02-12 13:06:35,257 INFO hdfs.DFSUtil: Filter initializers set : org.apache.hadoop.http.lib.StaticUserWebFilter,org.apache.hadoop.hdfs.web.AuthFilterInitializer
2021-02-12 13:06:35,261 INFO hdfs.DFSUtil: Starting Web-server for hdfs at: http://0.0.0.0:9870
2021-02-12 13:06:35,269 INFO util.log: Logging initialized @887ms to org.eclipse.jetty.util.log.Slf4jLog
2021-02-12 13:06:35,334 INFO server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2021-02-12 13:06:35,456 INFO http.HttpRequestLog: Http request log for http.requests.namenode is not defined
2021-02-12 13:06:35,463 INFO http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2021-02-12 13:06:35,464 INFO http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
2021-02-12 13:06:35,464 INFO http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2021-02-12 13:06:35,465 INFO http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2021-02-12 13:06:35,467 INFO http.HttpServer2: Added filter AuthFilter (class=org.apache.hadoop.hdfs.web.AuthFilter) to context hdfs
2021-02-12 13:06:35,467 INFO http.HttpServer2: Added filter AuthFilter (class=org.apache.hadoop.hdfs.web.AuthFilter) to context logs
2021-02-12 13:06:35,467 INFO http.HttpServer2: Added filter AuthFilter (class=org.apache.hadoop.hdfs.web.AuthFilter) to context static
2021-02-12 13:06:35,485 INFO http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
2021-02-12 13:06:35,491 INFO http.HttpServer2: Jetty bound to port 9870
2021-02-12 13:06:35,492 INFO server.Server: jetty-9.4.20.v20190813; built: 2019-08-13T21:28:18.144Z; git: 84700530e645e812b336747464d6fbbf370c9a20; jvm 1.8.0_281-b09
2021-02-12 13:06:35,510 INFO server.session: DefaultSessionIdManager workerName=node0
2021-02-12 13:06:35,510 INFO server.session: No SessionScavenger set, using defaults
2021-02-12 13:06:35,511 INFO server.session: node0 Scavenging every 600000ms
2021-02-12 13:06:35,520 INFO server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2021-02-12 13:06:35,522 INFO handler.ContextHandler: Started o.e.j.s.ServletContextHandler@4d1c005e{logs,/logs,file:///D:/big-data/hadoop-3.3.0/logs/,AVAILABLE}
2021-02-12 13:06:35,522 INFO handler.ContextHandler: Started o.e.j.s.ServletContextHandler@59402b8f{static,/static,file:///D:/big-data/hadoop-3.3.0/share/hadoop/hdfs/webapps/static/,AVAILABLE}
2021-02-12 13:06:35,564 INFO util.TypeUtil: JVM Runtime does not support Modules
2021-02-12 13:06:35,570 INFO handler.ContextHandler: Started o.e.j.w.WebAppContext@1130520d{hdfs,/,file:///D:/big-data/hadoop-3.3.0/share/hadoop/hdfs/webapps/hdfs/,AVAILABLE}{file:/D:/big-data/hadoop-3.3.0/share/hadoop/hdfs/webapps/hdfs}
2021-02-12 13:06:35,577 INFO server.AbstractConnector: Started ServerConnector@d554c5f{HTTP/1.1,[http/1.1]}{0.0.0.0:9870}
2021-02-12 13:06:35,577 INFO server.Server: Started @1196ms
2021-02-12 13:06:35,872 WARN namenode.FSNamesystem: Only one image storage directory (dfs.namenode.name.dir) configured. Beware of data loss due to lack of redundant storage directories!
2021-02-12 13:06:35,872 WARN namenode.FSNamesystem: Only one namespace edits storage directory (dfs.namenode.edits.dir) configured. Beware of data loss due to lack of redundant storage directories!
2021-02-12 13:06:35,903 INFO namenode.FSEditLog: Edit logging is async:true
2021-02-12 13:06:35,921 INFO namenode.FSNamesystem: KeyProvider: null
2021-02-12 13:06:35,922 INFO namenode.FSNamesystem: fsLock is fair: true
2021-02-12 13:06:35,922 INFO namenode.FSNamesystem: Detailed lock hold time metrics enabled: false
2021-02-12 13:06:35,927 INFO namenode.FSNamesystem: fsOwner = carlr (auth:SIMPLE)
2021-02-12 13:06:35,927 INFO namenode.FSNamesystem: supergroup = supergroup
2021-02-12 13:06:35,927 INFO namenode.FSNamesystem: isPermissionEnabled = true
2021-02-12 13:06:35,927 INFO namenode.FSNamesystem: isStoragePolicyEnabled = true
2021-02-12 13:06:35,928 INFO namenode.FSNamesystem: HA Enabled: false
2021-02-12 13:06:35,952 INFO common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2021-02-12 13:06:35,957 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit: configured=1000, counted=60, effected=1000
2021-02-12 13:06:35,957 INFO blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
2021-02-12 13:06:35,960 INFO blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
2021-02-12 13:06:35,960 INFO blockmanagement.BlockManager: The block deletion will start around 2021 Feb 12 13:06:35
2021-02-12 13:06:35,961 INFO util.GSet: Computing capacity for map BlocksMap
2021-02-12 13:06:35,961 INFO util.GSet: VM type = 64-bit
2021-02-12 13:06:35,963 INFO util.GSet: 2.0% max memory 889 MB = 17.8 MB
2021-02-12 13:06:35,963 INFO util.GSet: capacity = 2^21 = 2097152 entries
2021-02-12 13:06:35,968 INFO blockmanagement.BlockManager: Storage policy satisfier is disabled
2021-02-12 13:06:35,969 INFO blockmanagement.BlockManager: dfs.block.access.token.enable = false
2021-02-12 13:06:35,973 INFO blockmanagement.BlockManagerSafeMode: dfs.namenode.safemode.threshold-pct = 0.999
2021-02-12 13:06:35,973 INFO blockmanagement.BlockManagerSafeMode: dfs.namenode.safemode.min.datanodes = 0
2021-02-12 13:06:35,974 INFO blockmanagement.BlockManagerSafeMode: dfs.namenode.safemode.extension = 30000
2021-02-12 13:06:35,975 INFO blockmanagement.BlockManager: defaultReplication = 1
2021-02-12 13:06:35,975 INFO blockmanagement.BlockManager: maxReplication = 512
2021-02-12 13:06:35,975 INFO blockmanagement.BlockManager: minReplication = 1
2021-02-12 13:06:35,976 INFO blockmanagement.BlockManager: maxReplicationStreams = 2
2021-02-12 13:06:35,976 INFO blockmanagement.BlockManager: redundancyRecheckInterval = 3000ms
2021-02-12 13:06:35,976 INFO blockmanagement.BlockManager: encryptDataTransfer = false
2021-02-12 13:06:35,977 INFO blockmanagement.BlockManager: maxNumBlocksToLog = 1000
2021-02-12 13:06:35,995 INFO namenode.FSDirectory: GLOBAL serial map: bits=29 maxEntries=536870911
2021-02-12 13:06:35,995 INFO namenode.FSDirectory: USER serial map: bits=24 maxEntries=16777215
2021-02-12 13:06:35,996 INFO namenode.FSDirectory: GROUP serial map: bits=24 maxEntries=16777215
2021-02-12 13:06:35,996 INFO namenode.FSDirectory: XATTR serial map: bits=24 maxEntries=16777215
2021-02-12 13:06:36,006 INFO util.GSet: Computing capacity for map INodeMap
2021-02-12 13:06:36,006 INFO util.GSet: VM type = 64-bit
2021-02-12 13:06:36,007 INFO util.GSet: 1.0% max memory 889 MB = 8.9 MB
2021-02-12 13:06:36,007 INFO util.GSet: capacity = 2^20 = 1048576 entries
2021-02-12 13:06:36,007 INFO namenode.FSDirectory: ACLs enabled? true
2021-02-12 13:06:36,008 INFO namenode.FSDirectory: POSIX ACL inheritance enabled? true
2021-02-12 13:06:36,008 INFO namenode.FSDirectory: XAttrs enabled? true
2021-02-12 13:06:36,008 INFO namenode.NameNode: Caching file names occurring more than 10 times
2021-02-12 13:06:36,012 INFO snapshot.SnapshotManager: Loaded config captureOpenFiles: false, skipCaptureAccessTimeOnlyChange: false, snapshotDiffAllowSnapRootDescendant: true, maxSnapshotLimit: 65536
2021-02-12 13:06:36,013 INFO snapshot.SnapshotManager: SkipList is disabled
2021-02-12 13:06:36,016 INFO util.GSet: Computing capacity for map cachedBlocks
2021-02-12 13:06:36,016 INFO util.GSet: VM type = 64-bit
2021-02-12 13:06:36,017 INFO util.GSet: 0.25% max memory 889 MB = 2.2 MB
2021-02-12 13:06:36,017 INFO util.GSet: capacity = 2^18 = 262144 entries
2021-02-12 13:06:36,039 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.window.num.buckets = 10
2021-02-12 13:06:36,039 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.num.users = 10
2021-02-12 13:06:36,040 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
2021-02-12 13:06:36,042 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
2021-02-12 13:06:36,042 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
2021-02-12 13:06:36,044 INFO util.GSet: Computing capacity for map NameNodeRetryCache
2021-02-12 13:06:36,044 INFO util.GSet: VM type = 64-bit
2021-02-12 13:06:36,044 INFO util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
2021-02-12 13:06:36,045 INFO util.GSet: capacity = 2^15 = 32768 entries
2021-02-12 13:06:36,055 ERROR namenode.NameNode: Failed to start namenode.
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:793)
at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:1240)
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:690)
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:642)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:386)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:242)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:1197)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:779)
at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:673)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:760)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:1014)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:987)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1756)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1821)
2021-02-12 13:06:36,057 INFO util.ExitUtil: Exiting with status 1: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
2021-02-12 13:06:36,059 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at DESKTOP-3UP7473/192.168.93.209
************************************************************/
C:\Windows\system32>
Have you downloaded the winutils native libraries (Step 3 of this article)? It seems it is not there or is not the correct version for your system.
Can you run the following command:
winutils
It should print out the following text:
Usage: winutils [command] ...
Provide basic command line utilities for Hadoop on Windows.
The available commands and their usages are:

chmod          Change file mode bits.
               Usage: chmod [OPTION] OCTAL-MODE [FILE]
                  or: chmod [OPTION] MODE [FILE]
               Change the mode of the FILE to MODE.
               -R: change files and directories recursively
               Each MODE is of the form '[ugoa]*([-+=]([rwxX]*|[ugo]))+'.

chown          Change file owner.
               Usage: chown [OWNER][:[GROUP]] [FILE]
               Change the owner and/or group of the FILE to OWNER and/or GROUP.
               Note: On Linux, if a colon but no group name follows the user name,
               the group of the files is changed to that user's login group.
               Windows has no concept of a user's login group. So we do not change
               the group owner in this case.

groups         List user groups.
.....
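If winutils prints that usage text but the NameNode still fails with the UnsatisfiedLinkError above, the usual culprit is hadoop.dll rather than winutils.exe. Below is a minimal PowerShell sketch to verify that both native binaries exist and are 64-bit builds; it assumes HADOOP_HOME is set as in Step 5, and the PE machine-type check is a generic Windows technique, not part of Hadoop:

Test-Path "$env:HADOOP_HOME\bin\winutils.exe"   # should print True
Test-Path "$env:HADOOP_HOME\bin\hadoop.dll"     # should print True

# Read the PE header machine type: 8664 means 64-bit (x64), 014C means 32-bit (x86).
$bytes = [System.IO.File]::ReadAllBytes("$env:HADOOP_HOME\bin\hadoop.dll")
$peOffset = [BitConverter]::ToInt32($bytes, 0x3C)
'{0:X4}' -f [BitConverter]::ToUInt16($bytes, $peOffset + 4)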
Yes I did. I think the issue is that I'm on Windows Home.
I have not tried installing Hadoop on Windows 10 Home, so I'm not sure whether that is the case or not.
Technically, as long as you can run Java applications and your system is 64-bit (instead of 32-bit), the instructions I provided should work.
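If you want to check both of those prerequisites quickly, here is a minimal PowerShell sketch (exact output will vary by machine):

[Environment]::Is64BitOperatingSystem   # True on 64-bit Windows, including Home edition
java -version                           # a 64-bit JDK prints "64-Bit Server VM"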
Hi all, thanks very much for this guide!!
Just for your information:
1) Install the JDK in a folder without spaces (e.g. c:\java\jdk1.8.0_271), otherwise it doesn't work.
2) In the Path variable use \bin (/bin doesn't work).
3) Download winutils from GitHub; don't use the PS script (it doesn't work).
4) YARN NodeManager returns an error:
2020-11-10 17:02:12,840 ERROR util.SysInfoWindows: ExitCodeException exitCode=1: PdhAddCounter \Network Interface(*)\Bytes Received/Sec failed with 0xc0000bb8.
Error in GetDiskAndNetwork. Err:1
at org.apache.hadoop.util.Shell.run(Shell.java:901)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:1213)
at org.apache.hadoop.util.SysInfoWindows.getSystemInfoInfoFromShell(SysInfoWindows.java:86)
at org.apache.hadoop.util.SysInfoWindows.refreshIfNeeded(SysInfoWindows.java:101)
at org.apache.hadoop.util.SysInfoWindows.getNumVCoresUsed(SysInfoWindows.java:212)
at org.apache.hadoop.yarn.util.ResourceCalculatorPlugin.getNumVCoresUsed(ResourceCalculatorPlugin.java:138)
at org.apache.hadoop.yarn.server.nodemanager.NodeResourceMonitorImpl$MonitoringThread.run(NodeResourceMonitorImpl.java:148)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:1008)
Hi Cosmo,
Did you finish the installation successfully?
The PS download should work if you follow the steps exactly. It may not work if there is a network proxy, etc. In that scenario, please download directly from GitHub.
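As for the NodeManager error: PdhAddCounter failing with 0xc0000bb8 usually indicates corrupted Windows performance counters rather than a Hadoop problem. A commonly suggested fix, which I have not verified on every system so please treat it as a hint, is to rebuild the counters from an elevated Command Prompt and then restart the YARN daemons:

lodctr /r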
I am new to Hadoop. I tried Hadoop 3.3.0 and Hadoop 3.2.1 with the winutils builds. The version of winutils is correct. But when I run a simple Spark application, I get the error: Cannot run program "Hadoop\hadoop-3.2.1\bin\winutils.exe". CreateProcess error=216, This version of %1 is not compatible with Windows.
Do you know how to solve it? Thank you!
Did you follow this installation guide to install Hadoop 3.3.0 on Windows 10? I would suggest you follow this guide to avoid any unexpected errors. This guide was tested in a brand new Windows 10 environment.
From the error message, it looks like the winutils.exe program is not compatible with your Windows. Can you run winutils.exe directly in Command Prompt or PowerShell?
Is your Windows 32-bit, and is your winutils.exe 64-bit?
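For reference, CreateProcess error=216 is the Windows ERROR_EXE_MACHINE_TYPE_MISMATCH code, meaning the executable's bitness does not match your OS. You can print the message for any such error code yourself:

net helpmsg 216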
After running winutils.exe, I get the error:
Program 'winutils.exe' failed to run: The specified executable is not a valid application for this OS platform.
At line:1 char:1
+ .\winutils.exe
+ ~~~~~~~~~~~~~~
At line:1 char:1
+ .\winutils.exe
+ ~~~~~~~~~~~~~~
    + CategoryInfo          : ResourceUnavailable: (:) [], ApplicationFailedException
    + FullyQualifiedErrorId : NativeCommandFailed