
Install Spark 2.2.1 in Windows

By Raymond Tang, last modified about 11 months ago


This page summarizes the steps to install Spark 2.2.1 in your Windows environment.

Tools and Environment

  • Git Bash
  • Command Prompt
  • Windows 10

Download Binary Package

Download the Spark 2.2.1 binary package (pre-built for Apache Hadoop 2.7) from the following site:

https://spark.apache.org/downloads.html

In my case, I saved the file to the folder F:\DataAnalytics.
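Alternatively, the package can be fetched from the command line. A minimal sketch, assuming the archive.apache.org mirror (which retains older releases such as 2.2.1 after the primary mirrors drop them):

```shell
# Build the download URL from the version numbers used in this guide.
SPARK_VERSION=2.2.1
HADOOP_VERSION=2.7
PACKAGE="spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz"
URL="https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/${PACKAGE}"
echo "$URL"
# To actually download, run from Git Bash in the target folder:
# curl -L -O "$URL"
```

curl ships with Git for Windows, so no extra tooling is needed.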

Unzip Binary Package

Open Git Bash, change directory (cd) to the folder where you saved the binary package, and then unzip it:

$ cd /f/DataAnalytics

fahao@Raymond-Alienware MINGW64 /f/DataAnalytics
$ tar -xvzf spark-2.2.1-bin-hadoop2.7.tgz

In my case, Spark is extracted to F:\DataAnalytics\spark-2.2.1-bin-hadoop2.7.
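The tar flags used above are x (extract), v (verbose listing), z (gunzip) and f (archive file). They can be tried safely on a throwaway archive:

```shell
# Create a tiny gzipped tarball, then extract it with the same flags
# used for the Spark package above.
mkdir -p demo && echo hello > demo/file.txt
tar -czf demo.tgz demo        # c = create, z = gzip, f = archive file
rm -rf demo                   # remove the original so extraction is visible
tar -xvzf demo.tgz            # x = extract, v = list files as they unpack
cat demo/file.txt
```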

Setup environment variables

JAVA_HOME

Follow the section ‘JAVA_HOME environment variable’ on the following page to set up JAVA_HOME:

http://kontext.tech/docs/DataAndBusinessIntelligence/p/install-zeppelin-073-in-windows

SPARK_HOME

Set up a SPARK_HOME environment variable with the value of your Spark installation directory.


PATH

Add ‘%SPARK_HOME%\bin’ to your PATH environment variable.
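In Git Bash the same variables can also be set for the current session only, which is handy for testing before making the permanent change in the Windows Environment Variables dialog. A sketch using the installation path from this guide (adjust to yours):

```shell
# Session-only setup in Git Bash; the permanent equivalent is the Windows
# "Environment Variables" dialog (or `setx` in Command Prompt).
export SPARK_HOME="/f/DataAnalytics/spark-2.2.1-bin-hadoop2.7"
export PATH="$SPARK_HOME/bin:$PATH"
echo "$SPARK_HOME"
```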

Verify the installation

Verify command

Run the following command in Command Prompt to verify the installation.

%SPARK_HOME%\bin\spark-shell

If the installation is successful, spark-shell starts and prints the Spark logo and version banner.
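Beyond watching the shell start, a one-liner can confirm that the local Spark context actually schedules work. A sketch: the piped command assumes the installation above and is shown commented out; the active line simply computes the value the job should print, for comparison.

```shell
# Smoke test: sum 1..100 on the local Spark context.
# echo 'println(sc.parallelize(1 to 100).sum)' | "$SPARK_HOME/bin/spark-shell"
# The job should report 5050 (printed as the Double 5050.0).
echo $((100 * 101 / 2))
```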

Run examples

Execute the following command in Command Prompt to run one of the examples shipped with the Spark installation (class SparkPi with parameter 10). More examples are described in the Spark documentation:

https://spark.apache.org/docs/latest/

%SPARK_HOME%\bin\run-example.cmd SparkPi 10

Among the log output you should see a line similar to ‘Pi is roughly 3.14’, the approximation computed by the example.
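run-example.cmd is a convenience wrapper: under the hood it submits the bundled examples jar through spark-submit. A hedged sketch of the equivalent direct call; the jar name assumes the stock Scala 2.11 build of Spark 2.2.1, so verify it against your examples\jars folder:

```shell
# Class and jar used by the SparkPi example (names assume Spark 2.2.1's
# default Scala 2.11 build; check $SPARK_HOME/examples/jars on your machine).
EXAMPLE_CLASS="org.apache.spark.examples.SparkPi"
EXAMPLE_JAR="spark-examples_2.11-2.2.1.jar"
echo "$EXAMPLE_CLASS $EXAMPLE_JAR"
# Equivalent direct submission (requires the installation above):
# "$SPARK_HOME/bin/spark-submit" --class "$EXAMPLE_CLASS" \
#     "$SPARK_HOME/examples/jars/$EXAMPLE_JAR" 10
```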

Spark context UI

As printed in the console output, the Spark context Web UI is available at http://172.24.144.1:4040 (the host depends on your machine; the default port is 4040).


Spark developer tools

Refer to the following page if you are interested in any Spark developer tools.

https://spark.apache.org/developer-tools.html


