Install and Run Kafka 2.6.0 On Windows 10


Kafka is a distributed event streaming platform that can be used for high-performance streaming analytics, asynchronous event processing and reliable applications. This article provides step-by-step guidance for installing Kafka on Windows 10 for testing and learning purposes.

Install Git Bash

Download Git Bash from https://git-scm.com/downloads and then install it. We will use it to unzip the Kafka binary package. If you have 7-Zip or other unzip software, this step is not required.

Install Java JDK

Java JDK is required to run Kafka. If you have not installed Java JDK, please install it.

1) You can install JDK 8 from the following page:

https://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html

2) Setup environment variable

Let's configure the JAVA_HOME environment variable.

First, we need to find out the location of the Java JDK. In my system, the path is: D:\Java\jdk1.8.0_161.

Your location can be different depending on where you installed your JDK.

Then run the following command in a PowerShell window:

SETX JAVA_HOME "D:\Java\jdk1.8.0_161" 

Remember to quote the path if you have spaces in your JDK path.

3) Add the Java bin folder to the PATH system variable.
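For example, you can append the JDK bin folder to the PATH variable from a Command Prompt window. The JDK path below matches my setup, so adjust it to yours; also note that SETX truncates values longer than 1024 characters, so use the System Properties dialog instead if your PATH is already long:

SETX PATH "%PATH%;D:\Java\jdk1.8.0_161\bin"

Open a new window afterwards so that the updated PATH takes effect.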


4) Verify java command

Once you complete the installation, please run the following command in PowerShell or Git Bash to verify:

$ java -version
java version "1.8.0_161"
Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)

Download Kafka binary package

1) Go to the Kafka download portal and select a version. For this tutorial, the Scala 2.13 build kafka_2.13-2.6.0.tgz is downloaded.

2) Unzip the binary package to an installation folder.

Now we need to unpack the downloaded package using a GUI tool (like 7-Zip) or the command line. I will use Git Bash to unpack it.

Open Git Bash and change the directory to the destination folder:

cd F:/big-data

Then run the following command to unzip it:

tar -xvzf kafka_2.13-2.6.0.tgz

Most of the scripts that we need to run in the following steps are located in the bin/windows folder.

3) Set up the KAFKA_HOME environment variable.

Let's add an environment variable KAFKA_HOME so that we can easily reference it in the following steps.

Remember to change the variable value based on your environment setup, as shown below.
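For example, since the package was extracted to F:\big-data above, the following command in a PowerShell or Command Prompt window sets the variable (adjust the path to your own installation folder):

SETX KAFKA_HOME "F:\big-data\kafka_2.13-2.6.0"

Open a new Command Prompt window afterwards so that %KAFKA_HOME% resolves in the commands below.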

Start Kafka environment

1) Open a Command Prompt window and start the ZooKeeper service by running this command:

%KAFKA_HOME%/bin/windows/zookeeper-server-start.bat %KAFKA_HOME%/config/zookeeper.properties

In this version, ZooKeeper is still required.

2) Start the Kafka server

Open another Command Prompt window and run the following command:

%KAFKA_HOME%/bin/windows/kafka-server-start.bat %KAFKA_HOME%/config/server.properties

Once all the services are launched, you will have a Kafka environment ready to use.

You can verify the running services with the jps command (included with the JDK):
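The exact process IDs will differ, but the output should include a Kafka process and a QuorumPeerMain process (ZooKeeper), roughly like this:

12345 QuorumPeerMain
23456 Kafka
34567 Jps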

Let's run some tests against the Kafka environment.

Create a Kafka topic

Open another Command Prompt window and run the following command:

%KAFKA_HOME%/bin/windows/kafka-topics.bat --create --topic kontext-events --bootstrap-server localhost:9092

The command creates a topic named kontext-events.
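If you want to control the partition count and replication factor explicitly rather than relying on the broker defaults, you can pass them when creating a topic, for example:

%KAFKA_HOME%/bin/windows/kafka-topics.bat --create --topic kontext-events --partitions 1 --replication-factor 1 --bootstrap-server localhost:9092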

Describe Kafka topic

Run the following command to describe the created topic.

%KAFKA_HOME%/bin/windows/kafka-topics.bat --describe --topic kontext-events --bootstrap-server localhost:9092

The output looks like the following:

Topic: kontext-events   PartitionCount: 1       ReplicationFactor: 1    Configs: segment.bytes=1073741824
        Topic: kontext-events   Partition: 0    Leader: 0       Replicas: 0     Isr: 0
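You can also list all topics on the broker to confirm the topic exists:

%KAFKA_HOME%/bin/windows/kafka-topics.bat --list --bootstrap-server localhost:9092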

Write some events into the topic

Let's start to write some events into the topic by running the following command:

%KAFKA_HOME%/bin/windows/kafka-console-producer.bat --topic kontext-events --bootstrap-server localhost:9092

Each line represents an event. Let's type in some events:
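For example, type a few sample messages like the following (the > prompt is printed by the producer; the message text is just an illustration):

>Kafka installed successfully on Windows 10
>This is the second event
>This is the third event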

Press Ctrl + C to terminate the console producer client.

Read the events in the topic

Let's read the events by running the following command:

%KAFKA_HOME%/bin/windows/kafka-console-consumer.bat --topic kontext-events --from-beginning --bootstrap-server localhost:9092

The console consumer client prints out the events published earlier.
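If you typed the sample messages above, the output would look like this:

Kafka installed successfully on Windows 10
This is the second event
This is the third event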

Press Ctrl + C to terminate the consumer client. 

Shutdown Kafka services

After finishing the practice, you can shut down the services by running the following commands:

%KAFKA_HOME%/bin/windows/kafka-server-stop.bat %KAFKA_HOME%/config/server.properties
%KAFKA_HOME%/bin/windows/zookeeper-server-stop.bat %KAFKA_HOME%/config/zookeeper.properties


Summary

As you can see, it is easy to configure and run Kafka on Windows 10. Stay tuned: more articles about streaming analytics with Kafka will be published in this column.
