
Are you a Windows/.NET developer who wants to learn big data concepts and tools on Windows?

If so, follow the links below to install them on your PC. The installations are usually easier on Linux/UNIX, but they are not difficult on Windows either, since the tools are Java-based.

Installation guides

All the following guides are based on Windows 10. The steps should be the same in other Windows environments, though some of the screenshots may differ.

Install Zeppelin 0.7.3 in Windows

Install Hadoop 3.0.0 in Windows (Single Node)

Install Spark 2.2.1 in Windows

Install Apache Sqoop in Windows

Configure Hadoop 3.1.0 in a Multi Node Cluster

Apache Hive 3.0.0 Installation on Windows 10 Step by Step Guide

Learning tutorials

Use Hadoop File System Task in SSIS to Write File into HDFS
Invoke Hadoop WebHDFS APIs in .NET Core

Write and Read Parquet Files in Spark/Scala

Write and Read Parquet Files in HDFS through Spark/Scala

Convert String to Date in Spark (Scala)

Read Text File from Hadoop in Zeppelin through Spark Context

Connecting Apache Zeppelin to your SQL Server

Load Data into HDFS from SQL Server via Sqoop

Default Ports Used by Hadoop Services (HDFS, MapReduce, YARN)

Connect to SQL Server in Spark (PySpark)

Implement SCD Type 2 Full Merge via Spark Data Frames

Password Security Solution for Sqoop

PySpark: Convert JSON String Column to Array of Object (StructType) in Data Frame

Spark - Save DataFrame to Hive Table

Copy Files from Hadoop HDFS to Local

Data Partitioning in Spark (PySpark) In-depth Walkthrough

Data Partitioning Functions in Spark (PySpark) Deep Dive

Read Data from Hive in Spark 1.x and 2.x

Get the Current Spark Context Settings/Configurations

PySpark - Fix PermissionError: [WinError 5] Access is denied

Configure a SQL Server Database as Remote Hive Metastore

Connect to Hive via HiveServer2 JDBC Driver

I will be constantly updating this blog with tutorials. Feel free to subscribe to this blog (RSS).

Last modified by Raymond 9 months ago.

Comments (2)

Rajesh, 9 months ago
Re: Install Big Data Tools (Spark, Zeppelin, Hadoop) in Windows for Learning and Practice

Can we load data from a text file into HDFS, do calculations/corrections, and then load the data into a SQL Server DB table?

Raymond, 9 months ago

Yes, you can load your text file into HDFS via the CLI, the WebHDFS API, or any other tool or program that supports it. You can then do transformations using tools like Apache Beam, Spark, or notebooks (Zeppelin or Jupyter). These tools can then write into a SQL Server database through ODBC/JDBC or native SQL Server drivers.
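The flow discussed in the comments (load a text file into HDFS, correct the data, then load it into SQL Server) can be sketched in miniature. The snippet below illustrates only the middle "calculations/corrections" step in plain Python on hypothetical sample records; in a real pipeline this logic would run in Spark or Beam over data read from HDFS, and the cleaned rows would be written to SQL Server via JDBC/ODBC or Sqoop.

```python
def correct_record(line):
    """Parse one comma-separated line and fix obvious data issues.

    Hypothetical format: "name, amount". Normalizes name casing and
    defaults a missing amount to 0 so every row is loadable downstream.
    """
    name, amount = line.split(",")
    name = name.strip().title()            # normalize casing, trim spaces
    amount = float(amount.strip() or 0)    # empty amount -> 0.0
    return (name, amount)

# Sample lines standing in for a text file pulled from HDFS.
raw_lines = ["  alice , 10.5", "BOB,  ", "carol,3"]

cleaned = [correct_record(line) for line in raw_lines]
print(cleaned)  # rows now ready to insert into a SQL Server table
```

In Spark the same correction would typically be expressed as a `map` over an RDD or a set of DataFrame column expressions, followed by `DataFrame.write.jdbc(...)` to push the result into SQL Server.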
