Tag - pyspark

lite-log spark pyspark

Fix PySpark TypeError: field **: **Type can not accept object ** in type <class '*'>

130   0   about 2 months ago

When creating Spark data frames with explicit schemas, you may encounter errors of the form “field **: **Type can not accept object ** in type <class '*'>”. The exact error varies; for instance: field xxx: BooleanType can not accept object 100 in type ...
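The excerpt is cut off before the failing code; below is a minimal sketch (not taken from the linked article) of how the mismatch typically arises and how to fix it, assuming a local SparkSession and a hypothetical boolean field named active:

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, BooleanType

spark = SparkSession.builder.appName("SchemaTypeMismatch").getOrCreate()

# The schema declares 'active' as BooleanType, but the data supplies an int (100), which triggers:
# "field active: BooleanType can not accept object 100 in type <class 'int'>"
schema = StructType([
    StructField("name", StringType(), True),
    StructField("active", BooleanType(), True),
])

bad_data = [("Alice", 100)]   # 100 is not a bool
# df = spark.createDataFrame(bad_data, schema)   # raises the TypeError described above

# Fix: make the Python value match the declared type
df = spark.createDataFrame([("Alice", True)], schema)
df.show()
```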

python spark pyspark

PySpark: Convert Python Array/List to Spark Data Frame

233   0   about 2 months ago

In Spark, the SparkContext.parallelize function can be used to convert a Python list to an RDD, which can then be converted to a DataFrame object. The following sample code is based on Spark 2.x. In this page, I am going to show you how to convert the following list to a data frame: data = [(...
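As a quick illustration of the approach described above (a sketch with made-up sample data, not the article's exact code):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ListToDataFrame").getOrCreate()

# Hypothetical sample data: a list of (category, amount) tuples
data = [("Category A", 100), ("Category B", 200)]

# Convert the Python list to an RDD, then to a DataFrame
rdd = spark.sparkContext.parallelize(data)
df = rdd.toDF(["category", "amount"])

# createDataFrame can also take the list directly
df2 = spark.createDataFrame(data, ["category", "amount"])

df.show()
```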

teradata spark pyspark

Load Data from Teradata in Spark (PySpark)

251   0   about 2 months ago

In my article Connect to Teradata database through Python, I demonstrated how to use the Teradata Python package or the Teradata ODBC driver to connect to Teradata. In this article, I’m going to...
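The excerpt ends before the Spark code; a generic JDBC read sketch is shown below, assuming the Teradata JDBC driver jars are on the Spark classpath and with host, database, table, and credentials as placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("LoadFromTeradata").getOrCreate()

# Host, database, table and credentials below are placeholders
df = (spark.read
      .format("jdbc")
      .option("url", "jdbc:teradata://myhost/DATABASE=mydb")
      .option("driver", "com.teradata.jdbc.TeraDriver")
      .option("dbtable", "mydb.mytable")
      .option("user", "myuser")
      .option("password", "mypassword")
      .load())

df.show(10)
```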

python spark hadoop pyspark

Read Hadoop Credential in PySpark

107   0   about 2 months ago

In one of my previous articles about Password Security Solution for Sqoop, I mentioned creating credentials using the hadoop credential command. The credentials are stored in JavaKey...
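The excerpt is truncated; one common way to read such a credential from PySpark is through the underlying Hadoop Configuration, sketched below with a placeholder provider path and alias (this may differ from the article's approach):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ReadHadoopCredential").getOrCreate()

# Access the Hadoop Configuration object through the JVM gateway
hadoop_conf = spark.sparkContext._jsc.hadoopConfiguration()

# Placeholder provider path; the alias must have been created beforehand, e.g.:
#   hadoop credential create my.password.alias -provider jceks://hdfs/user/me/creds.jceks
hadoop_conf.set("hadoop.security.credential.provider.path",
                "jceks://hdfs/user/me/creds.jceks")

# Configuration.getPassword returns the secret as a char array (or None if the alias is missing)
chars = hadoop_conf.getPassword("my.password.alias")
password = "".join([str(c) for c in chars]) if chars is not None else None
```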

spark pyspark partitioning

Data Partitioning Functions in Spark (PySpark) Deep Dive

157   0   about 5 months ago

In my previous post, Data Partitioning in Spark (PySpark) In-depth Walkthrough, I mentioned how to repartition data frames in Spark using repartition ...
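As a small sketch of the repartition functions discussed (illustrative only, with made-up data):

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import spark_partition_id

spark = (SparkSession.builder
         .appName("RepartitionSketch")
         .master("local[4]")
         .getOrCreate())

df = spark.range(0, 100)
print(df.rdd.getNumPartitions())        # partitions before repartitioning

# Repartition by a target partition count, or by one or more columns
df_by_count = df.repartition(10)
df_by_column = df.repartition("id")

# Show how many rows ended up in each partition
(df_by_count
 .withColumn("partition", spark_partition_id())
 .groupBy("partition")
 .count()
 .show())
```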

lite-log spark pyspark

Get the Current Spark Context Settings/Configurations

88   0   about 5 months ago

In Spark, there are a number of settings/configurations you can specify, including application properties and runtime parameters (see https://spark.apache.org/docs/latest/configuration.html). Ge...
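For example (a minimal sketch; the property name queried at the end is a standard Spark SQL setting):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ShowSparkConf").getOrCreate()

# All settings known to the SparkContext: application properties and runtime parameters
for key, value in spark.sparkContext.getConf().getAll():
    print(key, "=", value)

# Individual runtime settings can also be read through the SQL configuration interface
print(spark.conf.get("spark.sql.shuffle.partitions"))
```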

lite-log spark pyspark hive

Read Data from Hive in Spark 1.x and 2.x

87   0   about 5 months ago

Spark 2.x: From Spark 2.0, you can use the Spark session builder to enable Hive support directly. The following example (Python) shows how to implement it. from pyspark.sql import SparkSession appName = "PySpark Hive Example" master = "local" # Create Spark session with Hive...
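The inline snippet in the excerpt is cut off; a runnable version of the Spark 2.x pattern it describes might look like this (the database and table names are placeholders, and a reachable Hive metastore is assumed):

```python
from pyspark.sql import SparkSession

appName = "PySpark Hive Example"
master = "local"

# Create a Spark session with Hive support enabled (Spark 2.x)
spark = (SparkSession.builder
         .appName(appName)
         .master(master)
         .enableHiveSupport()
         .getOrCreate())

# Placeholder database/table
df = spark.sql("SELECT * FROM test_db.test_table")
df.show()
```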

python spark pyspark

Data Partitioning in Spark (PySpark) In-depth Walkthrough

379   0   about 5 months ago

Data partitioning is critical to data processing performance, especially for large volumes of data in Spark. Partitions in Spark won’t span across nodes, though one node can contain more than one partition. When processing, Spark assigns one task to each partition, and each worker threa...
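A quick way to see the partition-to-record mapping described above (a sketch, not taken from the article):

```python
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("PartitionInspection")
         .master("local[4]")
         .getOrCreate())

rdd = spark.sparkContext.parallelize(range(20))

# One task is scheduled per partition; glom() groups records by partition
print(rdd.getNumPartitions())
print(rdd.glom().collect())   # e.g. [[0..4], [5..9], [10..14], [15..19]]
```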

python lite-log spark pyspark

PySpark - Fix PermissionError: [WinError 5] Access is denied

416   0   about 5 months ago

When running the pyspark or spark-submit command on Windows to execute Python scripts, you may encounter the following error: PermissionError: [WinError 5] Access is denied. As the message suggests, permissions are not set up correctly. To resolve this issue y...

python spark pyspark hive

Spark - Save DataFrame to Hive Table

2,692   0   about 5 months ago

From Spark 2.0, you can easily read data from the Hive data warehouse and also write/append new data to Hive tables. This page shows how to work with Hive in Spark, including: Create DataFrame from existing Hive table; Save DataFrame to a new Hive table; Append data ...
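As a short sketch of the save/append flow the page covers (the database and table names below are placeholders, and Hive support must be enabled on the session):

```python
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("SaveDataFrameToHive")
         .enableHiveSupport()
         .getOrCreate())

df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

# Save to a new Hive table, then append more rows
df.write.mode("overwrite").saveAsTable("test_db.example_table")
df.write.mode("append").saveAsTable("test_db.example_table")

spark.sql("SELECT * FROM test_db.example_table").show()
```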
