Read JSON file as Spark DataFrame in Scala / Spark


Spark provides easy, fluent APIs that can be used to read data from a JSON file into a DataFrame object.

In this code example, a JSON file named 'example.json' has the following content:

[
  {
    "Category": "Category A",
    "Count": 100,
    "Description": "This is category A"
  },
  {
    "Category": "Category B",
    "Count": 120,
    "Description": "This is category B"
  },
  {
    "Category": "Category C",
    "Count": 150,
    "Description": "This is category C"
  }
]

In the code snippet, the following option is important, as it lets Spark handle multi-line JSON content:

option("multiLine", true)

Code snippet

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types._

val appName = "Scala Example - JSON file to Spark Data Frame"
val master = "local"

/* Create Spark session. */
val spark = SparkSession.builder.appName(appName).master(master).getOrCreate()

val schema = StructType(Seq(
  StructField("Category", StringType, true),
  StructField("Count", IntegerType, true),
  StructField("Description", StringType, true)
))

val json_file_path = "data/example.json"
val df = spark.read.option("multiLine", true).schema(schema).json(json_file_path)
println(df.schema)
df.show()
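
With the sample data above, the show() output should look roughly like this:

+----------+-----+------------------+
|  Category|Count|       Description|
+----------+-----+------------------+
|Category A|  100|This is category A|
|Category B|  120|This is category B|
|Category C|  150|This is category C|
+----------+-----+------------------+

If you would rather not maintain an explicit schema, Spark can infer one from the file. Note that JSON schema inference maps whole numbers to LongType rather than IntegerType, so the inferred schema differs slightly from the one declared above. A quick variant sketch (not in the original snippet):

// Variant: let Spark infer the schema instead of supplying one.
// The Count column is inferred as LongType instead of IntegerType.
val inferredDf = spark.read.option("multiLine", true).json(json_file_path)
inferredDf.printSchema()
inferredDf.show()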