Spark SQL - Convert Object to JSON String
The article Scala: Parse JSON String as Spark DataFrame shows how to convert a JSON string to a Spark DataFrame; this article shows the other way around: converting complex columns to a JSON string using the to_json function.
About function to_json
The function to_json(expr[, options]) returns a JSON string for a given struct value.
The options parameter controls how the struct column is converted into a JSON string and accepts the same options as the JSON data source. Refer to Spark SQL - Convert JSON String to Map for more details about the available options.
Code snippet
select to_json(map(1, 'a', 2, 'b', 3, DATE '2021-01-01'));
Output:
to_json(map(1, a, 2, b, 3, CAST(DATE '2021-01-01' AS STRING)))
{"1":"a","2":"b","3":"2021-01-01"}
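to_json also works on struct and array values, not just maps. The following snippet is a minimal sketch that is not part of the original examples; the field names id, name and created are chosen purely for illustration. Struct field names become the JSON keys:
select to_json(named_struct('id', 1, 'name', 'a', 'created', DATE '2021-01-01'));
Expected output:
{"id":1,"name":"a","created":"2021-01-01"}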
The following example changes the date format:
select to_json(map(1, DATE '2021-01-01'), map('dateFormat','dd/MM/yyyy'));
Output:
to_json(map(1, DATE '2021-01-01'))
{"1":"01/01/2021"}
Note: the DATE value is formatted as dd/MM/yyyy, as specified by the options parameter.
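Other JSON data source options can be passed the same way. As a hedged example, the sketch below uses the timestampFormat option to control how a timestamp value is rendered; the timestamp literal and format pattern are illustrative only:
select to_json(map(1, TIMESTAMP '2021-01-01 12:30:00'), map('timestampFormat', 'dd/MM/yyyy HH:mm'));
Expected output:
{"1":"01/01/2021 12:30"}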