Raymond | Spark & PySpark

Spark SQL - Map Functions

2021-01-09

In Spark SQL, MapType is designed for key-value pairs, similar to the dictionary type in many other programming languages. This article summarizes the commonly used map functions in Spark SQL.


Function map is used to create a map. 


spark-sql> select map(1,'a',2,'b',3,'c');
map(1, a, 2, b, 3, c)
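Once a map is created, an individual value can be looked up by key, either with bracket syntax or with function element_at (both are standard Spark SQL; a missing key returns NULL with bracket syntax):


spark-sql> select map(1,'a',2,'b',3,'c')[2];
spark-sql> select element_at(map(1,'a',2,'b',3,'c'), 3);


These return the values 'b' and 'c' respectively.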


Function map_concat is used to union two maps.


spark-sql> select map_concat(map(1,'a',2,'b',3,'c'),map(4,'d'));
map_concat(map(1, a, 2, b, 3, c), map(4, d))
warning Warning - if the keys are not unique across the input maps, an error will be thrown: java.lang.RuntimeException: Duplicate map key 1 was found, please check the input data. If you want to remove the duplicated keys, set spark.sql.mapKeyDedupPolicy to LAST_WIN so that the key inserted last takes precedence.
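For example (a sketch assuming Spark 3.0 or above, where the spark.sql.mapKeyDedupPolicy configuration was introduced), enabling LAST_WIN lets the later map win on a duplicate key:


spark-sql> SET spark.sql.mapKeyDedupPolicy=LAST_WIN;
spark-sql> select map_concat(map(1,'a',2,'b'),map(1,'c'));


With this policy, the resulting map keeps value 'c' for key 1 instead of raising the duplicate-key error.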


Function map_entries returns all entries of the map as an array of key-value structs, in an unordered manner.


spark-sql> select map_entries(map(1,'a',2,'b',3,'c',4,'d'));
map_entries(map(1, a, 2, b, 3, c, 4, d))


Function map_keys returns all the keys of a map in an unordered array.


spark-sql> select map_keys(map(1,'a',2,'b',3,'c',4,'d'));
map_keys(map(1, a, 2, b, 3, c, 4, d))


Function map_values returns all the values of a map in an unordered array.


spark-sql> select map_values(map(1,'a',2,'b',3,'c',4,'d'));
map_values(map(1, a, 2, b, 3, c, 4, d))
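The inverse direction is also available: function map_from_arrays (available since Spark 2.4) constructs a map from a key array and a value array of equal length:


spark-sql> select map_from_arrays(array(1,2,3),array('a','b','c'));


This produces the same map as map(1,'a',2,'b',3,'c').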


Function map_from_entries constructs a map from an array of entries (key-value structs).


spark-sql> SELECT map_from_entries(array(struct('A', 1), struct('B', 2), struct('C', 3)));
map_from_entries(array(struct(A, 1), struct(B, 2), struct(C, 3)))
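Maps can also be flattened into rows: function explode turns each map entry into a row with key and value columns, which is handy when map data needs to be joined with other tables:


spark-sql> select explode(map(1,'a',2,'b',3,'c'));


This returns one row per entry, with the key in the first column and the value in the second.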