Code (Python)

Use when() and otherwise() with PySpark DataFrame


Code description

In Spark SQL, the CASE WHEN clause evaluates a list of conditions and returns one of several possible results. The same logic can be implemented directly in the DataFrame API with the pyspark.sql.functions.when and pyspark.sql.Column.otherwise functions. If otherwise is not chained after when, unmatched conditions return None.

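A minimal sketch of how this could look. The exact conditions are an assumption inferred from the output below (divisibility by 2 and by 3); the app name and the construction of the sample DataFrame are illustrative as well.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("when-otherwise-example").getOrCreate()

# Sample data: a single id column with values 1 through 9 (assumed).
df = spark.createDataFrame([(i,) for i in range(1, 10)], ["id"])

# Conditions are evaluated in order; the first matching when() wins, and
# rows that match nothing fall through to otherwise(). Without otherwise(),
# unmatched rows would get None.
df = df.withColumn(
    "id_new",
    F.when(F.col("id") % 2 == 0, F.col("id") * 100)
     .when(F.col("id") % 3 == 0, F.col("id") * 1000)
     .otherwise(F.col("id")),
)

df.show()
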
Output:

+---+------+
| id|id_new|
+---+------+
|  1|     1|
|  2|   200|
|  3|  3000|
|  4|   400|
|  5|     5|
|  6|   600|
|  7|     7|
|  8|   800|
|  9|  9000|
+---+------+