PySpark SQL Functions | when method

Last updated: Aug 12, 2023
Tags: PySpark

PySpark SQL Functions' when(~) method updates the values of a PySpark DataFrame column based on the given conditions.

NOTE

The when(~) method is often used in conjunction with the otherwise(~) method to implement an if-else logic. See examples below for clarification.

Parameters

1. condition | Column | required

A boolean Column expression. See examples below for clarification.

2. value | any | required

The value to map to if the condition is true.

Return Value

A PySpark Column (pyspark.sql.column.Column).

Examples

Consider the following PySpark DataFrame:

df = spark.createDataFrame([["Alex", 20], ["Bob", 24], ["Cathy", 22]], ["name", "age"])
df.show()
+-----+---+
| name|age|
+-----+---+
| Alex| 20|
| Bob| 24|
|Cathy| 22|
+-----+---+

Implementing if-else logic using when and otherwise

To replace the name Alex with Doge, and all other names with Eric:

import pyspark.sql.functions as F
df.select(F.when(df.name == "Alex", "Doge").otherwise("Eric")).show()
+-----------------------------------------------+
|CASE WHEN (name = Alex) THEN Doge ELSE Eric END|
+-----------------------------------------------+
| Doge|
| Eric|
| Eric|
+-----------------------------------------------+

Notice how we used the otherwise(~) method to set the value for rows where the condition is not met.

Case when the otherwise method is not used

Note that if you do not include the otherwise(~) method, then any row that does not satisfy the condition will be assigned null:

df.select(F.when(df.name == "Alex", "Doge")).show()
+-------------------------------------+
|CASE WHEN (name = Alex) THEN Doge END|
+-------------------------------------+
| Doge|
| null|
| null|
+-------------------------------------+

Specifying multiple conditions

Using the pipe and ampersand operators

We can combine conditions using & (and) and | (or) like so:

df.withColumn("name", F.when((df.name == "Alex") & (df.age > 10), "Doge").otherwise("Eric")).show()
+----+---+
|name|age|
+----+---+
|Doge| 20|
|Eric| 24|
|Eric| 22|
+----+---+

Chaining the when method

The when(~) method can be chained like so:

df.select(F.when(df.name == "Alex", "Doge")
.when(df.name == "Bob", "Zebra")
.otherwise("Eric")).show()
+----------------------------------------------------------------------------+
|CASE WHEN (name = Alex) THEN Doge WHEN (name = Bob) THEN Zebra ELSE Eric END|
+----------------------------------------------------------------------------+
| Doge|
| Zebra|
| Eric|
+----------------------------------------------------------------------------+

Setting a new value based on original value

To set a new value based on the original value:

import pyspark.sql.functions as F
df.select(F.when(df.age > 15, df.age + 30)).show()
+----------------------------------------+
|CASE WHEN (age > 15) THEN (age + 30) END|
+----------------------------------------+
| 50|
| 54|
| 52|
+----------------------------------------+

Using an alias

By default, the new column's label is the full CASE WHEN expression, which is convoluted:

import pyspark.sql.functions as F
df.select(F.when(df.name == "Alex", "Doge").otherwise("Eric")).show()
+-----------------------------------------------+
|CASE WHEN (name = Alex) THEN Doge ELSE Eric END|
+-----------------------------------------------+
| Doge|
| Eric|
| Eric|
+-----------------------------------------------+

To assign a new column label, simply use the alias(~) method:

import pyspark.sql.functions as F
df.select(F.when(df.name == "Alex", "Doge").otherwise("Eric").alias("new_name")).show()
+--------+
|new_name|
+--------+
| Doge|
| Eric|
| Eric|
+--------+
Published by Isshin Inada