PySpark Column | isNull method

Last updated: Aug 12, 2023
Tags: PySpark

PySpark Column's isNull() method returns a boolean Column that flags rows where the value is null.

Return Value

A PySpark Column (pyspark.sql.column.Column).
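
Note that isNull() builds a lazy Column expression; nothing is evaluated until an action such as show() runs. A minimal sketch, assuming an active SparkSession named spark as in the examples below:

df = spark.createDataFrame([["Alex", 25]], ["name", "age"])
col = df.age.isNull()
print(type(col))   # <class 'pyspark.sql.column.Column'>
print(col)         # e.g. Column<'(age IS NULL)'> (exact repr varies across versions)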

Examples

Consider the following PySpark DataFrame:

df = spark.createDataFrame([["Alex", 25], ["Bob", 30], ["Cathy", None]], ["name", "age"])
df.show()
+-----+----+
| name| age|
+-----+----+
| Alex|  25|
|  Bob|  30|
|Cathy|null|
+-----+----+

Identifying rows where a certain value is null in PySpark DataFrame

To identify rows where the value for age is null:

df.select(df.age.isNull()).show()
+-------------+
|(age IS NULL)|
+-------------+
|        false|
|        false|
|         true|
+-------------+
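
Since isNull() returns an ordinary boolean Column, it can also feed aggregations. As one common pattern (a sketch, not part of the original example), the null flags can be cast to integers and summed to count the nulls in age:

from pyspark.sql import functions as F
# cast the boolean flags to 0/1 and sum them to count nulls in age
df.select(F.sum(df.age.isNull().cast("int")).alias("num_nulls")).show()
+---------+
|num_nulls|
+---------+
|        1|
+---------+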

Getting rows where a certain value is null in PySpark DataFrame

To get rows where the value for age is null:

df.where(df.age.isNull()).show()
+-----+----+
| name| age|
+-----+----+
|Cathy|null|
+-----+----+

Here, the where(~) method fetches rows that correspond to True in the boolean column returned by the isNull() method.
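
Equivalently, where(~) also accepts a SQL expression string; a quick sketch of the same filter:

df.where("age IS NULL").show()
+-----+----+
| name| age|
+-----+----+
|Cathy|null|
+-----+----+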

Warning - using equality to compare null values

One common mistake is to use equality to compare null values. For example, consider the following DataFrame:

df = spark.createDataFrame([["Alex", 25.0], ["Bob", 30.0], ["Cathy", None]], ["name", "age"])
df.show()
+-----+----+
| name| age|
+-----+----+
| Alex|25.0|
|  Bob|30.0|
|Cathy|null|
+-----+----+

Let's get the rows where age is equal to None:

from pyspark.sql import functions as F
df.where(F.col("age") == None).show()
+----+---+
|name|age|
+----+---+
+----+---+

Notice how Cathy's row, where the age is null, is not picked up. This is because in Spark SQL, any equality comparison with null evaluates to null rather than True, so the filter matches no rows. When checking for null values, we should always use isNull() instead.
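
To see this three-valued behavior directly, the following sketch selects the raw comparison result alongside isNull(), and also shows the null-safe equality operator eqNullSafe(~), which does treat null as equal to null:

# (age = NULL) evaluates to null for every row, which is why where(~) drops all rows
df.select(df.age.isNull().alias("is_null"),
          (df.age == None).alias("eq_none"),
          df.age.eqNullSafe(None).alias("eq_null_safe")).show()
+-------+-------+------------+
|is_null|eq_none|eq_null_safe|
+-------+-------+------------+
|  false|   null|       false|
|  false|   null|       false|
|   true|   null|        true|
+-------+-------+------------+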

Null values and NaN are treated differently

Consider the following PySpark DataFrame:

import numpy as np
df = spark.createDataFrame([["Alex", 25.0], ["Bob", np.nan], ["Cathy", None]], ["name", "age"])
df.show()
+-----+----+
| name| age|
+-----+----+
| Alex|25.0|
|  Bob| NaN|
|Cathy|null|
+-----+----+

Here, the age column contains both NaN and null values. In PySpark, NaN and null are treated as distinct entities, as demonstrated below:

df.where(F.col("age").isNull()).show()
+-----+----+
| name| age|
+-----+----+
|Cathy|null|
+-----+----+

Here, notice how Bob's row, whose age is NaN, is not picked up. To get rows with NaN, use the isnan(~) method like so:

df.where(F.isnan("age")).show()
+----+---+
|name|age|
+----+---+
| Bob|NaN|
+----+---+
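
To catch both kinds of missing values in a single pass, the two checks can be combined with the | operator, as in this sketch:

# match rows where age is either null or NaN
df.where(F.col("age").isNull() | F.isnan("age")).show()
+-----+----+
| name| age|
+-----+----+
|  Bob| NaN|
|Cathy|null|
+-----+----+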

Published by Isshin Inada