PySpark DataFrame | where method

Last updated: Aug 12, 2023
Tags: PySpark

PySpark DataFrame's where(~) method returns the rows of the DataFrame that satisfy the given condition.

NOTE

The where(~) method is an alias for the filter(~) method.

Parameters

1. condition | Column or string

A boolean mask as a Column object, or a SQL expression given as a string (e.g. "age > 25").

Return Value

A new PySpark DataFrame.

Examples

Consider the following PySpark DataFrame:

df = spark.createDataFrame([["Alex", 20], ["Bob", 30], ["Cathy", 40]], ["name", "age"])
df.show()
+-----+---+
| name|age|
+-----+---+
| Alex| 20|
| Bob| 30|
|Cathy| 40|
+-----+---+

Basic usage

To get rows where age is greater than 25:

df.where("age > 25").show()
+-----+---+
| name|age|
+-----+---+
| Bob| 30|
|Cathy| 40|
+-----+---+

Equivalently, we can pass a Column object that represents a boolean mask:

df.where(df.age > 25).show()
+-----+---+
| name|age|
+-----+---+
| Bob| 30|
|Cathy| 40|
+-----+---+

Equivalently, we can use the col(~) function from pyspark.sql.functions to refer to the column:

from pyspark.sql import functions as F
df.where(F.col("age") > 25).show()
+-----+---+
| name|age|
+-----+---+
| Bob| 30|
|Cathy| 40|
+-----+---+

Compound queries

The where(~) method supports compound conditions with the AND and OR operators like so:

df.where("age > 25 AND name = 'Bob'").show()
+----+---+
|name|age|
+----+---+
| Bob| 30|
+----+---+
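The same compound condition can also be built from Column objects, in which case PySpark overloads the bitwise & and | operators, and each comparison must be wrapped in parentheses because those operators bind more tightly than > and ==. pandas overloads the same operators, so here is a minimal session-free sketch of the precedence rule using pandas (the data mirrors the DataFrame above):

```python
import pandas as pd

df = pd.DataFrame({"name": ["Alex", "Bob", "Cathy"], "age": [20, 30, 40]})

# & binds more tightly than the comparisons, so each condition
# needs its own parentheses (the same rule applies to PySpark Columns)
result = df[(df["age"] > 25) & (df["name"] == "Bob")]
print(result)
```

In PySpark, the analogous call would be df.where((df.age > 25) & (df.name == "Bob")).show().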

Dealing with null values

Consider the following PySpark DataFrame:

df = spark.createDataFrame([["Alex", 20], [None, None], ["Cathy", None]], ["name", "age"])
df.show()
+-----+----+
| name| age|
+-----+----+
| Alex| 20|
| null|null|
|Cathy|null|
+-----+----+

Let's query for rows where age != 10 like so:

df.where("age != 10").show()
+----+---+
|name|age|
+----+---+
|Alex| 20|
+----+---+

Notice how only Alex's row is returned even though the other two rows technically have an age that is not 10. This happens because PySpark's where(~) method filters out null values by default.
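This behavior comes from SQL's three-valued logic: any comparison involving null evaluates to null (unknown) rather than true or false, and the filter only keeps rows whose predicate is exactly true. Here is a small pure-Python sketch of that rule (the helper name sql_ne is made up for illustration):

```python
def sql_ne(a, b):
    """SQL-style inequality: any comparison with NULL (None here) yields NULL."""
    if a is None or b is None:
        return None
    return a != b

rows = [("Alex", 20), (None, None), ("Cathy", None)]

# A WHERE-style filter keeps a row only when the predicate is exactly True,
# so rows where the comparison evaluates to None (unknown) are dropped
kept = [row for row in rows if sql_ne(row[1], 10) is True]
print(kept)  # only Alex's row survives
```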

To prevent rows with null values from getting filtered out, we can perform the query like so:

from pyspark.sql import functions as F
df.where((F.col("age") != 10) | (F.col("age").isNull())).show()
+-----+----+
| name| age|
+-----+----+
| Alex| 20|
| null|null|
|Cathy|null|
+-----+----+

Note that PySpark's treatment of null values differs from that of Pandas, which retains rows with missing values, as demonstrated below:

import pandas as pd

df = pd.DataFrame({
    "col": ["a", "b", None]
})

df[df["col"] != "a"]
    col
1     b
2  None

Notice how the row with col=None is not left out!
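Conversely, if you want pandas to mimic PySpark's null-dropping behavior, you can add an explicit non-null mask. A minimal sketch:

```python
import pandas as pd

df = pd.DataFrame({"col": ["a", "b", None]})

# Adding a notna() mask reproduces PySpark's SQL-style behavior,
# where rows with null values fail the predicate and are dropped
out = df[(df["col"] != "a") & df["col"].notna()]
print(out)  # only the row with col="b" remains
```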

Published by Isshin Inada