PySpark SQL Functions | expr method

Last updated: Jul 1, 2022
Tags: PySpark

PySpark SQL Functions' expr(~) method parses the given SQL expression.

Parameters

1. str | string

The SQL expression to parse.

Return Value

A PySpark Column.
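
Because the return value is an ordinary PySpark Column, Column methods such as alias(~) chain onto the result. A minimal sketch (the column name double_age is just for illustration):

import pyspark.sql.functions as F

# expr(~) returns a Column, so Column methods like alias(~) chain onto it
col = F.expr('age * 2').alias('double_age')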

Examples

Consider the following PySpark DataFrame:

df = spark.createDataFrame([['Alex',30],['Bob',50]], ['name','age'])
df.show()
+----+---+
|name|age|
+----+---+
|Alex| 30|
| Bob| 50|
+----+---+

Using the expr method to convert column values to uppercase

The expr(~) method takes a SQL expression as its argument, so we can use SQL functions such as upper(~):

import pyspark.sql.functions as F
df.select(F.expr('upper(name)')).show()
+-----------+
|upper(name)|
+-----------+
|       ALEX|
|        BOB|
+-----------+
NOTE

The expr(~) method can often be more succinctly written using PySpark DataFrame's selectExpr(~) method. For instance, the above case can be rewritten as:

df.selectExpr('upper(name)').show()
+-----------+
|upper(name)|
+-----------+
|       ALEX|
|        BOB|
+-----------+

I recommend that you use selectExpr(~) whenever possible because:

  • you won't have to import the SQL functions library (pyspark.sql.functions).

  • the syntax is shorter.
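
Note, however, that selectExpr(~) only covers the select case. When a method expects a Column rather than a DataFrame-level SQL string, such as withColumn(~), you still need expr(~). A short sketch using the same df (the column name name_upper is just for illustration):

import pyspark.sql.functions as F

# withColumn(~) expects a Column, so we wrap the SQL snippet with expr(~)
df.withColumn('name_upper', F.expr('upper(name)')).show()
+----+---+----------+
|name|age|name_upper|
+----+---+----------+
|Alex| 30|      ALEX|
| Bob| 50|       BOB|
+----+---+----------+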

Parsing complex SQL expressions using expr method

Here's a more complex SQL expression using operators like AND and LIKE:

df.select(F.expr('age > 40 AND name LIKE "B%"').alias('result')).show()
+------+
|result|
+------+
| false|
|  true|
+------+

Note the following:

  • we are checking for rows where age is larger than 40 and name starts with B.

  • we are assigning the label 'result' to the Column returned by expr(~) using the alias(~) method.
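
Since expr(~) returns a Column, the same boolean expression can also be passed to the DataFrame's where(~) method to keep only the matching rows. A minimal sketch using the same df:

df.where(F.expr('age > 40 AND name LIKE "B%"')).show()
+----+---+
|name|age|
+----+---+
| Bob| 50|
+----+---+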

Practical applications of boolean masks returned by expr method

As we can see in the above example, the expr(~) method can return a boolean mask depending on the SQL expression you supply:

df.select(F.expr('age > 40 AND name LIKE "B%"').alias('result')).show()
+------+
|result|
+------+
| false|
|  true|
+------+

This allows us to check for the existence of rows that satisfy a given condition using any(~):

df.select(F.expr('any(age > 40 AND name LIKE "B%")').alias('exists?')).show()
+-------+
|exists?|
+-------+
|   true|
+-------+

Here, we get True because there exists at least one True value in the boolean mask.
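
If you need this flag as a plain Python boolean on the driver, say to branch your program logic, one option is to fetch the single-row result with first(). A minimal sketch (the label flag is just for illustration):

# first() returns the single Row; indexing [0] extracts the boolean
exists = df.select(F.expr('any(age > 40 AND name LIKE "B%")').alias('flag')).first()[0]
print(exists)   # True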

Mapping column values using expr method

We can map column values using CASE WHEN in the expr(~) method like so:

col = F.expr('CASE WHEN age < 40 THEN "JUNIOR" ELSE "SENIOR" END')
df.withColumn('status', col).show()
+----+---+------+
|name|age|status|
+----+---+------+
|Alex| 30|JUNIOR|
| Bob| 50|SENIOR|
+----+---+------+

Here, note the following:

  • we are using the DataFrame's withColumn(~) method to obtain a new PySpark DataFrame that includes the column returned by expr(~) under the name 'status'.
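
If you prefer selectExpr(~), the same mapping can be written by selecting all existing columns alongside the CASE WHEN expression; a sketch that assumes you want to keep every original column:

df.selectExpr('*', 'CASE WHEN age < 40 THEN "JUNIOR" ELSE "SENIOR" END AS status').show()
+----+---+------+
|name|age|status|
+----+---+------+
|Alex| 30|JUNIOR|
| Bob| 50|SENIOR|
+----+---+------+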

Published by Isshin Inada