PySpark SQL Functions | last method

Last updated: Aug 12, 2023
Tags: PySpark

PySpark's SQL function last(~) method returns the last value of the specified column.

Parameters

1. col | string or Column object

The column label or Column object of interest.

2. ignorenulls | boolean | optional

Whether or not to ignore null values. By default, ignorenulls=False.

Return Value

A PySpark SQL Column object (pyspark.sql.column.Column).
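For instance, calling last(~) on its own only builds a Column expression; it is evaluated once passed to methods like select(~) or agg(~). A minimal sketch, assuming the usual import of pyspark.sql.functions as F:

import pyspark.sql.functions as F

# last(~) returns a Column expression - nothing is computed yet
col_expr = F.last("age", ignorenulls=True)
print(type(col_expr))
<class 'pyspark.sql.column.Column'>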

Examples

Consider the following PySpark DataFrame:

df = spark.createDataFrame([("Alex", 15), ("Bob", 20), ("Cathy", 25)], ["name", "age"])
df.show()
+-----+---+
| name|age|
+-----+---+
| Alex| 15|
| Bob| 20|
|Cathy| 25|
+-----+---+

Getting the last value of a PySpark column

To get the last value of the name column:

df.select(F.last("name")).show()
+----------+
|last(name)|
+----------+
| Cathy|
+----------+

Note we can also pass a Column object instead:

# df.name and F.col("name") are both Column objects
df.select(F.last(df.name)).show()
+----------+
|last(name)|
+----------+
| Cathy|
+----------+

Getting the last non-null value in PySpark column

Consider the following PySpark DataFrame with null values:

df = spark.createDataFrame([("Alex", 15), ("Bob", 20), ("Cathy", None)], ["name", "age"])
df.show()
+-----+----+
| name| age|
+-----+----+
| Alex| 15|
| Bob| 20|
|Cathy|null|
+-----+----+

By default, ignorenulls=False, which means that the last value is returned regardless of whether it is null or not:

df.select(F.last(df.age)).show()
+---------+
|last(age)|
+---------+
| null|
+---------+

To return the last non-null value instead, set ignorenulls=True:

df.select(F.last(df.age, ignorenulls=True)).show()
+---------+
|last(age)|
+---------+
| 20|
+---------+

Getting the last value of each group in PySpark

The last(~) method is also useful in aggregations. Consider the following PySpark DataFrame:

data = [("Alex", "A"), ("Alex", "B"), ("Bob", None), ("Bob", "A"), ("Cathy", "C")]
df = spark.createDataFrame(data, ["name", "class"])
df.show()
+-----+-----+
| name|class|
+-----+-----+
| Alex| A|
| Alex| B|
| Bob| null|
| Bob| A|
|Cathy| C|
+-----+-----+

To get the last value of each aggregate:

df.groupby("name").agg(F.last("class")).show()
+-----+-----------+
| name|last(class)|
+-----+-----------+
| Alex| B|
| Bob| A|
|Cathy| C|
+-----+-----------+

Here, we are grouping by name, and then for each group, we obtain the last value that occurs in the class column.
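Since last(~) is an aggregate function, ignorenulls=True can also be used inside agg(~). As an illustrative sketch, suppose Bob's null class happened to come last within his group (hypothetical data, not the DataFrame above):

data = [("Alex", "A"), ("Alex", "B"), ("Bob", "A"), ("Bob", None), ("Cathy", "C")]
df = spark.createDataFrame(data, ["name", "class"])
# Without ignorenulls, last(class) for Bob would typically be null;
# with ignorenulls=True, the null is skipped and "A" is returned instead
df.groupby("name").agg(F.last("class", ignorenulls=True)).show()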

Published by Isshin Inada