PySpark SQL Functions | min method

Last updated: Jul 1, 2022
Tags: PySpark

PySpark SQL Functions' min(~) method returns the minimum value in the specified column.

Parameters

1. col | string or Column

The column in which to obtain the minimum value.

Return Value

A PySpark Column (pyspark.sql.column.Column).

Examples

Consider the following PySpark DataFrame:

df = spark.createDataFrame([["Alex", 25], ["Bob", 30]], ["name", "age"])
df.show()
+----+---+
|name|age|
+----+---+
|Alex| 25|
| Bob| 30|
+----+---+

Getting the minimum value of a PySpark column

To obtain the minimum age:

import pyspark.sql.functions as F
df.select(F.min("age")).show()
+--------+
|min(age)|
+--------+
| 25|
+--------+

To get the minimum value as an integer:

list_rows = df.select(F.min("age")).collect()
list_rows[0][0]
25

Note the following:

  • the collect() method converts the PySpark DataFrame returned by select(~) to a list of Row objects.

  • this list will always be of length one when we apply the min(~) method.

  • the content of the Row object can be accessed using [0].

Getting the minimum value of each group in PySpark

Consider the following PySpark DataFrame:

df = spark.createDataFrame([["Alex", 20, "A"],
                            ["Bob", 30, "B"],
                            ["Cathy", 50, "A"]],
                           ["name", "age", "class"])
df.show()
+-----+---+-----+
| name|age|class|
+-----+---+-----+
| Alex| 20| A|
| Bob| 30| B|
|Cathy| 50| A|
+-----+---+-----+

To get the minimum age of each class:

df.groupby("class").agg(F.min("age").alias("MIN AGE")).show()
+-----+-------+
|class|MIN AGE|
+-----+-------+
| A| 20|
| B| 30|
+-----+-------+

Here, the alias(~) method is used to assign a label to the aggregated age column.

Published by Isshin Inada