PySpark DataFrame | foreach method

Last updated: Aug 12, 2023
Tags: PySpark

PySpark DataFrame's foreach(~) method loops over each row of the DataFrame as a Row object and applies the given function to the row.

WARNING

The following are some limitations of foreach(~):

  • the foreach(~) method is invoked on the worker nodes instead of the driver program. This means that if we call print(~) inside our function, we will not see the printed results in our session or notebook, because the output is written on the worker nodes.

  • rows are read-only, so you cannot update their values (see the short sketch after this list).
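
To see the second point concretely, here is a minimal sketch that constructs a Row directly (rather than reading one from a DataFrame) and attempts to assign to one of its fields:

from pyspark.sql import Row

row = Row(name="Alex", age=20)
row.age = 30   # raises an error - Row objects are read-only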

Given these limitations, the foreach(~) method is mainly used for logging some information about each row to the local machine or to an external database.
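
As a rough illustration of that logging use case, the sketch below pushes each row to an external store. Here send_to_external_store(~) is a hypothetical stand-in for your own database or HTTP client, and the DataFrame is the same two-row example used later on this page:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([["Alex", 20], ["Bob", 30]], ["name", "age"])

def send_to_external_store(record):
    # Hypothetical helper - replace with your own database or HTTP client
    pass

def log_row(row):
    # Runs on a worker node; row is a pyspark.sql.Row
    send_to_external_store({"name": row.name, "age": row.age})

df.foreach(log_row)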

Parameters

1. f | function

The function to apply to each row (Row) of the DataFrame.

Return Value

Nothing is returned.

Examples

Consider the following PySpark DataFrame:

df = spark.createDataFrame([["Alex", 20], ["Bob", 30]], ["name", "age"])
df.show()
+----+---+
|name|age|
+----+---+
|Alex| 20|
| Bob| 30|
+----+---+

To iterate over each row and apply some custom function:

# This function fires in the worker node
def f(row):
    print(row.name)

df.foreach(f)

Here, row.name is printed on the worker nodes, so you will not see any output in the driver program.
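
If you do need per-row results back on the driver, one common workaround (a minimal sketch below, not part of foreach(~) itself) is to update an accumulator inside the function and read its value on the driver once the action finishes:

# Count rows on the worker nodes and read the total back on the driver
counter = spark.sparkContext.accumulator(0)

def count_row(row):
    counter.add(1)   # updates are merged and sent back to the driver

df.foreach(count_row)
print(counter.value)   # 2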

Published by Isshin Inada