PySpark RDD | keys method

Last updated: Jul 1, 2022
Tags: PySpark

PySpark RDD's keys(~) method returns an RDD containing only the keys of a pair RDD, that is, an RDD whose elements are tuples of length two (key-value pairs).

Parameters

This method does not take in any parameters.

Return Value

A PySpark RDD (pyspark.rdd.PipelinedRDD).
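
Since the return value is an RDD rather than a Python list, keys(~) is lazily evaluated; an action such as collect() must be called to materialize the keys. A minimal sketch, assuming an active SparkContext named sc as in the examples below:

# keys(~) returns another RDD, so nothing is computed until an action is called
new_rdd = sc.parallelize([("a",3),("b",5)]).keys()
type(new_rdd)
<class 'pyspark.rdd.PipelinedRDD'>
new_rdd.collect()
['a', 'b']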

Examples

Consider the following PySpark pair RDD:

# Create a pair RDD using the parallelize method of the SparkContext (sc)
rdd = sc.parallelize([("a",3),("a",2),("b",5),("c",1)])
rdd.collect()
[('a', 3), ('a', 2), ('b', 5), ('c', 1)]

Getting the keys of a pair RDD in PySpark

To get the keys of the pair RDD as a list of strings:

rdd.keys().collect()
['a', 'a', 'b', 'c']
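
Because the result of keys(~) is an ordinary RDD, further transformations can be chained onto it. As an illustrative sketch (not part of the original example), the duplicate key can be dropped with distinct():

# Chain distinct() onto keys() to remove the duplicate 'a'
# (RDD result ordering is not guaranteed, so sort for display)
sorted(rdd.keys().distinct().collect())
['a', 'b', 'c']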

Note that keys(~) simply extracts the first element of each item, so calling it on an RDD that is not a pair RDD does not raise an error. For an RDD of single-character strings, the first element of each string is the string itself, so the original values appear to be returned:

rdd = sc.parallelize(["a","a","b","c"])
rdd.keys().collect()
['a', 'a', 'b', 'c']
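
This only looks like the original values because each string here is a single character. Since keys(~) maps every item to its first element, longer strings are truncated to their first character, as in this short illustrative sketch:

# keys(~) takes item[0], so only the first character of each string survives
rdd = sc.parallelize(["apple","banana","cherry"])
rdd.keys().collect()
['a', 'b', 'c']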
Published by Isshin Inada