PySpark DataFrame | colRegex method

Last updated: Jul 1, 2022
Tags: PySpark

PySpark DataFrame's colRegex(~) method returns the Column objects whose labels match the specified regular expression. Since more than one label can match, this method can select multiple columns at once.

Parameters

1. colName | string

The regular expression to match against the column labels.

Return Value

One or more PySpark Column objects.

Examples

Selecting columns using regular expression in PySpark

Consider the following PySpark DataFrame:

df = spark.createDataFrame([("Alex", 20), ("Bob", 30), ("Cathy", 40)], ["col1", "col2"])
df.show()
+-----+----+
| col1|col2|
+-----+----+
| Alex| 20|
| Bob| 30|
|Cathy| 40|
+-----+----+

To select columns using a regular expression, use the colRegex(~) method:

df.select(df.colRegex("`col[123]`")).show()
+-----+----+
| col1|col2|
+-----+----+
| Alex| 20|
| Bob| 30|
|Cathy| 40|
+-----+----+

Here, note the following:

  • We wrapped the regular expression with backticks (`) - this is required; omitting the backticks causes PySpark to throw an error.

  • The regular expression col[123] matches the labels col1, col2 and col3.

  • The select(~) method converts the Column objects into a PySpark DataFrame.
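The matching behavior can be illustrated in plain Python. This is a sketch assuming colRegex matches the entire label rather than a substring; the labels col12 and mycol1 are hypothetical extras added purely for illustration:

```python
import re

# Hypothetical column labels; col12 and mycol1 are extras added for illustration.
labels = ["col1", "col2", "col3", "col12", "mycol1"]

# Assuming colRegex requires the whole label to match (not just a substring),
# re.fullmatch mirrors that behavior in plain Python.
pattern = re.compile(r"col[123]")
matching = [label for label in labels if pattern.fullmatch(label)]
print(matching)  # ['col1', 'col2', 'col3']
```

Note that col12 is excluded: the pattern col[123] matches only its first four characters, not the whole label.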

Getting column labels that match a regular expression as a list of strings in PySpark

To get column labels as a list of strings instead of PySpark Column objects:

df.select(df.colRegex("`col[123]`")).columns
['col1', 'col2']

Here, we are using the columns property of the PySpark DataFrame returned by select(~).
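One common use of such a label list is to compute its complement - the columns that did not match, for example to drop them. A plain-Python sketch, where df_columns stands in for df.columns since no Spark session is running here:

```python
import re

# Stand-in for df.columns from the DataFrame above.
df_columns = ["col1", "col2", "name"]  # "name" is a hypothetical extra column

pattern = re.compile(r"col[123]")
matched = [c for c in df_columns if pattern.fullmatch(c)]
unmatched = [c for c in df_columns if not pattern.fullmatch(c)]

print(matched)    # ['col1', 'col2']
print(unmatched)  # ['name']
```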

Published by Isshin Inada