PySpark SQL Functions | concat method

Last updated: Aug 12, 2023

Tags: PySpark

PySpark SQL Functions' concat(~) method concatenates multiple string columns, or multiple array columns, into a single column.

Parameters

1. *cols | string or Column

The columns to concatenate, passed as column labels (strings) or Column objects.

Return Value

A PySpark Column (pyspark.sql.column.Column).

Examples

Concatenating string-based columns in PySpark

Consider the following PySpark DataFrame:

df = spark.createDataFrame([["Alex", "Wong"], ["Bob", "Marley"]], ["fname", "lname"])
df.show()
+-----+------+
|fname| lname|
+-----+------+
| Alex| Wong|
| Bob|Marley|
+-----+------+

To concatenate fname and lname:

import pyspark.sql.functions as F
df.select(F.concat("fname", "lname")).show()
+--------------------+
|concat(fname, lname)|
+--------------------+
| AlexWong|
| BobMarley|
+--------------------+

If you want to include a space between the two columns, use F.lit(" ") like so:

import pyspark.sql.functions as F
df.select(F.concat("fname", F.lit(" "), "lname")).show()
+-----------------------+
|concat(fname,  , lname)|
+-----------------------+
| Alex Wong|
| Bob Marley|
+-----------------------+

F.lit(" ") is a Column object whose values are filled with " ".

You could also add an alias to the returned column like so:

df.select(F.concat("fname", "lname").alias("COMBINED NAME")).show()
+-------------+
|COMBINED NAME|
+-------------+
| AlexWong|
| BobMarley|
+-------------+

You could also pass in Column objects instead of column labels:

df.select(F.concat(df.fname, F.col("lname"))).show()
+--------------------+
|concat(fname, lname)|
+--------------------+
| AlexWong|
| BobMarley|
+--------------------+

Concatenating array-based columns in PySpark

Consider the following PySpark DataFrame:

df = spark.createDataFrame([[ [4,5], [6]], [ [7], [8,9] ]], ["A", "B"])
df.show()
+------+------+
| A| B|
+------+------+
|[4, 5]| [6]|
| [7]|[8, 9]|
+------+------+

To concatenate the arrays of each column:

import pyspark.sql.functions as F
df.select(F.concat("A", "B")).show()
+------------+
|concat(A, B)|
+------------+
| [4, 5, 6]|
| [7, 8, 9]|
+------------+
Published by Isshin Inada