PySpark SQL Functions | to_date method

Last updated Aug 12, 2023
Tags: PySpark

PySpark SQL Functions' to_date(~) method converts date strings to date type.

Parameters

1. col | Column

The date string column.

2. format | string | optional

The format of the date string. If omitted, strings in the default yyyy-MM-dd pattern are parsed.

Return Value

A PySpark Column of date type.

Examples

Consider the following PySpark DataFrame with some date strings:

df = spark.createDataFrame([["Alex", "1995-12-16"], ["Bob", "1998-05-06"]], ["name", "birthday"])
df.show()
+----+----------+
|name| birthday|
+----+----------+
|Alex|1995-12-16|
| Bob|1998-05-06|
+----+----------+

Converting date strings to date type in PySpark

To convert date strings in the birthday column to actual date type, use to_date(~) and specify the pattern of the date string:

from pyspark.sql import functions as F
df_new = df.withColumn("birthday", F.to_date(df["birthday"], "yyyy-MM-dd"))
df_new.printSchema()
root
|-- name: string (nullable = true)
|-- birthday: date (nullable = true)

Here, the withColumn(~) method is used to update the birthday column using the new column returned by to_date(~).

As another example, here's a PySpark DataFrame with slightly more complicated date strings:

df = spark.createDataFrame([["Alex", "1995/12/16 16:20:20"], ["Bob", "1998/05/06 18:56:10"]], ["name", "birthday"])
df.show()
+----+-------------------+
|name|           birthday|
+----+-------------------+
|Alex|1995/12/16 16:20:20|
| Bob|1998/05/06 18:56:10|
+----+-------------------+

Here, our date strings also contain hours, minutes and seconds.

To convert the birthday column to date type:

df_new = df.withColumn("birthday", F.to_date(df["birthday"], "yyyy/MM/dd HH:mm:ss"))
df_new.show()
+----+----------+
|name| birthday|
+----+----------+
|Alex|1995-12-16|
| Bob|1998-05-06|
+----+----------+

Here, notice how the hours, minutes and seconds have been lost during the type conversion.

Published by Isshin Inada