PySpark SQL Functions | lit method

Last updated: Jul 1, 2022
Tags: PySpark

PySpark SQL Functions' lit(~) method creates a Column object holding the specified literal value.

Parameters

1. col | any

The literal value to fill the column with.

Return Value

A Column object.

Examples

Consider the following PySpark DataFrame:

df = spark.createDataFrame([["Alex", 20], ["Bob", 30]], ["name", "age"])
df.show()
+----+---+
|name|age|
+----+---+
|Alex| 20|
| Bob| 30|
+----+---+

Creating a column of constants in PySpark DataFrame

To create a new PySpark DataFrame with the name column of df and a new column called is_single made up of True values:

import pyspark.sql.functions as F
df2 = df.select(F.col("name"), F.lit(True).alias("is_single"))
df2.show()
+----+---------+
|name|is_single|
+----+---------+
|Alex| true|
| Bob| true|
+----+---------+

Here, F.lit(True) returns a Column object, which has a method called alias(~) that assigns a label to the column.

Note that you could append a new column of constants using the withColumn(~) method:

import pyspark.sql.functions as F
df = spark.createDataFrame([["Alex", 20], ["Bob", 30]], ["name", "age"])
df_new = df.withColumn("is_single", F.lit(True))
df_new.show()
+----+---+---------+
|name|age|is_single|
+----+---+---------+
|Alex| 20| true|
| Bob| 30| true|
+----+---+---------+

Creating a column whose values are based on a condition in PySpark

We can also use lit(~) to create a column whose values depend on some condition:

import pyspark.sql.functions as F
status_col = F.when(F.col("age") <= 20, F.lit("junior")).otherwise(F.lit("senior"))
df3 = df.withColumn("status", status_col)
df3.show()
+----+---+------+
|name|age|status|
+----+---+------+
|Alex| 20|junior|
| Bob| 30|senior|
+----+---+------+

Note the following:

  • we are using the when(~) and otherwise(~) pattern to fill the values of the column conditionally.

  • we are using the withColumn(~) method to append a new column named status.

  • the F.lit("junior") can actually be replaced by "junior" - this is just to demonstrate one usage of lit(~).

Published by Isshin Inada