Convert string to date in Python / Spark


This code snippet shows how to convert a string to a date in Spark.

In PySpark, we can combine the unix_timestamp and from_unixtime functions: unix_timestamp parses the string into Unix epoch seconds using the given pattern, and from_unixtime formats those seconds back as a timestamp string.

Code snippet

from pyspark.sql import SparkSession
from pyspark.sql.functions import unix_timestamp, from_unixtime

appName = "PySpark Date Parse Example"
master = "local"

# Create Spark session running locally.
spark = SparkSession.builder \
    .appName(appName) \
    .master(master) \
    .getOrCreate()

df = spark.createDataFrame([('2019-06-01',)], ['DATE_STR_COL'])
# select returns a new DataFrame; chain show() (or assign the result)
# rather than discarding it and showing the original df.
df.select(from_unixtime(unix_timestamp(df.DATE_STR_COL, 'yyyy-MM-dd')).alias('DATE_COL')).show()
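
Note that from_unixtime returns a formatted string, not a true date or timestamp type. On Spark 2.2 and above, a minimal alternative sketch using the built-in to_date and to_timestamp functions, which parse the string directly into DateType and TimestampType columns (reusing the df created above):

from pyspark.sql.functions import to_date, to_timestamp

# to_date yields a DateType column (day precision);
# to_timestamp yields a TimestampType column.
parsed = df.select(
    to_date(df.DATE_STR_COL, 'yyyy-MM-dd').alias('DATE_COL'),
    to_timestamp(df.DATE_STR_COL, 'yyyy-MM-dd').alias('TS_COL')
)
parsed.show()
parsed.printSchema()

Here to_date truncates the value to the day (2019-06-01), while to_timestamp keeps the time component (2019-06-01 00:00:00); printSchema confirms the column types.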