Convert String to Date using Spark SQL


In Spark, the function to_date can be used to convert a string to a date. It has been available since Spark 1.5.0.

Code snippet

SELECT to_date('2020-10-23', 'yyyy-MM-dd');
SELECT to_date('23Oct2020', 'ddMMMyyyy');
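If a Spark session is not at hand, the same two conversions can be reproduced with Python's standard-library datetime.strptime. Note this is a substitute for illustration only: Python uses %-style directives (%Y-%m-%d, %d%b%Y) rather than the Java-style patterns Spark accepts.

```python
from datetime import datetime

# Spark pattern yyyy-MM-dd corresponds to %Y-%m-%d in Python
d1 = datetime.strptime('2020-10-23', '%Y-%m-%d').date()

# Spark pattern ddMMMyyyy corresponds to %d%b%Y in Python
d2 = datetime.strptime('23Oct2020', '%d%b%Y').date()

print(d1, d2)  # both parse to 2020-10-23
```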

Datetime patterns

Refer to the official Spark documentation for the full list of datetime patterns. Some of the symbols are summarized below.

Symbol | Meaning                      | Presentation | Examples
-------|------------------------------|--------------|---------------------------------------------
G      | era                          | text         | AD; Anno Domini
y      | year                         | year         | 2020; 20
M/L    | month-of-year                | month        | 7; 07; Jul; July
Q/q    | quarter-of-year              | number/text  | 3; 03; Q3; 3rd quarter
E      | day-of-week                  | text         | Tue; Tuesday
F      | aligned day of week in month | number(1)    | 3
h      | clock-hour-of-am-pm (1-12)   | number(2)    | 12
K      | hour-of-am-pm (0-11)         | number(2)    | 0
k      | clock-hour-of-day (1-24)     | number(2)    | 0
H      | hour-of-day (0-23)           | number(2)    | 0
V      | time-zone ID                 | zone-id      | America/Los_Angeles; Z; -08:30
z      | time-zone name               | zone-name    | Pacific Standard Time; PST
O      | localized zone-offset        | offset-O     | GMT+8; GMT+08:00; UTC-08:00
X      | zone-offset 'Z' for zero     | offset-X     | Z; -08; -0830; -08:30; -083015; -08:30:15
x      | zone-offset                  | offset-x     | +0000; -08; -0830; -08:30; -083015; -08:30:15
Z      | zone-offset                  | offset-Z     | +0000; -0800; -08:00
'      | escape for text              | delimiter    |
''     | single quote                 | literal      | '
[      | optional section start       |              |
]      | optional section end         |              |
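The symbols above follow Java's DateTimeFormatter conventions, so fields such as quarter-of-year (Q) and day-of-week (E) have no single-letter equivalent in Python's %-directives. As a point of comparison, here is a sketch of deriving the same two fields with Python's standard library; the quarter formula is plain arithmetic, not a Spark or Java API.

```python
from datetime import date

d = date(2020, 10, 23)

# E (day-of-week, text): strftime %a / %A give the short and full names
print(d.strftime('%a'), d.strftime('%A'))  # Fri Friday

# Q (quarter-of-year) has no strftime directive; derive it from the month
quarter = (d.month - 1) // 3 + 1
print(quarter)  # 4
```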
