Spark SQL - Date and Timestamp Functions


Current date

Function current_date() or current_date can be used to return the current date at the start of query evaluation. 


spark-sql> select current_date();
2021-01-09

spark-sql> select current_date;
2021-01-09

*Parentheses are optional for this function.

Current datetime

Function current_timestamp() or current_timestamp or now() can be used to return the current timestamp at the start of query evaluation.


spark-sql> select current_timestamp();
2021-01-09 17:03:51.586

spark-sql> select current_timestamp;
2021-01-09 17:04:09.158
spark-sql> select now();
2021-01-09 17:10:29.592

Add or subtract months

Function add_months(start_date, num_months) can be used to add months to or subtract months from a date.


spark-sql> select add_months(DATE'2020-01-01',5) AS 5M_LATER, add_months(DATE'2020-01-01', -5) AS 5M_BEFORE;
2020-06-01      2019-08-01

As shown in the code snippet, to subtract months, just pass the second parameter as a negative integer. 

Add or subtract days

Use functions date_add(start_date, num_days) and date_sub(start_date, num_days) to add days to or subtract days from a date.


spark-sql> select date_add(current_date, 5) 5D_LATER, date_sub(current_date, 5) 5D_BEFORE;
2021-01-14      2021-01-04

Date difference between two dates

Function datediff(end_date, start_date) returns the number of days between start_date and end_date.


spark-sql> select datediff(current_date, DATE'2021-01-01');
datediff(current_date(), DATE '2021-01-01')
8
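
Where the difference in months rather than days is needed, Spark SQL also provides months_between(end_date, start_date), which returns a fractional number of months (a brief sketch; whole months yield exact values):


spark-sql> select months_between(DATE'2021-03-01', DATE'2021-01-01');
2.0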

Last day of month

Function last_day(date) returns the date of the last day in the same month as the input date.


spark-sql> select last_day(current_date);
2021-01-31

First day of month

There is no function that directly returns the first day of the month, but it can be derived in several ways. For example, use function day(date) to extract the day-of-month component, subtract it from the date, and add one day back to get the first day of the month. 


spark-sql> select current_date - day(current_date) + 1;
date_add(date_sub(current_date(), day(current_date())), 1)
2021-01-01

spark-sql> select add_months(last_day(current_date)+1, -1);
add_months(date_add(last_day(current_date()), 1), -1)
2021-01-01
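
Alternatively, the built-in trunc(date, format) function truncates a date to the given unit directly; with format 'MM' it returns the first day of the month (a brief sketch, assuming the same 2021-01-09 session date as the examples above):


spark-sql> select trunc(current_date, 'MM');
2021-01-01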

Day of year, month, week

Functions dayofyear, day and dayofweek can be used to return the day number within the year, month or week respectively.


spark-sql> select dayofyear(current_date), day(current_date), dayofweek(current_date);
dayofyear(current_date())       day(current_date())     dayofweek(current_date())
9       9       7
Note: for the dayofweek function, 1 represents Sunday. 
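
If Monday-based numbering is preferred, recent Spark versions also provide a weekday function, which returns 0 for Monday through 6 for Sunday (a brief sketch; check that your Spark version includes this function, and note 2021-01-09 was a Saturday):


spark-sql> select weekday(DATE'2021-01-09');
5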

Timestamp to date

Use the cast function to convert a timestamp to a date.


spark-sql> select cast(current_timestamp as date);
CAST(current_timestamp() AS DATE)
2021-01-09

Date to timestamp

Similarly, use the cast function to convert a date to a timestamp.


spark-sql> select cast(current_date as timestamp);
CAST(current_date() AS TIMESTAMP)
2021-01-09 00:00:00
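
Besides cast, Spark SQL has dedicated functions for converting between strings and dates, for example to_date(str, fmt) and date_format(date, fmt) (a brief sketch; the format strings follow Spark's datetime pattern syntax):


spark-sql> select to_date('09/01/2021', 'dd/MM/yyyy');
2021-01-09

spark-sql> select date_format(DATE'2021-01-09', 'dd/MM/yyyy');
09/01/2021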

There are many other date and timestamp functions. Refer to the official Spark SQL documentation for details. 
