Spark SQL provides a number of functions to construct dates, timestamps and intervals from datetime parts such as years, months, days, hours, minutes and seconds.
## Functions
The following table lists the available functions.
| Function | Since version | Purpose |
|---|---|---|
| make_date(year, month, day) | 3.0.0 | Create a date from year, month and day fields. |
| make_dt_interval([days[, hours[, mins[, secs]]]]) | 3.2.0 | Create a DayTimeIntervalType duration from days, hours, mins and secs fields. |
| make_timestamp(year, month, day, hour, min, sec[, timezone]) | 3.0.0 | Create a timestamp from year, month, day, hour, min, sec and timezone fields. |
| make_ym_interval([years[, months]]) | 3.2.0 | Create a year-month interval from years and months fields. |
| make_interval([years[, months[, weeks[, days[, hours[, mins[, secs]]]]]]]) | 3.0.0 | Create an interval from years, months, weeks, days, hours, mins and secs. |
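Note that make_interval folds its weeks field into the day count of the result. A minimal Python sketch (illustrative only, using the standard library rather than Spark) shows the same arithmetic for the day-time portion of make_interval(5, 4, 3, 2, 1, 10, 33):

```python
from datetime import timedelta

# The weeks and days arguments of make_interval combine into a single
# day count: 3 weeks + 2 days = 23 days, which is why Spark reports
# "23 days" for make_interval(5, 4, 3, 2, 1, 10, 33).
weeks, days = 3, 2
day_time = timedelta(weeks=weeks, days=days, hours=1, minutes=10, seconds=33)
print(day_time.days)  # 23
```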
## Code snippet
The following code snippet provides some examples of using these functions.
```sql
spark-sql> select make_date(2022, 6, 16);
2022-06-16
spark-sql> select make_dt_interval(20, 2, 30, 40);
20 02:30:40.000000000
spark-sql> select make_timestamp(2022, 6, 16, 13, 4, 24, 'Australia/Melbourne');
2022-06-16 13:04:24
spark-sql> select make_timestamp(2022, 6, 16, 13, 4, 24, 'Australia/Sydney');
2022-06-16 13:04:24
spark-sql> select make_ym_interval(10, 9);
10-9
spark-sql> select make_interval(5, 4, 3, 2, 1, 10, 33);
5 years 4 months 23 days 1 hours 10 minutes 33 seconds
```
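The two make_timestamp queries above print the same value because Australia/Melbourne and Australia/Sydney share the same offset (AEST, UTC+10) in June. A minimal Python sketch, using the standard library rather than Spark, illustrates the same behavior of interpreting the datetime fields as local time in the given zone:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Illustrative only: build the same local time in both zones used by the
# spark-sql examples above.
melbourne = datetime(2022, 6, 16, 13, 4, 24, tzinfo=ZoneInfo("Australia/Melbourne"))
sydney = datetime(2022, 6, 16, 13, 4, 24, tzinfo=ZoneInfo("Australia/Sydney"))

# In June (Australian winter, no daylight saving) both zones are on
# AEST (UTC+10), so the two instants are identical.
print(melbourne.utcoffset())  # 10:00:00
print(melbourne == sydney)    # True
```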