Load JSON File into Teradata via BTEQ


BTEQ is a Teradata utility that can run Teradata SQL statements, including DDL and DML. It can also import data from text files into Teradata databases, and it works with XML and JSON files as well. Like TPT and FastLoad, it can run in both batch and interactive modes.

This article provides a step-by-step example of loading a JSON file into Teradata with BTEQ.

Sample JSON file

Create a sample JSON file named sample.json with the following content:

{"menu": {
  "id": "file",
  "value": "File",
  "popup": {
    "menuitem": [
      {"value": "New", "onclick": "CreateNewDoc()"},
      {"value": "Open", "onclick": "OpenDoc()"},
      {"value": "Close", "onclick": "CloseDoc()"}
    ]
  }
}}

Sample table

Use the following statement to create the target table:

CREATE TABLE TestDb.BTEQ_JSON_LOAD
(ID INTEGER, JSON_DOC JSON)
PRIMARY INDEX(ID);

Create a CSV index file

Create a CSV file named json-docs.csv with the following content:

sample.json,1
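
Each line pairs a JSON file name with the value to load into the ID column; the file name is resolved relative to the directory from which BTEQ is started. If there were more documents to load, the index file would simply list one per line. The extra file names below (sample2.json, sample3.json) are hypothetical and only illustrate the format:

sample.json,1
sample2.json,2
sample3.json,3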

Create a BTEQ script file

Create a BTEQ script file named load-json.bteq with the following content:

LOGON 192.168.119.128/DBC,DBC;
DATABASE TestDb;
.IMPORT VARTEXT ',' DEFERCOLS=1 FILE='json-docs.csv';
USING (JSON_DOC CLOB AS DEFERRED, ID VARCHAR(10))
INSERT INTO BTEQ_JSON_LOAD (CAST(:ID AS INTEGER), :JSON_DOC);
LOGOFF ;

The above script reads records from json-docs.csv; for each record, the first field is taken as the name of a JSON file to load as a deferred LOB, and the second field supplies the ID value. DEFERCOLS=1 tells BTEQ that the first column in the USING clause is transferred in deferred mode, i.e. the data file holds the file name rather than the content itself.
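
For unattended batch runs, the same script can be extended with basic error handling. The sketch below is only one way to do it and is not part of the original example; .QUIT 8 is an arbitrary non-zero return code that the calling shell can test to detect failures.

LOGON 192.168.119.128/DBC,DBC;
DATABASE TestDb;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.IMPORT VARTEXT ',' DEFERCOLS=1 FILE='json-docs.csv';
USING (JSON_DOC CLOB AS DEFERRED, ID VARCHAR(10))
INSERT INTO BTEQ_JSON_LOAD (CAST(:ID AS INTEGER), :JSON_DOC);
.IF ERRORCODE <> 0 THEN .QUIT 8;
LOGOFF;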

Note: Because the JSON data type cannot be used for import fields in BTEQ, the deferred column is declared as CLOB in the USING clause. Teradata also limits the size of a JSON value: the maximum length is 16776192 characters for character set LATIN and 8388096 characters for UNICODE.
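
If the documents are known to fit in the LATIN repertoire, the maximum length and character set can also be declared explicitly on the target column. The following is a sketch under that assumption; the table name BTEQ_JSON_LOAD_LATIN is made up for illustration:

CREATE TABLE TestDb.BTEQ_JSON_LOAD_LATIN
(ID INTEGER,
 JSON_DOC JSON(16776192) CHARACTER SET LATIN) -- maximum length for LATIN; a smaller limit also works
PRIMARY INDEX(ID);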

Run the script file

Run the following command to execute the above BTEQ script:

bteq < load-json.bteq

Output:

bteq < load-json.bteq
BTEQ 16.10.00.02 Mon Sep 14 17:19:08 2020 PID: 14424

+---------+---------+---------+---------+---------+---------+---------+----
LOGON 192.168.119.128/DBC,

 *** Logon successfully completed.
 *** Teradata Database Release is 16.10.00.03
 *** Teradata Database Version is 16.10.00.03
 *** Transaction Semantics are BTET.
 *** Session Character Set Name is 'ASCII'.

 *** Total elapsed time was 1 second.

+---------+---------+---------+---------+---------+---------+---------+----
DATABASE TestDb;

 *** New default database accepted.
 *** Total elapsed time was 1 second.


+---------+---------+---------+---------+---------+---------+---------+----
.IMPORT VARTEXT ',' DEFERCOLS=1 FILE='json-docs.csv';
+---------+---------+---------+---------+---------+---------+---------+----
USING (JSON_DOC CLOB AS DEFERRED, ID VARCHAR(10))
INSERT INTO BTEQ_JSON_LOAD (CAST(:ID AS INTEGER), :JSON_DOC);

 *** Insert completed. One row added.
 *** Total elapsed time was 1 second.


+---------+---------+---------+---------+---------+---------+---------+----
LOGOFF ;
 *** You are now logged off from the DBC.
 *** BTEQ exiting due to EOF on stdin.
 *** Exiting BTEQ...
 *** RC (return code) = 0

Verify the data

Run the following SQL statement to verify the records:

SELECT *
FROM TestDb.BTEQ_JSON_LOAD;

The result set contains one row: ID is 1 and JSON_DOC holds the document loaded from sample.json.
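
As a further check that the document was stored as valid JSON, a single attribute can be pulled out of it. The query below is a sketch using the JSONExtractValue method (the table alias T is needed to qualify the column for method invocation); the article linked in the references covers these functions in more detail:

SELECT ID, T.JSON_DOC.JSONExtractValue('$.menu.id') AS MENU_ID
FROM TestDb.BTEQ_JSON_LOAD T;

For the sample document, MENU_ID returns file.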


References

Query JSON data

Refer to the page Extract JSON Data via SQL Functions in Teradata for details on extracting JSON data with Teradata built-in functions.

More details about BTEQ IMPORT command
