Load JSON File into Teradata via BTEQ


BTEQ is a Teradata utility tool that can be used to run Teradata SQL statements, including DDL and DML. It can also be used to import data from text files into Teradata databases, and it works with XML and JSON files too. Like TPT and FASTLOAD, it can run in both batch and interactive modes.

This article provides an example of how to load a JSON file into Teradata using BTEQ.

Sample JSON file

Create a sample JSON file named sample.json with the following content:

{"menu": {
  "id": "file",
  "value": "File",
  "popup": {
    "menuitem": [
      {"value": "New", "onclick": "CreateNewDoc()"},
      {"value": "Open", "onclick": "OpenDoc()"},
      {"value": "Close", "onclick": "CloseDoc()"}
    ]
  }
}}

Sample table

Use the following statement to create the target table:

CREATE TABLE TestDb.BTEQ_JSON_LOAD
(ID INT, JSON_DOC JSON) 
PRIMARY INDEX(ID);

Create a CSV index file

Create a CSV file named json-docs.csv with the following content:

sample.json,1

Create a BTEQ script file

Create a BTEQ script file named load-json.bteq with the following content:

LOGON 192.168.119.128/DBC,DBC;
DATABASE TestDb;
.IMPORT VARTEXT ',' DEFERCOLS=1 FILE='json-docs.csv';
USING (JSON_DOC CLOB AS DEFERRED, ID VARCHAR(10))
INSERT INTO BTEQ_JSON_LOAD (CAST(:ID AS INTEGER), :JSON_DOC);
LOGOFF ;

The above script reads the json-docs.csv file and uses the first attribute as the JSON file name to create a deferred LOB object; BTEQ reads the content of that file and inserts it when the INSERT statement runs. The second attribute supplies the value for the ID column.
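If more than one JSON document needs to be loaded, the same pattern can be extended: list one file name and ID per line in the CSV file and add a .REPEAT * command before the USING request so that BTEQ executes the INSERT once for each record in the import file. A minimal sketch, assuming a second file named sample2.json exists:

sample.json,1
sample2.json,2

.IMPORT VARTEXT ',' DEFERCOLS=1 FILE='json-docs.csv';
.REPEAT *
USING (JSON_DOC CLOB AS DEFERRED, ID VARCHAR(10))
INSERT INTO BTEQ_JSON_LOAD (CAST(:ID AS INTEGER), :JSON_DOC);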

info As the JSON data type is not supported in BTEQ, we need to define the column as CLOB in the USING clause. There are limitations on JSON content size in Teradata: for UNICODE, the maximum length is 8388096 characters; for LATIN, the maximum length is 16776192 characters.
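For reference, the JSON column in the target table can also be declared with an explicit maximum length and character set, which makes the size limit visible in the table definition. This is a sketch only; the table name BTEQ_JSON_LOAD_UNICODE is an example and is not used elsewhere in this article:

CREATE TABLE TestDb.BTEQ_JSON_LOAD_UNICODE
(ID INT, JSON_DOC JSON(8388096) CHARACTER SET UNICODE)
PRIMARY INDEX(ID);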

Run the script file

Run the following command to execute the above BTEQ script:

bteq < load-json.bteq
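To keep a record of the load, the standard output and error streams can also be redirected to a log file (the log file name below is just an example):

bteq < load-json.bteq > load-json.log 2>&1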

Output:

bteq < load-json.bteq
BTEQ 16.10.00.02 Mon Sep 14 17:19:08 2020 PID: 14424

+---------+---------+---------+---------+---------+---------+---------+----
LOGON 192.168.119.128/DBC,

 *** Logon successfully completed.
 *** Teradata Database Release is 16.10.00.03
 *** Teradata Database Version is 16.10.00.03
 *** Transaction Semantics are BTET.
 *** Session Character Set Name is 'ASCII'.

 *** Total elapsed time was 1 second.

+---------+---------+---------+---------+---------+---------+---------+----
DATABASE TestDb;

 *** New default database accepted.
 *** Total elapsed time was 1 second.


+---------+---------+---------+---------+---------+---------+---------+----
.IMPORT VARTEXT ',' DEFERCOLS=1 FILE='json-docs.csv';
+---------+---------+---------+---------+---------+---------+---------+----
USING (JSON_DOC CLOB AS DEFERRED, ID VARCHAR(10))
INSERT INTO BTEQ_JSON_LOAD (CAST(:ID AS INTEGER), :JSON_DOC);

 *** Insert completed. One row added.
 *** Total elapsed time was 1 second.


+---------+---------+---------+---------+---------+---------+---------+----
LOGOFF ;
 *** You are now logged off from the DBC.
 *** BTEQ exiting due to EOF on stdin.
 *** Exiting BTEQ...
 *** RC (return code) = 0

Verify the data

Run the following SQL statement to verify the records:

SELECT *
FROM TestDb.BTEQ_JSON_LOAD;

The output shows one record, with ID 1 and the JSON_DOC column containing the sample JSON document.
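Once the data is loaded, the JSON document can be queried with Teradata's built-in JSON methods (see the reference below for details). For example, the following sketch uses the JSONExtractValue method to pull a single scalar value out of the stored document:

SELECT ID, JSON_DOC.JSONExtractValue('$.menu.id') AS menu_id
FROM TestDb.BTEQ_JSON_LOAD;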


References

Query JSON data

Refer to the page Extract JSON Data via SQL Functions in Teradata for details about how to extract JSON data using Teradata built-in functions.

More details about BTEQ IMPORT command
