Connect to Hive via HiveServer2 JDBC Driver


This post shows you how to connect to HiveServer2 via Hive JDBC driver in Java.

The way to connect to HiveServer1 is very similar, though the driver names are different:

Version       Driver Name
HiveServer1   org.apache.hadoop.hive.jdbc.HiveDriver
HiveServer2   org.apache.hive.jdbc.HiveDriver

The JDBC connection URL will also be different:

Version       Connection URL
HiveServer1   jdbc:hive://…
HiveServer2   jdbc:hive2://…

Add dependency

First, add the Hive JDBC dependency to your project POM file (if you are using Maven):

<project xmlns="http://maven.apache.org/POM/4.0.0"
     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>kontext.tech</groupId>
    <artifactId>examples</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <dependencies>
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-jdbc</artifactId>
            <version>3.1.1</version>
        </dependency>
    </dependencies>
</project>

Alternatively, you can reference the Hive JDBC JAR file directly on your classpath (for example, the hive-jdbc standalone JAR that ships with your Hive distribution).

Create a Java class to connect to HiveServer2

Now we can connect to HiveServer2 just as we would to any other JDBC-supported database.

The following is a sample class for your reference:

package examples;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;

public class HiveJdbcExample {

    /**
     * HiveServer2 JDBC driver name
     */
    private static String driverName = "org.apache.hive.jdbc.HiveDriver";

    public static void main(String[] args) throws SQLException {
        // Register the HiveServer2 JDBC driver.
        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        }

        // Connect to HiveServer2 (user "hive" with an empty password in this example).
        Connection conn = DriverManager.getConnection("jdbc:hive2://localhost:10000", "hive", "");
        Statement stmt = conn.createStatement();

        // Show databases
        String sql = "SHOW databases";
        System.out.println("Running: " + sql);
        ResultSet rs = stmt.executeQuery(sql);
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }
        rs.close();

        // Query a sample table and print each row with its column names.
        String sql2 = "select * from test_db.test_table";
        System.out.println("Running: " + sql2);
        ResultSet rs2 = stmt.executeQuery(sql2);
        ResultSetMetaData rsmd = rs2.getMetaData();
        int columnsNumber = rsmd.getColumnCount();
        while (rs2.next()) {
            for (int i = 1; i <= columnsNumber; i++) {
                if (i > 1)
                    System.out.print(",  ");
                String columnValue = rs2.getString(i);
                System.out.print(rsmd.getColumnName(i) + " " + columnValue);
            }
            System.out.println("");
        }
        rs2.close();

        stmt.close();
        conn.close();
    }
}
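If you prefer automatic resource cleanup, the same logic can also be written with try-with-resources. The sketch below is only an illustration and reuses the same placeholder URL, user and table name (test_db.test_table) as the example above:

package examples;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;

public class HiveJdbcTryWithResources {

    public static void main(String[] args) throws SQLException {
        // Same placeholder URL, user and table as the example above.
        String url = "jdbc:hive2://localhost:10000";
        String sql = "select * from test_db.test_table";

        // try-with-resources closes the connection, statement and result set
        // automatically, even if an exception is thrown.
        try (Connection conn = DriverManager.getConnection(url, "hive", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(sql)) {

            ResultSetMetaData rsmd = rs.getMetaData();
            int columnsNumber = rsmd.getColumnCount();
            while (rs.next()) {
                for (int i = 1; i <= columnsNumber; i++) {
                    if (i > 1) {
                        System.out.print(",  ");
                    }
                    System.out.print(rsmd.getColumnName(i) + " " + rs.getString(i));
                }
                System.out.println();
            }
        }
    }
}

With a JDBC 4 compatible driver on the classpath, the explicit Class.forName call is usually not required, as the driver registers itself automatically.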

For detailed information about connection URLs (as the HiveServer2 service can run in different modes), please refer to the following page published by Apache:

https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients#HiveServer2Clients-ConnectionURLs
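As a rough illustration of how the URL shape changes between modes, the following sketch shows common variants. The host names, ports, database, httpPath and Kerberos principal below are placeholders for your own environment, not values from this post:

public class HiveUrlExamples {

    // Default binary transport mode (TCP, port 10000 by default).
    static final String BINARY_MODE =
            "jdbc:hive2://hiveserver-host:10000/default";

    // HTTP transport mode; transportMode and httpPath must match the server configuration.
    static final String HTTP_MODE =
            "jdbc:hive2://hiveserver-host:10001/default;transportMode=http;httpPath=cliservice";

    // Kerberos-secured HiveServer2; the principal must match the server's Kerberos principal.
    static final String KERBEROS_MODE =
            "jdbc:hive2://hiveserver-host:10000/default;principal=hive/_HOST@EXAMPLE.COM";

    public static void main(String[] args) {
        System.out.println(BINARY_MODE);
        System.out.println(HTTP_MODE);
        System.out.println(KERBEROS_MODE);
    }
}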

Sample output

(Screenshot: sample output of the program)

In HiveServer2 Web UI, we can also find the query history:

(Screenshot: query history in the HiveServer2 Web UI)

Clicking the Drilldown link, you can also view detailed information about the query:

(Screenshot: query details drilldown page)
