HiveServer2 Cannot Connect to Hive Metastore Resolutions/Workarounds

Published 2 years ago.

Since Hive 3.x, a new authentication feature for HiveServer2 clients has been added. When starting the HiveServer2 service (Hive version 3.0.0), you may encounter errors like the following:

HiveServer2 metastore.RetryingMetaStoreClient: RetryingMetaStoreClient trying reconnect as [username] (auth:SIMPLE)

By looking into the Hive log details, you may find the following recommendation:

metastore.HiveMetaStore: Not authorized to make the get_current_notificationEventId call. You can try to disable metastore.metastore.event.db.notification.api.auth

By default, that configuration (named in the log message above) is set as follows:

       <property>
         <name>metastore.metastore.event.db.notification.api.auth</name>
         <value>true</value>
         <description>
           Should metastore do authorization against database notification related APIs such as get_next_notification.
           If set to true, then only the superusers in proxy settings have the permission.
         </description>
       </property>

So to fix the issue, follow these two steps.

Step 1 - Set up proxy settings

In Hadoop core-site.xml configuration file, add the following configurations:

     <property>
       <name>hadoop.proxyuser.$superuser.hosts</name>
       <value>*</value>
     </property>
     <property>
       <name>hadoop.proxyuser.$superuser.groups</name>
       <value>*</value>
     </property>

Replace $superuser with your user account. In my case, I am running as user fahao, so I added the following configuration:

     <property>
       <name>hadoop.proxyuser.fahao.hosts</name>
       <value>*</value>
     </property>
     <property>
       <name>hadoop.proxyuser.fahao.groups</name>
       <value>*</value>
     </property>

If I don’t add the above configuration, my user account won’t be able to impersonate the hive user:

java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException( User: fahao is not allowed to impersonate hive (state=08S01,code=0)
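As a quick sanity check after editing core-site.xml, a short standalone script like the one below (a hypothetical helper, not part of Hive or Hadoop) can confirm that both proxyuser entries are present for your user:

```python
# Sanity-check that a core-site.xml fragment contains the proxyuser
# host/group entries for a given superuser. Standalone sketch only.
import xml.etree.ElementTree as ET

def proxyuser_entries(core_site_xml: str, superuser: str) -> dict:
    """Return the hadoop.proxyuser.<user>.hosts/.groups values found."""
    root = ET.fromstring(core_site_xml)
    wanted = {
        f"hadoop.proxyuser.{superuser}.hosts",
        f"hadoop.proxyuser.{superuser}.groups",
    }
    found = {}
    for prop in root.iter("property"):
        name = prop.findtext("name")
        if name in wanted:
            found[name] = prop.findtext("value")
    return found

# Inline example fragment; replace `fahao` with your own user account.
sample = """
<configuration>
  <property>
    <name>hadoop.proxyuser.fahao.hosts</name>
    <value>*</value>
  </property>
  <property>
    <name>hadoop.proxyuser.fahao.groups</name>
    <value>*</value>
  </property>
</configuration>
"""
print(proxyuser_entries(sample, "fahao"))
# → {'hadoop.proxyuser.fahao.hosts': '*', 'hadoop.proxyuser.fahao.groups': '*'}
```

If the returned dictionary is missing either key, the impersonation error above will keep appearing after restart.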

Step 2 (optional) - Update Hive metastore configuration

Update your Hive configuration file hive-site.xml to include the following settings:

       <property>
         <name>metastore.metastore.event.db.notification.api.auth</name>
         <value>false</value>
         <description>
           Should metastore do authorization against database notification related APIs such as get_next_notification.
           If set to true, then only the superusers in proxy settings have the permission.
         </description>
       </property>

Without this change, you need to specify a username when connecting to Hive using Beeline:

beeline> !connect jdbc:hive2://localhost:10000/default -n username


If anonymous connections are allowed, the user name shows as anonymous in the HiveServer2 web UI; otherwise the connected username shows there.
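The connection string has the shape jdbc:hive2://&lt;host&gt;:&lt;port&gt;/&lt;database&gt;. When scripting connections against several environments, a small parser like this can be handy; it is plain string handling (not a Hive API), with the default port 10000 and default database assumed when omitted:

```python
# Parse a hive2 JDBC URL of the shape jdbc:hive2://<host>:<port>/<database>.
# Generic string handling sketch; defaults (port 10000, db "default") are
# the stock HiveServer2 defaults and may differ in your deployment.
def parse_hive2_url(url: str) -> dict:
    prefix = "jdbc:hive2://"
    if not url.startswith(prefix):
        raise ValueError(f"not a hive2 JDBC URL: {url}")
    rest = url[len(prefix):]
    hostport, _, database = rest.partition("/")
    host, _, port = hostport.partition(":")
    return {
        "host": host,
        "port": int(port or 10000),
        "database": database or "default",
    }

print(parse_hive2_url("jdbc:hive2://localhost:10000/default"))
# → {'host': 'localhost', 'port': 10000, 'database': 'default'}
```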

After these changes, restart the Hadoop services and then restart the Hive services, and the problem should be resolved (the last two commands below are equivalent ways to start HiveServer2):

$HIVE_HOME/bin/hive --service metastore &
$HIVE_HOME/bin/hive --service hiveserver2 &
$HIVE_HOME/bin/hiveserver2 &
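To verify both services came up, you can check that their ports accept connections. The ports below are the stock defaults (9083 for the metastore, 10000 for the HiveServer2 Thrift endpoint); adjust them if your hive-site.xml overrides them. This is a generic reachability sketch, not a Hive tool:

```python
# Quick TCP reachability check for the Hive services, using the stock
# default ports (metastore 9083, HiveServer2 Thrift 10000).
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, port in [("metastore", 9083), ("hiveserver2", 10000)]:
    print(f"{name}: {'up' if port_open('localhost', port) else 'down'}")
```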

Try connecting to HiveServer2 via the Beeline CLI

Once the service starts successfully, you can view the HiveServer2 Web UI via the following URL (10002 is the default web UI port):

http://localhost:10002/

You can use Beeline to query Hive databases:

beeline> !connect jdbc:hive2://localhost:10000/default
Connecting to jdbc:hive2://localhost:10000/default
Enter username for jdbc:hive2://localhost:10000/default: hive
Enter password for jdbc:hive2://localhost:10000/default:
Connected to: Apache Hive (version 3.0.0)
Driver: Hive JDBC (version 3.0.0)
0: jdbc:hive2://localhost:10000/default> show databases;
+----------------+
| database_name  |
+----------------+
| default        |
| test_db        |
+----------------+
2 rows selected (0.859 seconds)
Last modified by Raymond, 9 months ago.