Find Number of Rows of Hive Table via Scala

Kontext | 2022-08-23

Code description

To find the number of rows/records in a Hive table, we can use Spark SQL's count aggregate function. For background on the function itself, see Hive SQL - Aggregate Functions Overview with Examples.

This code snippet provides an example of Scala code that implements this. For simplicity, spark-shell is used directly. The snippet can also run in Jupyter Notebook or Zeppelin with a Spark kernel. Alternatively, it can be compiled into a jar file and submitted as a job via spark-submit.
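For the spark-submit route, a minimal standalone application could look like the sketch below. The object name CountRows is a placeholder assumption; the table name test_db.test_table is taken from the snippet that follows.

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch of a standalone job for spark-submit.
// CountRows is a placeholder name; test_db.test_table comes from the snippet below.
object CountRows {
  // Keep the query in a constant so it is easy to reuse.
  val countSql = "select count(*) from test_db.test_table"

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("CountRows")
      .enableHiveSupport() // needed so Spark can resolve Hive tables
      .getOrCreate()

    // count(*) returns a single row; read the bigint from column 0.
    val rows: Long = spark.sql(countSql).first().getLong(0)
    println(s"test_db.test_table has $rows rows")

    spark.stop()
  }
}
```

After packaging (for example with sbt package), it could be submitted with something like `spark-submit --class CountRows count-rows.jar`.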


Code snippet

val sql = "select count(*) from test_db.test_table"
//sql: String = select count(*) from test_db.test_table

val df = spark.sql(sql)
//df: org.apache.spark.sql.DataFrame = [count(1): bigint]

df.show()
/*
+--------+
|count(1)|
+--------+
|       5|
+--------+
*/

val df1 = spark.sql("select * from test_db.test_table")
//df1: org.apache.spark.sql.DataFrame = [id: int, attr: string]

df1.groupBy().count()
//res5: org.apache.spark.sql.DataFrame = [count: bigint]

println(df1.groupBy().count())
//[count: bigint]

df1.groupBy().count().show()
/*
+-----+
|count|
+-----+
|    5|
+-----+
*/
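As a shortcut not shown in the transcript above, DataFrame.count() is an action that returns the row count directly as a Long, avoiding the extra show() step; spark.table is assumed here to point at the same test table.

```scala
// count() triggers the job and returns a Long directly,
// unlike groupBy().count(), which returns a DataFrame that still needs show().
val n: Long = spark.table("test_db.test_table").count()
println(s"test_db.test_table has $n rows")
```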