Show Headings (Column Names) in spark-sql CLI Result


In the spark-sql CLI tool, query results are printed without headings (column names) by default. To display column names, we need to update the Spark setting spark.hadoop.hive.cli.print.header.

Update Spark defaults file

To apply the change to all spark-sql sessions, edit the file $SPARK_HOME/conf/spark-defaults.conf.

Add the following line into the file:

spark.hadoop.hive.cli.print.header true
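The edit can also be scripted. The snippet below appends the line and confirms it is present; a scratch file stands in for $SPARK_HOME/conf/spark-defaults.conf so nothing in a real installation is touched:

```shell
# Sketch: append the header setting to a spark-defaults.conf file.
# A temporary file is used here for illustration; in a real setup the
# target is $SPARK_HOME/conf/spark-defaults.conf.
conf_file=$(mktemp)
echo "spark.hadoop.hive.cli.print.header true" >> "$conf_file"

# Verify the entry was written
grep "spark.hadoop.hive.cli.print.header" "$conf_file"
```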

Pass the config value when starting a session

If you only want the setting to take effect for the current session, start the spark-sql CLI with the following argument:

spark-sql --conf "spark.hadoop.hive.cli.print.header=true"
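As an illustration (assuming a working Spark installation with spark-sql on the PATH), a one-off query run with this flag will include the column names in its output:

```shell
# Run a single query with headers enabled, then exit
spark-sql --conf "spark.hadoop.hive.cli.print.header=true" \
  -e "SELECT 1 AS id, 'spark' AS name"
# With the header enabled, the first row of output now
# contains the column names (id, name).
```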