Get the Current Spark Context Settings/Configurations

Raymond Tang | 4/5/2019

In Spark, there are a number of settings and configurations you can specify, including application properties and runtime parameters. The full list is documented in the official configuration guide:

https://spark.apache.org/docs/latest/configuration.html

Get current configurations

To retrieve all the current configurations, you can use the following code (Python):

from pyspark.sql import SparkSession

appName = "PySpark Partition Example"
master = "local[8]"

# Create Spark session.
spark = SparkSession.builder \
    .appName(appName) \
    .master(master) \
    .getOrCreate()

configurations = spark.sparkContext.getConf().getAll()
for conf in configurations:
    print(conf)

* The above code applies to Spark 2.0 and later versions.

The output of the above code looks similar to the following:

('spark.rdd.compress', 'True')
('spark.app.name', 'PySpark Partition Example')
('spark.app.id', 'local-1554464117837')
('spark.master', 'local[8]')
('spark.serializer.objectStreamReset', '100')
('spark.executor.id', 'driver')
('spark.submit.deployMode', 'client')
('spark.driver.host', 'Raymond-Alienware')
('spark.driver.port', '11504')
('spark.ui.showConsoleProgress', 'true')

