Get the Current Spark Context Settings/Configurations


In Spark, there are a number of settings and configurations you can specify, including application properties and runtime parameters. The full list is documented on the official Spark configuration page:

https://spark.apache.org/docs/latest/configuration.html
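Application properties can also be set explicitly when building the session. The following is a minimal sketch (the property value used here is an arbitrary example, not a recommendation):

from pyspark.sql import SparkSession

# Set an application property while building the session.
spark = SparkSession.builder \
    .appName("Config Example") \
    .master("local[2]") \
    .config("spark.executor.memory", "2g") \
    .getOrCreate()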

Get current configurations

To retrieve all the current configurations, you can use the following code (Python):

from pyspark.sql import SparkSession

appName = "PySpark Partition Example"
master = "local[8]"

# Create Spark session.
spark = SparkSession.builder \
    .appName(appName) \
    .master(master) \
    .getOrCreate()

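# getConf().getAll() returns all settings as a list of (key, value) tuples.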
configurations = spark.sparkContext.getConf().getAll()
for conf in configurations:
    print(conf)

* The above code applies to Spark 2.0 and later, where SparkSession is the entry point.

The output of the above code looks similar to the following:

('spark.rdd.compress', 'True')
('spark.app.name', 'PySpark Partition Example')
('spark.app.id', 'local-1554464117837')
('spark.master', 'local[8]')
('spark.serializer.objectStreamReset', '100')
('spark.executor.id', 'driver')
('spark.submit.deployMode', 'client')
('spark.driver.host', 'Raymond-Alienware')
('spark.driver.port', '11504')
('spark.ui.showConsoleProgress', 'true')
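To look up a single property instead of printing the whole list, you can query it by key. A minimal sketch (the keys below are standard Spark properties, and spark refers to the session created above):

# Read a single property from the SparkContext configuration.
print(spark.sparkContext.getConf().get('spark.app.name'))

# Runtime SQL configurations can be read or changed via spark.conf.
print(spark.conf.get('spark.sql.shuffle.partitions'))
spark.conf.set('spark.sql.shuffle.partitions', '20')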
