Pass Environment Variables to Executors in PySpark

Sometimes it is necessary to pass environment variables to Spark executors. To pass an environment variable to executors, use the setExecutorEnv function of the SparkConf class.

Code snippet

In the following code snippet, an environment variable named ENV_NAME is set to the value 'ENV_Value'.

from pyspark import SparkConf
from pyspark.sql import SparkSession

appName = "Python Example - Pass Environment Variable to Executors"
master = 'yarn'

# Build the Spark configuration and set an environment variable for executors
conf = SparkConf().setMaster(master).setAppName(appName) \
    .setExecutorEnv('ENV_NAME', 'ENV_Value')

# Create the Spark session using the configuration
spark = SparkSession.builder.config(conf=conf) \
    .getOrCreate()
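
To confirm the variable is actually visible on the executors, a quick check like the one below can help. This is a minimal sketch that assumes the session created above: each task simply reads os.environ in the worker process it runs on.

import os

# Each task reads the environment of the executor process it runs on
def read_env(_):
    return os.environ.get('ENV_NAME', 'NOT SET')

# Run two tasks and collect what each one saw
result = spark.sparkContext.parallelize(range(2), 2).map(read_env).collect()
print(result)  # expected: ['ENV_Value', 'ENV_Value']

Under the hood, setExecutorEnv('ENV_NAME', 'ENV_Value') sets the configuration property spark.executorEnv.ENV_NAME, so the same effect can be achieved with --conf spark.executorEnv.ENV_NAME=ENV_Value when launching the application with spark-submit.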