Pass Environment Variables to Executors in PySpark
Sometimes it is necessary to pass environment variables to Spark executors. To do so, use the setExecutorEnv function of the SparkConf class.
Code snippet
In the following code snippet, an environment variable named ENV_NAME is set to the value 'ENV_Value'.
from pyspark import SparkConf
from pyspark.sql import SparkSession

appName = "Python Example - Pass Environment Variable to Executors"
master = 'yarn'

# Create Spark session with the executor environment variable set
conf = SparkConf().setMaster(master).setAppName(appName) \
    .setExecutorEnv('ENV_NAME', 'ENV_Value')
spark = SparkSession.builder.config(conf=conf).getOrCreate()
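To confirm that the variable actually reaches the executors, you can read it back with os.environ inside a function shipped to the cluster. The sketch below is for illustration only: report_env is a hypothetical helper, and the commented driver-side call assumes the spark session created above is available.

```python
import os

def report_env(partition):
    # Runs on each executor: yields the value of ENV_NAME from the
    # executor process's environment for every record in the partition.
    value = os.environ.get('ENV_NAME', '<not set>')
    for _ in partition:
        yield value

# Driver side (requires the Spark session created above):
# values = spark.sparkContext.parallelize(range(4), 2) \
#     .mapPartitions(report_env).distinct().collect()
# values should contain only 'ENV_Value' if setExecutorEnv took effect.
```

Note that setExecutorEnv affects only executor processes; the driver process keeps its own environment, so checking os.environ on the driver will not show the variable unless it was set there separately.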