Get the Current Spark Context Settings/Configurations
In Spark, there are a number of settings/configurations you can specify, including application properties and runtime parameters; the full list is documented in the official configuration reference: https://spark.apache.org/docs/latest/configuration.html. To retrieve all the current configurations at runtime, you can use PySpark.
Last modified by Raymond 3 years ago.