Spark Application Anatomy
This diagram depicts the relationships among a Spark application, its jobs, stages, and tasks.
One Spark application can contain multiple actions, and each action triggers one Spark job. A job may in turn span multiple stages, because any computation that involves a shuffle cannot complete within a single stage; Spark splits the job at each shuffle boundary. Each stage consists of many tasks, and the task count is determined by the number of partitions in the underlying RDD/DataFrame. A task is the smallest unit of parallelism in Spark, as illustrated in the sketch below.
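The following is a minimal sketch of this hierarchy in Scala. The application name, master URL, and data are illustrative assumptions, not part of the original page; the job, stage, and task counts follow from Spark's documented behavior (one job per action, a new stage per shuffle, one task per partition).

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical demo application; names and sizes are assumptions.
object AnatomyDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("anatomy-demo")   // one Spark application
      .master("local[4]")
      .getOrCreate()

    // 8 partitions means each stage over this RDD runs 8 tasks.
    val rdd = spark.sparkContext.parallelize(1 to 1000, numSlices = 8)

    // Action 1: count() triggers Job 0. No shuffle is needed,
    // so the whole job runs as a single stage of 8 tasks.
    val n = rdd.count()

    // reduceByKey is a wide transformation: it introduces a shuffle
    // boundary, so the next action's job is split into two stages.
    val counts = rdd.map(x => (x % 10, 1)).reduceByKey(_ + _)

    // Action 2: collect() triggers Job 1 with two stages
    // (the map side before the shuffle, the reduce side after it).
    val result = counts.collect()

    println(s"count = $n, distinct keys = ${result.length}")
    spark.stop()
  }
}
```

Running this and opening the Spark web UI should show two jobs (one per action), with the second job divided into two stages at the shuffle; the per-stage task counts track the partition counts described above.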