Spark Application Anatomy
By Raymond · 10 months ago
This diagram depicts the relationships among a Spark application, its jobs, stages, and tasks.
A Spark application can contain multiple actions, and each action triggers one Spark job. To run the computation within a job, Spark may need multiple stages, because some actions cannot be completed within a single stage (a shuffle introduces a stage boundary). Each stage consists of many tasks, and the task count is determined by the number of partitions in the underlying RDD/DataFrame. A task is the smallest unit of parallelism in Spark.