GPU621/Apache Spark Fall 2022

448 bytes added, 03:46, 7 December 2022
no edit summary
===Workflow Chart===
[[File:Cluster-overview.png]]
===A Spark application requires the following components===
====1. Driver Program (SparkContext)====
The driver program runs the application's main() function and creates the SparkContext, which is the entry point for all Spark functionality.
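Below is a minimal sketch, in Scala, of a driver program that creates a SparkContext; the object name, application name, and the use of local mode are illustrative assumptions rather than part of the course material.
<syntaxhighlight lang="scala">
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical driver program; the names and the local master are placeholders.
object DriverExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("DriverExample")   // name shown in the cluster manager's UI
      .setMaster("local[*]")         // run locally with all cores for testing
    val sc = new SparkContext(conf)  // entry point to all Spark functionality

    // ... build RDDs and submit jobs here ...

    sc.stop()                        // release the application's resources
  }
}
</syntaxhighlight>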
====2. Cluster Manager====
The cluster manager allocates cluster resources (CPU cores and memory) to applications. Spark can run under its own standalone cluster manager, Hadoop YARN, Apache Mesos, or Kubernetes.
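As a rough illustration, the cluster manager an application uses is selected through the master URL in its configuration; the host name, port, and resource values in this sketch are placeholders.
<syntaxhighlight lang="scala">
import org.apache.spark.{SparkConf, SparkContext}

object ClusterManagerExample {
  def main(args: Array[String]): Unit = {
    // The master URL decides which cluster manager allocates resources.
    val conf = new SparkConf()
      .setAppName("ClusterManagerExample")
      .setMaster("spark://master-host:7077")  // Spark standalone cluster manager
      // .setMaster("yarn")                   // Hadoop YARN
      // .setMaster("local[*]")               // no cluster manager: local mode
      .set("spark.executor.memory", "2g")     // resources requested from the manager
      .set("spark.executor.cores", "2")

    val sc = new SparkContext(conf)
    // Jobs submitted through sc run on resources granted by the cluster manager.
    sc.stop()
  }
}
</syntaxhighlight>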
====3. Worker Node====
Worker nodes run the executors that carry out an application's tasks, and they report executor status, CPU, and memory information to the cluster manager.
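To make the division of labour concrete, here is a hedged word-count sketch: the closures passed to flatMap, map, and reduceByKey are shipped as tasks to executors hosted on the worker nodes, and only the aggregated counts are returned to the driver by collect(). The sample input and the local master are assumptions for demonstration.
<syntaxhighlight lang="scala">
import org.apache.spark.{SparkConf, SparkContext}

object WorkerNodeExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("WorkerNodeExample").setMaster("local[*]"))

    // These transformations are broken into tasks that run inside executors
    // on the worker nodes; the driver only sees the collected results.
    val lines  = sc.parallelize(Seq("to be or not to be", "that is the question"))
    val counts = lines
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.collect().foreach { case (word, n) => println(s"$word -> $n") }
    sc.stop()
  }
}
</syntaxhighlight>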
====4. Executor====
Executors are processes launched on the worker nodes. They run the tasks assigned by the driver, keep the application's data in memory or on disk, and return results and status updates to the driver.
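The resources given to each executor can be requested through standard Spark configuration properties; in the sketch below the values are placeholders, and spark.executor.instances only takes effect under cluster managers such as YARN or Kubernetes.
<syntaxhighlight lang="scala">
import org.apache.spark.{SparkConf, SparkContext}

object ExecutorConfigExample {
  def main(args: Array[String]): Unit = {
    // Placeholder figures; suitable values depend on the cluster's hardware.
    val conf = new SparkConf()
      .setAppName("ExecutorConfigExample")
      .setMaster("local[*]")
      .set("spark.executor.memory", "4g")    // heap size of each executor
      .set("spark.executor.cores", "2")      // concurrent tasks per executor
      .set("spark.executor.instances", "3")  // executor count (YARN / Kubernetes)

    val sc = new SparkContext(conf)
    // Tasks now run inside executors sized by the settings above.
    sc.stop()
  }
}
</syntaxhighlight>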