GPU621/Apache Spark Fall 2022
===The implementation of Spark has the following steps===
#The SparkContext applies for computing resources from the Cluster Manager.
#The Cluster Manager receives the request and starts allocating resources, creating and activating Executors on the worker nodes.
#The SparkContext sends the program/application code (jar package, Python file, etc.) and tasks to the Executors. Each Executor runs its tasks and returns the results.
#The SparkContext collects the results.
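The four steps above can be sketched in a few lines of PySpark running in local mode. This is a minimal, illustrative example (it assumes PySpark is installed, e.g. via <code>pip install pyspark</code>; the application name and data are made up):

<pre>
from pyspark import SparkConf, SparkContext

# Step 1: creating a SparkContext requests computing resources from the
# cluster manager (here the built-in local-mode manager with 2 threads).
conf = SparkConf().setMaster("local[2]").setAppName("SketchApp")
sc = SparkContext(conf=conf)

# Steps 2-3: the cluster manager allocates executors; calling an action
# ships the task (the lambda below) to the executors for execution.
rdd = sc.parallelize(range(10))
squares = rdd.map(lambda x: x * x)

# Step 4: collect() gathers the executors' results back into the driver.
result = squares.collect()
print(result)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]

sc.stop()
</pre>

In a real cluster the only change is the master URL (e.g. a YARN or standalone-cluster address instead of <code>local[2]</code>); the driver/executor flow is the same.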
==Apache Spark Core API==