The Real A Team

Compare To C++ + OpenMP
| Large || 157 || 50 || 14135
|}
 
[[Image:Graph_Spark_Vs_CPP.png|500px| ]]
 
This shows that the overhead of each parallelization technique is important to consider. Because Scala + Spark is designed to run across multiple computers, there is no benefit from it when running on a single machine, and as the chart above shows, the time taken to complete the program increases with the file size.
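
As a reference point, the snippet below is a minimal sketch (not the team's actual program) of how such a single-machine Spark run can be timed in Scala: it creates a <code>local[*]</code> context, runs a simple word-count style job over an input file, and prints the elapsed time. The input path <code>input.txt</code> and the per-line work are assumptions for illustration only; even in local mode, Spark still pays its job scheduling and serialization overhead, which is the overhead discussed above.

<source lang="scala">
import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch: time a simple Spark job running entirely on one machine.
// "input.txt" and the word-count work are illustrative assumptions.
object LocalSparkTiming {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("LocalSparkTiming")
      .setMaster("local[*]") // single-machine mode: Spark's scheduling and
                             // serialization overhead is still paid here
    val sc = new SparkContext(conf)

    val start = System.nanoTime()
    val distinctWords = sc.textFile("input.txt")   // hypothetical input file
      .flatMap(_.split("\\s+"))                    // split each line into words
      .map(word => (word, 1))
      .reduceByKey(_ + _)                          // count occurrences per word
      .count()                                     // force evaluation of the job
    val elapsedMs = (System.nanoTime() - start) / 1e6

    println(s"distinct words: $distinctWords, elapsed: $elapsedMs ms")
    sc.stop()
  }
}
</source>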
