Accumulators
Accumulators are variables shared across executors, typically used to add counters to your Spark program. If you have a Spark program and want to know the number of errors, the total records processed, or both, you can do it in two ways. One way is to add extra logic that counts errors or total records, which becomes complicated when you try to handle all possible computations. The other way is to leave the logic and code flow largely intact and add Accumulators.
Note
Accumulators can only be updated by adding to their value; tasks running on executors cannot read the value, only the driver can.
The following is an example of creating and using a long Accumulator via the SparkContext's longAccumulator function, which initializes the newly created accumulator variable to zero. Each time the accumulator is updated inside the map transformation, its value increases; at the end of the operation, the Accumulator holds a value of 351.
scala> val acc1 = sc.longAccumulator("acc1")
acc1: org.apache.spark.util.LongAccumulator = LongAccumulator(id: 10355, name: Some(acc1), value: 0)
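Creating the accumulator alone leaves it at zero; a transformation and an action must run before it changes. The following is a minimal sketch of the remaining flow, assuming the program adds up the values 1 through 26 (which sum to 351) inside the map; the input range is an illustrative assumption, not the original program's data.

scala> val rdd = sc.parallelize(1 to 26)   // assumed input: 1 + 2 + ... + 26 = 351

scala> rdd.map { x => acc1.add(x); x }.count()   // add() runs inside the map transformation

scala> acc1.value   // read back on the driver after the action completes
res2: Long = 351

Because map is lazy, the accumulator does not reflect any updates until an action such as count forces the transformation to execute.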