Variables used to store the start and end time of the period of interest for the metrics report
Moves the data recorded by the custom listener into a DataFrame and registers it as a temporary view for easier processing
Registers the custom Spark listener with the running Spark Context
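The listener mechanism can be illustrated outside Spark. This is a plain-Python sketch of the pattern, not the Spark API: `MetricsListener`, `Context`, and `StageMetrics` are hypothetical names standing in for a class extending Spark's `SparkListener` and for `SparkContext.addSparkListener`.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class StageMetrics:
    """Minimal stand-in for the metrics delivered on stage completion."""
    stage_id: int
    executor_run_time: int  # milliseconds

class MetricsListener:
    """Records stage-level metrics, mimicking an onStageCompleted hook."""
    def __init__(self) -> None:
        self.stage_metrics: List[StageMetrics] = []

    def on_stage_completed(self, metrics: StageMetrics) -> None:
        self.stage_metrics.append(metrics)

class Context:
    """Stand-in for a SparkContext that dispatches events to registered listeners."""
    def __init__(self) -> None:
        self._listeners: List[MetricsListener] = []

    def add_listener(self, listener: MetricsListener) -> None:
        self._listeners.append(listener)

    def fire_stage_completed(self, metrics: StageMetrics) -> None:
        for listener in self._listeners:
            listener.on_stage_completed(metrics)

# Register the listener with the live context, as the comment describes,
# then simulate one stage-completed event.
ctx = Context()
listener = MetricsListener()
ctx.add_listener(listener)
ctx.fire_stage_completed(StageMetrics(stage_id=0, executor_run_time=120))
```

Once registered, every stage-completed event lands in `listener.stage_metrics`, which is the raw data the later steps turn into a DataFrame.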
For the internal metrics, sum all the values; for the accumulables, compute the max value for each accId and name
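The two aggregation rules can be sketched in plain Python (the actual code presumably does this with Spark SQL aggregations; the function names and record shapes here are hypothetical):

```python
from collections import defaultdict

def aggregate_internal_metrics(records):
    """Sum every value per metric name (the rule for internal task metrics)."""
    totals = defaultdict(int)
    for rec in records:
        for name, value in rec.items():
            totals[name] += value
    return dict(totals)

def aggregate_accumulables(rows):
    """Keep the max value for each (accId, name) pair (the rule for accumulables)."""
    maxima = {}
    for acc_id, name, value in rows:
        key = (acc_id, name)
        maxima[key] = max(maxima.get(key, value), value)
    return maxima

# Internal metrics: values from different tasks are summed per metric name.
internal = aggregate_internal_metrics([
    {"executorRunTime": 100, "recordsRead": 10},
    {"executorRunTime": 250, "recordsRead": 5},
])

# Accumulables: repeated updates for the same (accId, name) keep only the max.
acc = aggregate_accumulables([
    (1, "bytesRead", 100),
    (1, "bytesRead", 300),
    (2, "shuffleWrite", 50),
])
```

The max rule fits accumulables because each update carries the accumulator's running total, so the largest observed value is the final one.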
Custom aggregations and post-processing of the metrics data
Shortcut to run a Spark workload and measure its metrics, modeled after spark.time()
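A minimal sketch of such a shortcut in plain Python, assuming the wrapper only times the closure and prints the elapsed time (the real method would also snapshot the collected metrics before and after the run; `run_and_measure` is a hypothetical name):

```python
import time

def run_and_measure(func):
    """Run `func`, print the elapsed wall-clock time, and return its result
    (modeled on Spark's spark.time() convenience method)."""
    begin = time.time()
    result = func()
    elapsed_ms = round((time.time() - begin) * 1000)
    print(f"Time taken: {elapsed_ms} ms")
    return result

# Example: measure a small computation while keeping its result.
total = run_and_measure(lambda: sum(range(1000)))
```

Returning the closure's result keeps the wrapper transparent, so it can wrap any action without changing the surrounding code.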
Helper method to save the data; we expect small amounts of data, so collapsing to 1 partition seems OK
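In Spark this would presumably be a `coalesce(1)` before the write; the idea can be sketched in plain Python by merging per-partition record lists into one sequence before producing a single CSV output (`save_as_single_partition` and the record shape are hypothetical):

```python
import csv
import io

def save_as_single_partition(partitions, sink):
    """Collapse a list of per-partition record lists into one sequence and
    write it to a single CSV output, mirroring coalesce(1) before a write."""
    merged = [row for part in partitions for row in part]  # one "partition"
    writer = csv.writer(sink)
    writer.writerow(["name", "value"])  # header row
    writer.writerows(merged)
    return merged

# Three records spread over two "partitions" end up in one output.
buf = io.StringIO()
rows = save_as_single_partition([[("a", 1)], [("b", 2), ("c", 3)]], buf)
```

With small metric tables, a single output file is easier to consume than many shard files, at the cost of serializing the write through one task.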