Stopping a Spark cluster
Once we are done developing on our cluster, it is best practice to shut it down in order to conserve resources.
How to do it...
This section walks through the steps to stop the SparkSession.
- Execute the following script:
spark.stop()
- Confirm that the session has closed by executing the following script (a combined, runnable sketch of both steps appears after this list):
sc.master
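The two steps above can be combined into a single, self-contained script. The following is a minimal sketch, assuming a local SparkSession built with the standard builder; the master URL and application name are illustrative assumptions, not part of the recipe:

from pyspark.sql import SparkSession

# Build (or retrieve) a local SparkSession; these builder settings are
# assumptions for a standalone example.
spark = SparkSession.builder \
    .master("local[*]") \
    .appName("StopClusterDemo") \
    .getOrCreate()

sc = spark.sparkContext
print(sc.master)  # prints local[*] while the cluster is alive

# Step 1: stop the SparkSession (this also stops the underlying SparkContext)
spark.stop()

# Step 2: any Spark command now fails, confirming the shutdown
try:
    sc.parallelize([1, 2, 3]).count()
except Exception as err:
    print("Cluster is down:", err)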
How it works...
This section explains how to confirm that a Spark cluster has been shut down.
- If the cluster has been shut down, you will receive an error message when executing another Spark command in the notebook, as seen in the following screenshot:

[Screenshot: error message returned when executing a Spark command after the cluster has been shut down]
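Rather than relying on the error alone, you can also check for an active session programmatically. The following is a small sketch assuming Spark 3.x, where SparkSession.getActiveSession() is available in PySpark:

from pyspark.sql import SparkSession

# After spark.stop(), no session is registered as active, so the
# lookup returns None.
if SparkSession.getActiveSession() is None:
    print("No active SparkSession: the cluster has been shut down.")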
There's more...
Shutting down Spark clusters may not be as critical when working in a local environment; however, neglecting to do so can prove costly when Spark is deployed in a cloud environment where you are charged for compute power.
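One defensive pattern worth considering (an assumption on our part, not part of the recipe) is to register the shutdown with Python's atexit module, so the session is stopped even if the notebook kernel exits before you call spark.stop() yourself:

import atexit
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Ensure spark.stop() runs when the Python process exits, even if the
# notebook is closed without stopping the session explicitly.
atexit.register(spark.stop)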