This document provides an overview of installing and deploying Apache Spark:

1. Spark can be installed from prebuilt packages or built from source.
2. Spark runs in local mode or on a standalone, YARN, or Mesos cluster; an application connects to the cluster by creating a SparkContext.
3. Jobs are deployed to the cluster with the spark-submit script, which distributes the packaged application JAR and its dependencies to the cluster (the JAR itself must be built beforehand, e.g. with sbt or Maven).