Basic Spark Package
If you are not running Scala under spark-shell, you will likely need to import some basic Spark packages, such as the following.
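A minimal sketch of the typical imports for a SparkContext-based application (the exact set depends on what your program uses):

```scala
// Core Spark imports for a SparkContext-based application
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
```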
You will also need to define a SparkConf, setting at minimum an application name and a master URL.
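A sketch consistent with the note below (the application name "MyApp" is illustrative):

```scala
// Configure the application name and the master URL.
// "local[3]" runs Spark in local mode with up to 3 threads.
val sparkConf = new SparkConf()
  .setAppName("MyApp")    // illustrative name; use your own
  .setMaster("local[3]")
```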
Note: in this example, local[3] means Spark runs in local mode on the driver machine only, using up to 3 threads (cores).
You will then need to create a SparkContext based on the sparkConf you created earlier.
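Continuing the sketch above:

```scala
// Create the SparkContext from the configuration defined earlier
val sc = new SparkContext(sparkConf)
```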
Note: if you start $SPARK_HOME/bin/spark-shell, or use the Spylon kernel in a Jupyter notebook, a SparkContext named sc is created for you automatically, and you do not need to create one yourself.
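For example, inside spark-shell you can use the pre-created sc directly (a trivial illustration):

```scala
// sc already exists in spark-shell; no setup required
sc.parallelize(1 to 10).count()  // returns 10
```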