Basic Spark Packages

If you are not running Scala under spark-shell, you will likely need to import some basic Spark packages, such as

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

You will also need to define a SparkConf:

val sparkConf = new SparkConf()
  .setAppName("getTweets").setMaster("local[3]")

Note that in this example, local[3] means Spark runs in local mode on the driver node only, using up to 3 threads (CPU cores).
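
For reference, here is a brief sketch of a couple of other common master settings; the cluster host and port below are placeholders, not values from this example.

// run locally using as many threads as there are logical cores
val confLocalAll = new SparkConf().setAppName("getTweets").setMaster("local[*]")
// connect to a standalone cluster (host and port are hypothetical)
val confCluster = new SparkConf().setAppName("getTweets").setMaster("spark://master-host:7077")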

You will then need to create a SparkContext based on the sparkConf you created earlier:

val sc = new SparkContext(sparkConf)
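
As a quick sanity check, here is a minimal sketch that uses the new sc to build an RDD and run a simple action; the data is made up for illustration.

// distribute a small local collection as an RDD and sum it
val rdd = sc.parallelize(1 to 10)
println(rdd.sum())   // prints 55.0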

Note that if you start $SPARK_HOME/bin/spark-shell, or use the Spylon kernel in a Jupyter notebook, a SparkContext named sc is created for you automatically, and you do not need to run

val sc = new SparkContext(sparkConf)
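
If you want to confirm the pre-created context, you can inspect it directly in the shell, for example:

// appName and master are standard fields on any SparkContext
println(sc.appName)
println(sc.master)   // e.g. local[*] for a default spark-shell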
