Basic Spark Package
If you are not running Scala under spark-shell, you will likely need to import some basic Spark packages, such as:
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
You will also need to define a SparkConf:
val sparkConf = new SparkConf()
  .setAppName("getTweets").setMaster("local[3]")
Note that in this example, local[3] means Spark runs in local mode on the driver node only, using up to 3 worker threads (CPU cores).
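For reference, local uses a single thread and local[*] uses all available cores on the machine. Below is a minimal sketch of the same configuration with local[*]; the variable name and app name are illustrative placeholders, not from the original example.

// Assumes the SparkConf import shown above; "getTweetsAllCores" is a placeholder app name
val sparkConfAllCores = new SparkConf()
  .setAppName("getTweetsAllCores")
  .setMaster("local[*]")   // use all available cores instead of a fixed count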
You will then need to create a SparkContext based on the sparkConf you created earlier:
val sc = new SparkContext(sparkConf)
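Once the SparkContext exists, it is the entry point for creating RDDs. The following is a minimal sketch of how sc might be used; the sample data and printed output are illustrative, not part of the original example.

// Create a small RDD from a local collection and run a simple action
val numbers = sc.parallelize(1 to 100)
println(s"Sum of 1..100 = ${numbers.sum()}")

// Stop the context when the application finishes
sc.stop()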
Note that if you start $SPARK_HOME/bin/spark-shell, or use the Spylon kernel in a Jupyter notebook, a SparkContext named sc is created for you automatically, so you do not need to run
val sc = new SparkContext(sparkConf)
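In that case you can use the pre-created sc directly. A quick sketch of what you might type at the spark-shell prompt to confirm it is available; the values shown in comments are typical but depend on your setup.

// Inspect the automatically created SparkContext
sc.appName                                      // e.g. "Spark shell"
sc.master                                       // e.g. "local[*]"

// Use it right away
sc.parallelize(Seq("hello", "spark")).count()   // returns 2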