Spark and Scala Versions
Every version of Spark requires a specific version of Scala. When you write an application in Scala, you want to make sure your Scala version is compatible with the Spark version you are using.
For example,
Spark 2.4.5 is built and distributed to work with Scala 2.12 by default. (Spark can be built to work with other versions of Scala, too.) To write applications in Scala, you will need to use a compatible Scala version (e.g. 2.12.X).
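If you build with sbt, this means pinning a compatible Scala version alongside the Spark dependency. A minimal sketch of a `build.sbt` (the project name and exact version numbers are illustrative; match them to your cluster):

```scala
// build.sbt -- illustrative values; use the versions your cluster runs
name := "my-spark-app"        // hypothetical project name
version := "0.1.0"
scalaVersion := "2.12.10"     // must be binary-compatible with Spark's Scala (2.12.x here)
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.5" % "provided"
```

The `%%` operator appends the Scala binary version (here `_2.12`) to the artifact name, so sbt resolves `spark-core_2.12` and the Scala versions stay in sync.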
You can see the Spark and Scala versions by running `spark-shell`:
```
(base) [email protected]:~/go/src/root$ $SPARK_HOME/bin/spark-shell
20/04/12 23:46:14 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://master.hadoop.lan:4045
Spark context available as 'sc' (master = local[*], app id = local-1586760391208).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.4
      /_/

Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_191)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
```
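If you need the versions programmatically, for example to sanity-check compatibility from application code, the Scala runtime exposes its own version string; inside `spark-shell` you can additionally check `spark.version`. A minimal sketch (the object name is illustrative):

```scala
// Print the Scala version this application runs on.
object VersionCheck {
  def main(args: Array[String]): Unit = {
    // e.g. "2.11.12" -- the binary version is the first two components ("2.11")
    val scalaVersion = scala.util.Properties.versionNumberString
    val binaryVersion = scalaVersion.split('.').take(2).mkString(".")
    println(s"Scala version: $scalaVersion (binary: $binaryVersion)")
  }
}
```

The binary version (e.g. `2.11` vs `2.12`) is what must match between your application and your Spark distribution; patch versions may differ.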