Summary

Once you have downloaded the software listed on the prior pages:

JDK 1.8.x

Apache Spark

winutils.exe (for Hadoop on Windows)

Anaconda Python 3.7

Scala IDE (Eclipse with Scala plugin)
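On Windows, these pieces are typically tied together with a few environment variables before Spark will start cleanly. The following is a hedged sketch only: the install paths, the Spark version in the folder name, and the `C:\hadoop\bin\winutils.exe` location are assumptions, so substitute the directories where you actually extracted each package.

```shell
:: Sketch of Windows environment variables for the Spark setup
:: (paths below are examples, not the required locations)

:: Point JAVA_HOME at the JDK 1.8.x install directory
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_202"

:: Point SPARK_HOME at the extracted Apache Spark folder
setx SPARK_HOME "C:\spark\spark-2.4.0-bin-hadoop2.7"

:: HADOOP_HOME must contain a bin\ folder holding winutils.exe
setx HADOOP_HOME "C:\hadoop"

:: Add the Java and Spark launchers to PATH for new shells
setx PATH "%PATH%;%JAVA_HOME%\bin;%SPARK_HOME%\bin;%HADOOP_HOME%\bin"
```

`setx` writes persistent user-level variables, so open a new Command Prompt afterward for the changes to take effect.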

I will demonstrate the setup of the complete Apache Spark development environment with Python and Scala live in class or in the video:

https://www.youtube.com/watch?v=53RwaMsjLaU&feature=youtu.be

From this point forward, you should have a working Apache Spark installation, complete with Python and Scala development environments: a Jupyter notebook with Python 3 and Scala kernels, and a standalone Eclipse Scala IDE.
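One quick way to sanity-check the finished environment is to run each tool's version or listing command from a fresh terminal. This is a sketch that assumes the launchers are already on your PATH; it only prints information and changes nothing.

```shell
# Confirm the JDK is visible (should report a 1.8.x version)
java -version

# Confirm Spark launches and report its version
spark-submit --version

# Start the interactive PySpark shell (exit with Ctrl+D or quit())
pyspark

# List installed Jupyter kernels; the Python 3 and Scala kernels
# should both appear here if the notebook setup succeeded
jupyter kernelspec list
```

If any command is not found, revisit the corresponding download and PATH configuration from the prior pages.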

You are now ready for software application development with Python and Scala on Apache Spark.
