# Run a program to estimate pi

### Run a program to estimate pi, which is 3.14159…

First Scala code with Apache Spark using Eclipse IDE

Start the Eclipse IDE with the Scala plug-in installed, then click New -> Scala Project.

![](https://2100080250-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M1PNTHVApkPePuMdTu3%2F-M1fdGdwxEoneHQS-NCN%2F-M1fdrUA-KKLYwQvZQKN%2Fscala-2.jpg?alt=media\&token=ba851a5f-1892-4bce-a604-2184765e219e)

Enter the project name spark, then click Finish.

![](https://2100080250-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M1PNTHVApkPePuMdTu3%2F-M1feQHbi2ftnbF6qRBY%2F-M1feqsneI1maqydde4J%2Fscala-3.jpg?alt=media\&token=c0438a46-b3a2-4026-ac81-537eff48af05)

Right-click the project name spark you just created, then click New -> Package.

![](https://2100080250-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M1PNTHVApkPePuMdTu3%2F-M1feQHbi2ftnbF6qRBY%2F-M1fey5SUUWoyj8WQMY6%2Fscala-4.jpg?alt=media\&token=0b3f4367-c7fd-4ace-b172-a2b19f5a00f3)

Enter the package name using the reverse-domain-name convention, as shown below, then click Finish.

![](https://2100080250-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M1PNTHVApkPePuMdTu3%2F-M1feQHbi2ftnbF6qRBY%2F-M1ff6Yw1T50qRWZ1VaV%2Fscala-5.jpg?alt=media\&token=f6aa6671-f4a1-48b6-9689-db8ecfb0a6cb)

Right-click the package name, select New -> Scala Object, and enter the fully qualified name (the reverse domain name plus the object name, which is pi in this instance), then click Finish.

![](https://2100080250-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M1PNTHVApkPePuMdTu3%2F-M1feQHbi2ftnbF6qRBY%2F-M1ffEDWsV1Z7Uk3IXP3%2Fscala-6.jpg?alt=media\&token=230d1ba0-30d3-4935-9258-0f390f42111f)

Copy the code below after the package line (in my example, “package com.jentekco.spark”):

```
import scala.math.random
import org.apache.spark._
import org.apache.log4j._
import org.apache.spark.sql.SparkSession

object pi {
  def main(args: Array[String]): Unit = {
    // Reduce log noise so the result is easy to spot in the console
    Logger.getLogger("org").setLevel(Level.ERROR)
    val spark = SparkSession
      .builder
      .master("local")
      .appName("Spark Pi")
      .getOrCreate()
    val slices = if (args.length > 0) args(0).toInt else 2
    val n = math.min(100000L * slices, Int.MaxValue).toInt // avoid overflow
    // Throw n - 1 random darts at the square [-1, 1] x [-1, 1]
    // and count how many land inside the unit circle
    val count = spark.sparkContext.parallelize(1 until n, slices).map { i =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y <= 1) 1 else 0
    }.reduce(_ + _)
    println(s"Pi is roughly ${4.0 * count / (n - 1)}") // 1 until n has n - 1 elements
    spark.stop()
  }
}
```
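The Spark job above is a distributed Monte Carlo estimate: points fall inside the unit circle with probability π/4, so four times the observed fraction approximates π. The same math can be sketched in plain Scala without Spark (a minimal, non-distributed sketch; `PiSketch`, `estimatePi`, and the fixed seed are illustrative names, not part of the project above):

```scala
import scala.util.Random

object PiSketch {
  // Estimate pi from n random darts; a fixed seed makes the run repeatable
  def estimatePi(n: Int, seed: Long = 42L): Double = {
    val rng = new Random(seed)
    val count = (1 to n).count { _ =>
      val x = rng.nextDouble() * 2 - 1
      val y = rng.nextDouble() * 2 - 1
      x * x + y * y <= 1 // dart landed inside the unit circle
    }
    4.0 * count / n
  }

  def main(args: Array[String]): Unit =
    println(s"Pi is roughly ${estimatePi(1000000)}")
}
```

Spark's version distributes the same dart-counting loop across `slices` partitions and sums the partial counts with `reduce(_ + _)`.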

![](https://2100080250-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M1PNTHVApkPePuMdTu3%2F-M1feQHbi2ftnbF6qRBY%2F-M1fg5IQ-D14UUuTzWFN%2Fscala-7.jpg?alt=media\&token=d4cde65c-544e-401a-a894-e1d4d4748f94)

The editor will show compilation errors because the Spark libraries are not yet on the build path. To fix them, right-click the project spark and choose Properties.

![](https://2100080250-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M1PNTHVApkPePuMdTu3%2F-M1feQHbi2ftnbF6qRBY%2F-M1fgDebIhPKVtx67Cf4%2Fscala-8.jpg?alt=media\&token=98ac8ffe-399f-42b8-b403-a161cd435bfb)

Choose Java Build Path -> Add External JARs.

![](https://2100080250-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M1PNTHVApkPePuMdTu3%2F-M1feQHbi2ftnbF6qRBY%2F-M1fgOhPckoODLA97J51%2Fscala-9.jpg?alt=media\&token=b4f1ffde-2599-4b5f-b042-b073de6b0c93)

Navigate to the jars subfolder of the Spark home folder, select all the JAR files, then click Open.

![](https://2100080250-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M1PNTHVApkPePuMdTu3%2F-M1feQHbi2ftnbF6qRBY%2F-M1fgUk60v8mwSVCKYuT%2Fscala-10.jpg?alt=media\&token=0367981b-ad19-4562-8f18-852344688b93)

The errors then go away, and the Scala code compiles successfully.

![](https://2100080250-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M1PNTHVApkPePuMdTu3%2F-M1feQHbi2ftnbF6qRBY%2F-M1fgcwFehe1YVOgyLE6%2Fscala-11.jpg?alt=media\&token=d2cd3260-a212-4419-9fbc-f690917d5925)

To run the Scala pi program, click Run -> Run Configurations.

![](https://2100080250-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M1PNTHVApkPePuMdTu3%2F-M74_Jwl3Nb-KVkgJJoG%2F-M76IV9lQ1_QIP3w2g4-%2Fscala-20.jpg?alt=media\&token=627dc8cf-3836-470f-9416-8c0c6e4079d8)

Then click Java Application (Scala compiles to JVM bytecode, so at the end of the day it runs as a Java application).

In the Name field, enter pi.

![](https://2100080250-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M1PNTHVApkPePuMdTu3%2F-M1feQHbi2ftnbF6qRBY%2F-M1fgiVpPMZefZoNT_Sa%2Fscala-12.jpg?alt=media\&token=8dfcba96-ffb3-4d2b-8889-88d5d63c3af3)

In the Project field, enter spark; in the Main class field, enter the fully qualified object name (in my example, com.jentekco.spark.pi); then click Run.

![](https://2100080250-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M1PNTHVApkPePuMdTu3%2F-M1feQHbi2ftnbF6qRBY%2F-M1fh1TNMeYaHE4xH5vk%2Fscala-13.jpg?alt=media\&token=91fc52ce-3aaa-4023-af55-6040143e4f37)

The output appears in the Console: “Pi is roughly 3.144195720978605”
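The result differs slightly from π on every run because this is a Monte Carlo estimate. As a rough accuracy sketch (assuming independent uniform samples): each dart lands inside the circle with probability \(p = \pi/4\), so the estimator and its standard error are

```latex
\hat{\pi} = \frac{4 \cdot \text{count}}{n},
\qquad
\operatorname{SE}(\hat{\pi}) = 4\sqrt{\frac{p(1-p)}{n}}
  = \sqrt{\frac{\pi(4-\pi)}{n}}
  \approx \frac{1.64}{\sqrt{n}}
```

With the default two slices, \(n \approx 200{,}000\), giving a standard error of about 0.004, which is consistent with the error of roughly 0.003 seen in the console output above.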

Now that you have developed and tested the Scala project, you need to package it as a JAR file for deployment to production. Right-click the project spark and select Export.

![](https://2100080250-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M1PNTHVApkPePuMdTu3%2F-M1feQHbi2ftnbF6qRBY%2F-M1fhAFGCKyoX19ioYtH%2Fscala-14.jpg?alt=media\&token=b7d5ed5a-29b2-47cd-8850-5a74cac0d641)

Choose JAR file and click Next.

![](https://2100080250-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M1PNTHVApkPePuMdTu3%2F-M1feQHbi2ftnbF6qRBY%2F-M1fhFEQ-GP9AM6p_TqS%2Fscala-15.jpg?alt=media\&token=6c208b81-10cb-4768-a29b-31f3735a8694)

Under the resources to export, select the correct project, in this case spark.

![](https://2100080250-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M1PNTHVApkPePuMdTu3%2F-M1feQHbi2ftnbF6qRBY%2F-M1fhMl-KHbn8roRkakz%2Fscala-16.jpg?alt=media\&token=602abcb7-30ff-40f3-a953-48aa4b2ff616)

Use the Browse button to choose a folder to export pi.jar to, then click Finish.

You can then run it on any PC that has Spark installed, using spark-submit:

```
%SPARK_HOME%\bin\spark-submit --class com.jentekco.spark.pi pi.jar
```
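On Linux or macOS, the equivalent command would look like the sketch below (untested here; the trailing `4` is an optional argument the program reads as `args(0)` to set the number of slices):

```shell
# Submit the packaged job to the local Spark installation
$SPARK_HOME/bin/spark-submit --class com.jentekco.spark.pi pi.jar 4
```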

Pi is roughly 3.14279571397857
