Entering Docker Container
Once the Docker containers are running, you can list them and see their info with:
docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
bb5da52bef7d spark_lab/spark:latest "/start-worker.sh" 18 minutes ago Up 18 minutes 0.0.0.0:32772->8080/tcp docker_dir_spark-worker_2
0a16636c3895 spark_lab/spark:latest "/start-worker.sh" 18 minutes ago Up 18 minutes 0.0.0.0:32773->8080/tcp docker_dir_spark-worker_1
9753b4389785 spark_lab/spark:latest "/start-master.sh" 18 minutes ago Up 18 minutes 0.0.0.0:7077->7077/tcp, 0.0.0.0:8080->8080/tcp spark-master
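Rather than copying the container ID by hand, you can pull it out of the listing programmatically. A minimal sketch, assuming the master container is named spark-master as in the output above; here the listing is fed in as saved text so the pipeline can be tried without Docker running, but in practice you would pipe docker ps directly (or use docker ps --filter "name=spark-master" --format '{{.ID}}'):

```shell
# Extract the master container's ID from a `docker ps`-style listing with awk.
# Real use: docker ps | awk '/spark-master$/ {print $1}'
listing='CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
bb5da52bef7d spark_lab/spark:latest "/start-worker.sh" 18 minutes ago Up 18 minutes 0.0.0.0:32772->8080/tcp docker_dir_spark-worker_2
9753b4389785 spark_lab/spark:latest "/start-master.sh" 18 minutes ago Up 18 minutes 0.0.0.0:7077->7077/tcp, 0.0.0.0:8080->8080/tcp spark-master'
printf '%s\n' "$listing" | awk '/spark-master$/ {print $1}'
# → 9753b4389785
```

The awk pattern matches only the row whose name column ends with spark-master, then prints the first field (the container ID), so the result can be passed straight to docker exec.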
You can enter a running container with:
docker exec -it [container id] bash
# You are on the host machine
docker exec -it 9753b4389785 bash
# You are now inside the container; run exit to return to the host
bash-4.4# ls
bin home mnt root spark start-worker.sh usr
dev lib opt run srv sys var
etc media proc sbin start-master.sh tmp
bash-4.4# spark/bin/spark-shell
20/04/23 01:18:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://spark-master:4040
Spark context available as 'sc' (master = local[*], app id = local-1587604698244).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.0.0-preview2
      /_/
Using Scala version 2.12.10 (OpenJDK 64-Bit Server VM, Java 1.8.0_212)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
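Note that the startup log above reports master = local[*], which means this shell runs jobs in-process rather than on the two workers. To attach the shell to the standalone cluster, pass a --master URL. A minimal sketch, assuming the master URL spark://spark-master:7077 inferred from the 7077 port mapping in the docker ps output above; the command is only assembled and printed here so it can be checked without a running cluster:

```shell
# Build the spark-shell invocation that targets the standalone master.
# The URL is an assumption based on the 7077 port mapping shown earlier.
SPARK_MASTER_URL="spark://spark-master:7077"
CMD="spark/bin/spark-shell --master $SPARK_MASTER_URL"
echo "$CMD"
# → spark/bin/spark-shell --master spark://spark-master:7077
```

Once attached this way, the master's Web UI on port 8080 (published to the host in the docker ps output above) should list the shell as a running application.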