Depending on the operating system (Mac, Linux, Windows, or a distribution such as CentOS), Spark can be installed in different locations, which makes it difficult to identify the installed Spark version. Yet we sometimes need to determine exactly which version of Apache Spark is present on our system.
This article introduces the most practical approaches to checking the Spark version. Wait no more; jump right in for further information!
Spark Overview
Apache Spark is a unified analytics engine for processing massive amounts of data.
It offers high-level APIs in Scala, Java, R, and Python, as well as an optimized engine that supports general execution graphs.
Not just that, it also supports a wide range of higher-level tools, such as the pandas API on Spark for pandas workloads, GraphX for graph processing, Spark SQL for SQL and structured data processing, MLlib for machine learning, and Structured Streaming for incremental stream processing.
How To Check Spark Version?
Method #1. Check Spark Version Using Command Line
Checking the version in Spark is not much different from doing so in other languages or tools. You can pass the --version option to spark-submit, spark-shell, or spark-sql to find what you need:
spark-submit --version
spark-shell --version
spark-sql --version
Each of the aforementioned commands (spark-submit, spark-shell, and spark-sql) prints output like the example below, which shows the installed Spark version.
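For reference, the version banner looks roughly like this (the Spark version number below is illustrative; yours will reflect your own installation):

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.1.2
      /_/

Using Scala version 2.12.10, OpenJDK 64-Bit Server VM, 11.0.13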
As can be seen, the banner also shows the Java version and the Scala version (2.12.10) along with the Spark version. Since I use OpenJDK for Java, the version displayed is OpenJDK 64-Bit Server VM, 11.0.13.
Method #2. Check Spark Version Using an IDE Such as IntelliJ
If you are writing a Spark application and need to know the Spark version at runtime, another helpful approach is to read the version field of the SparkSession object (or of its SparkContext), which returns a String.
Run the following code:
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[1]")
  .appName("ittutoria.net")
  .getOrCreate()

// Both lines print the version as a String, e.g. "3.1.2"
println("Apache Spark Version : " + spark.version)
println("Apache Spark Version : " + spark.sparkContext.version)
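Since spark.version is just a String, you can also act on it at runtime, for example to branch on the major version. Below is a minimal sketch of that idea; the majorVersion helper is a hypothetical name of my own, not part of the Spark API:

// Hypothetical helper: extract the major version from a string like "3.1.2"
def majorVersion(v: String): Int = v.split("\\.")(0).toInt

if (majorVersion(spark.version) >= 3) {
  println("Running on Spark 3.x or newer")
} else {
  println("Running on an older Spark release")
}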
Method #3. Check Spark Version Using Spark Shell
Alternatively, if you are already inside spark-shell and wish to know the Spark version without leaving it, the sc.version command is exactly what you need.
Here, sc is a SparkContext variable that spark-shell creates by default. All you have to do is follow the steps below to locate the Spark version:
- Step 1: cd to $SPARK_HOME/bin
- Step 2: Kick off the spark-shell command
- Step 3: Enter sc.version or spark.version
sc.version returns the version as a String. The spark.version command in the shell produces the identical result.
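For illustration, here is what a typical session looks like inside spark-shell (again, the version number shown is illustrative):

scala> sc.version
res0: String = 3.1.2

scala> spark.version
res1: String = 3.1.2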
The Bottom Line
That covers everything you need to know about how to check the Spark version. I hope this guide benefits you one way or another. See you then!