Answers for "pyspark check current hadoop version"


from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
sc = spark.sparkContext

# VersionInfo is reached through the JVM gateway; _jvm is an internal
# (underscore-prefixed) attribute and may change between Spark releases.
print(f"Hadoop version = {sc._jvm.org.apache.hadoop.util.VersionInfo.getVersion()}")
Posted by: Guest on February-11-2021
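If you need to branch on the Hadoop version programmatically (for example, to use features only available in Hadoop 3+), a small pure-Python helper can compare version strings without touching the JVM again. This is a sketch that assumes the string returned by `VersionInfo.getVersion()` is a plain dotted numeric version like `"3.3.4"`; strings with suffixes (e.g. `"3.3.4-amzn-1"`) would need extra handling.

```python
def parse_version(version: str) -> tuple:
    """Split a dotted version string like '3.3.4' into an integer tuple."""
    return tuple(int(part) for part in version.split("."))

def at_least(version: str, minimum: str) -> bool:
    """True if `version` is at least `minimum`, using tuple comparison."""
    return parse_version(version) >= parse_version(minimum)

# Example with strings of the shape getVersion() typically returns:
print(at_least("3.3.4", "3.0.0"))  # True
print(at_least("2.7.3", "3.0.0"))  # False
```

Tuple comparison handles multi-digit components correctly (`"3.10.0"` sorts above `"3.9.0"`), which a plain string comparison would get wrong.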
