Answers for "pyspark dataframe to pandas dataframe"


convert pandas dataframe to spark dataframe

import pandas as pd
from pyspark.sql import SparkSession

filename = 'path/to/file.csv'  # replace with your file path
spark = SparkSession.builder.appName('pandasToSpark').getOrCreate()
# Assuming the file is a CSV
pandas_df = pd.read_csv(filename)
spark_df = spark.createDataFrame(pandas_df)
Posted by: Guest on March-14-2021
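
If the goal is only to load the CSV into Spark, the pandas step can also be skipped and the file read directly with Spark's CSV reader. A minimal sketch, assuming the same placeholder path and a file with a header row:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('csvToSpark').getOrCreate()
# Read the CSV straight into a Spark DataFrame; header/inferSchema are optional
spark_df = spark.read.csv('path/to/file.csv', header=True, inferSchema=True)
spark_df.show()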

dataframe pandas to spark

import pandas as pd
from pyspark.sql import SparkSession

# Create PySpark SparkSession
spark = SparkSession.builder \
    .master("local[1]") \
    .appName("SparkByExamples.com") \
    .getOrCreate()

# Create Pandas DataFrame (matches the output shown below)
pandasDF = pd.DataFrame(
    [["Scott", 50], ["Jeff", 45], ["Thomas", 54], ["Ann", 34]],
    columns=["Name", "Age"],
)

# Create PySpark DataFrame from Pandas
sparkDF = spark.createDataFrame(pandasDF)
sparkDF.printSchema()
sparkDF.show()

#Outputs below schema & DataFrame

root
 |-- Name: string (nullable = true)
 |-- Age: long (nullable = true)

+------+---+
|  Name|Age|
+------+---+
| Scott| 50|
|  Jeff| 45|
|Thomas| 54|
|   Ann| 34|
+------+---+
Posted by: Guest on October-07-2021
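
The question title asks for the opposite direction as well: converting a PySpark DataFrame back to a pandas DataFrame is done with toPandas(), which collects every row to the driver, so it is only suitable for data that fits in driver memory. A minimal sketch using the same example data as above:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('sparkToPandas').getOrCreate()
sparkDF = spark.createDataFrame(
    [("Scott", 50), ("Jeff", 45), ("Thomas", 54), ("Ann", 34)],
    ["Name", "Age"],
)

# Collect the Spark DataFrame to the driver as a pandas DataFrame
pandasDF = sparkDF.toPandas()
print(pandasDF)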
