Answers for "pyspark dataframe to spark dataframe"

0

convert pandas dataframe to spark dataframe

import pandas as pd
from pyspark.sql import SparkSession

filename = '<path to file>'
spark = SparkSession.builder.appName('pandasToSpark').getOrCreate()
# Assuming file is csv
pandas_df = pd.read_csv(filename)
spark_df = spark.createDataFrame(pandas_df)
Posted by: Guest on March-14-2021
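
A possible refinement of the answer above (a sketch, not part of the original answer): on Spark 3.x, enabling Apache Arrow usually speeds up the pandas-to-Spark conversion, and passing an explicit schema avoids surprises from type inference. The sample data and column names below are made up for illustration.

import pandas as pd
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, IntegerType

spark = SparkSession.builder.appName('pandasToSpark').getOrCreate()
# Arrow-based conversion (Spark 3.x config name; older versions use spark.sql.execution.arrow.enabled)
spark.conf.set('spark.sql.execution.arrow.pyspark.enabled', 'true')

# Illustrative pandas DataFrame
pandas_df = pd.DataFrame({'name': ['a', 'b'], 'age': [1, 2]})

# Explicit schema instead of relying on inference from pandas dtypes
schema = StructType([
    StructField('name', StringType(), True),
    StructField('age', IntegerType(), True),
])
spark_df = spark.createDataFrame(pandas_df, schema=schema)
spark_df.show()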
-1

spark to pandas

pandas_df = some_df.toPandas()
Posted by: Guest on March-11-2021
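
Worth noting for the answer above: toPandas() collects the entire Spark DataFrame into driver memory, so on large data it helps to filter or limit on the cluster first. A minimal sketch under that assumption (some_df and its columns are invented here for illustration):

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName('sparkToPandas').getOrCreate()
some_df = spark.createDataFrame([(1, 'a'), (2, 'b'), (3, 'c')], ['id', 'value'])

# Reduce the data on the executors before pulling it to the driver
small_df = some_df.filter(F.col('id') > 1).limit(1000)
pandas_df = small_df.toPandas()
print(pandas_df.head())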
