Answers for "use a particular column in Aggregate pyspark"


pyspark groupby sum

from pyspark.sql import functions as func

# prova_df is the source DataFrame; group by order id and sum each order's item subtotals
prova_df.groupBy("order_item_order_id").agg(func.sum("order_item_subtotal")).show()
Posted by: Guest on February-26-2020
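
The answer above assumes prova_df already exists. A minimal self-contained sketch, using a hypothetical orders DataFrame with the same column names and made-up values, might look like this:

from pyspark.sql import SparkSession
from pyspark.sql import functions as func

# Hypothetical example data; column names mirror the answer above.
spark = SparkSession.builder.appName("groupby-sum-example").getOrCreate()
orders = spark.createDataFrame(
    [(1, 299.98), (1, 129.99), (2, 199.99), (2, 250.0), (4, 49.98)],
    ["order_item_order_id", "order_item_subtotal"],
)

# Group by the order id and sum the subtotal for each order.
orders.groupBy("order_item_order_id") \
    .agg(func.sum("order_item_subtotal").alias("order_total")) \
    .show()

The alias("order_total") call is optional; without it Spark names the aggregated column sum(order_item_subtotal).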
