Answers for "generate column aggregations in pyspark"


PySpark groupBy sum

from pyspark.sql import functions as func

# Sum order_item_subtotal for each order_item_order_id
prova_df.groupBy("order_item_order_id").agg(func.sum("order_item_subtotal")).show()
Posted by: Guest on February-26-2020
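A minimal, self-contained sketch of the same pattern, runnable on a local Spark install; the SparkSession setup, the toy data, and the DataFrame name orders_df are assumptions added for illustration, not part of the original answer.

from pyspark.sql import SparkSession
from pyspark.sql import functions as func

# Local session purely for demonstration purposes.
spark = SparkSession.builder.master("local[*]").appName("groupby-sum-example").getOrCreate()

# Hypothetical toy data matching the column names used in the snippet above.
orders_df = spark.createDataFrame(
    [(1, 299.98), (1, 199.99), (2, 250.0)],
    ["order_item_order_id", "order_item_subtotal"],
)

# Sum the subtotal per order id; alias() gives the result column a readable name.
orders_df.groupBy("order_item_order_id") \
    .agg(func.sum("order_item_subtotal").alias("order_total")) \
    .show()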

Pyspark Aggregation on multiple columns

from pyspark.sql.functions import avg, count

# Group on both columns, then average "percent" and count rows per group.
df.groupBy("year", "sex").agg(avg("percent"), count("*"))
Posted by: Guest on September-14-2020
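A hedged, self-contained sketch of grouping on two columns; the SparkSession setup, the example data, and the DataFrame name people_df are assumptions added for illustration.

from pyspark.sql import SparkSession
from pyspark.sql.functions import avg, count

spark = SparkSession.builder.master("local[*]").appName("multi-col-agg-example").getOrCreate()

# Hypothetical toy data with the year/sex/percent columns used in the snippet above.
people_df = spark.createDataFrame(
    [(2019, "F", 52.0), (2019, "M", 48.0), (2020, "F", 53.5)],
    ["year", "sex", "percent"],
)

# Group on both columns, then compute the mean percent and the row count per group.
people_df.groupBy("year", "sex") \
    .agg(avg("percent").alias("avg_percent"), count("*").alias("n_rows")) \
    .show()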
