spark group by alias

To rename an aggregated column in PySpark, chain .alias() onto the aggregate expression inside .agg():

# Use the functions namespace rather than importing max directly,
# which would shadow Python's built-in max().
from pyspark.sql import functions as F

# Without .alias(), the output column would be named "max(diff)";
# the alias renames it to "maxDiff".
df.groupBy("column").agg(F.max("diff").alias("maxDiff"))
Posted by: Guest on January-18-2022
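To see what this computes without a Spark cluster, here is a plain-Python sketch of the same group-by-max semantics (toy data, not Spark itself; the alias only affects the name of the output column):

```python
# Illustration of what groupBy("column").agg(max("diff")) computes:
# the maximum "diff" per distinct "column" value. Toy data below.
rows = [
    {"column": "a", "diff": 3},
    {"column": "a", "diff": 7},
    {"column": "b", "diff": 5},
]

max_diff = {}
for r in rows:
    key = r["column"]
    # keep the largest diff seen so far for each group
    if key not in max_diff or r["diff"] > max_diff[key]:
        max_diff[key] = r["diff"]

# .alias("maxDiff") would only rename the aggregated column,
# so the Spark result would have columns ("column", "maxDiff").
print(max_diff)  # {'a': 7, 'b': 5}
```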
