I'm trying to `SET hive.groupby.orderby.position.alias=true` in PySpark SQL, but I'm getting the following error:
AnalysisException: u"expression 'Dealer_D' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.;"
Which Spark version are you using, and could you post a short example of your SQL?
I'm wondering about the use case, because both `spark.sql.groupByOrdinal` and `spark.sql.orderByOrdinal` are true by default in Spark.