Is there an issue with Spark SQL, or is my SQL query wrong?



scala> sqlContext.sql("select * from products " +
     | "group by product_category_id")

18/01/24 17:05:22 INFO ParseDriver: Parsing command: select * from products group by product_category_id
18/01/24 17:05:22 INFO ParseDriver: Parse Completed
org.apache.spark.sql.AnalysisException: expression 'product_id' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.;
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.failAnalysis(CheckAnalysis.scala:38)
at org.apache.spark.sql.catalyst.analysis.Analyzer.failAnalysis(Analyzer.scala:44)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.org$apache$spark$sql$catalyst$analysis$CheckAnalysis$class$$anonfun$$checkValidAggregateExpression$1(CheckAnalysis.scala:130)
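From the error message, I gather that every column in the select list must either appear in the group by clause or be wrapped in an aggregate function (or first(), as the message suggests). Assuming what I actually want is one row per category with a row count, I'd guess the valid form would look something like this (the product_count alias is just made up for illustration):

scala> sqlContext.sql("select product_category_id, count(*) as product_count " +
     | "from products group by product_category_id")

Is that the intended way to write it, or is "select * ... group by" supposed to work in Spark SQL?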

Thanks