[SPARK-51050] [SQL] Add group by alias tests to the group-by.sql #49750
base: master
@@ -48,6 +48,20 @@ SELECT COUNT(DISTINCT b), COUNT(DISTINCT b, c) FROM (SELECT 1 AS a, 2 AS b, 3 AS
SELECT a AS k, COUNT(b) FROM testData GROUP BY k;
SELECT a AS k, COUNT(b) FROM testData GROUP BY k HAVING k > 1;

-- GROUP BY literal
SELECT a AS k FROM testData GROUP BY 'k';
SELECT 1 AS k FROM testData GROUP BY 'k';

-- GROUP BY alias with the function name
SELECT concat_ws(' ', a, b) FROM testData GROUP BY `concat_ws( , a, b)`;

-- GROUP BY column with name same as an alias used in the project list
SELECT 1 AS a FROM testData GROUP BY a;
SELECT 1 AS a FROM testData GROUP BY `a`;

-- GROUP BY implicit alias
SELECT 1 GROUP BY `1`;
-- GROUP BY alias with invalid col in SELECT list
SELECT a AS k, COUNT(non_existing) FROM testData GROUP BY k;
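For context, here is a minimal sketch of how the new alias tests are meant to resolve. It assumes testData is the two-column temporary view testData(a, b) that group-by.sql creates near the top of the file; the row values below are illustrative stand-ins, not the file's actual data, and the behavior notes assume spark.sql.groupByAliases=true (the default).

-- Illustrative stand-in for the view the test file defines up front (assumed schema).
CREATE OR REPLACE TEMPORARY VIEW testData AS
SELECT * FROM VALUES (1, 1), (1, 2), (2, 1), (2, 2) AS testData(a, b);

-- With groupByAliases enabled, the SELECT-list alias k resolves to column a,
-- so this groups by a and counts b within each group.
SELECT a AS k, COUNT(b) FROM testData GROUP BY k;

-- A quoted literal such as 'k' is a constant expression, not a reference to the
-- alias k; that distinction is what the "GROUP BY literal" cases exercise.
SELECT 1 AS k FROM testData GROUP BY 'k';

-- SELECT 1 implicitly names its single output column 1, so the back-quoted `1`
-- refers to that implicit alias; this is the "GROUP BY implicit alias" case.
SELECT 1 GROUP BY `1`;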
@@ -64,6 +78,10 @@ set spark.sql.groupByAliases=false;

-- Check analysis exceptions
SELECT a AS k, COUNT(b) FROM testData GROUP BY k;
SELECT 1 GROUP BY `1`;
Review comment: This is a duplicate.

Reply: Idea was to add some tests that should fail (with set spark.sql.groupByAliases=false;). I can remove them if needed.
-- GROUP BY column with name same as an alias used in the project list
SELECT 1 AS a FROM testData GROUP BY `a`;
mihailoale-db marked this conversation as resolved.
-- Aggregate with empty input and non-empty GroupBy expressions.
SELECT a, COUNT(1) FROM testData WHERE false GROUP BY a;
Review comment: Please address this.

Reply: Idea was to add some tests that should fail (with set spark.sql.groupByAliases=false;). I can remove them if needed.
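To make the intent of those negative tests concrete, here is a sketch of the behavior the author's reply describes, assuming the same testData view as above. The diff does not show expected output, so the comments below only note that analysis is expected to fail, not the exact error class.

-- Sketch (not part of the diff): with alias resolution in GROUP BY disabled,
-- a SELECT-list alias is no longer visible to GROUP BY, so per the author's
-- comment these queries are expected to fail analysis rather than resolve.
set spark.sql.groupByAliases=false;
SELECT a AS k, COUNT(b) FROM testData GROUP BY k;   -- expected: analysis exception
SELECT 1 GROUP BY `1`;                              -- expected: analysis exception
set spark.sql.groupByAliases=true;                  -- restore the default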