
Redshift query abort without throwing exception in spark job #99

Open
hotstar-xia opened this issue Dec 31, 2021 · 1 comment

@hotstar-xia

Hi, we are using the following code to unload some query results from Redshift to S3:

      .option("url", s"${config.redshiftUrl}?user=${config.redshiftUser}&password=${config.redshiftPassword}")
      .option("query", sql)
      .option("tempdir", s"s3://${config.redshiftTempS3Bucket}/redshift/temp_data/download/$segId")
      .option("forward_spark_s3_credentials", "true")
      .load()

But every time Redshift aborts our query, the Spark job keeps running and never receives an exception or signal. Do you have any suggestions on exception handling?

@88manpreet
Collaborator

That is most likely due to the Spark session not being closed. I assume you created or are reusing a SparkSession in the code above. Could you try adding sc.stop() to your exception handling?
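For reference, a minimal sketch of that suggestion, assuming a SparkSession named spark and the same config, sql, and segId values from the snippet above; outputPath and the logging are placeholders:

      // Wrap the read and any downstream action so the session is always
      // stopped, even when Redshift aborts the query.
      // `spark`, `config`, `sql`, `segId`, and `outputPath` are assumed
      // to exist in the surrounding code.
      try {
        val df = spark.read
          .format("io.github.spark_redshift_community.spark.redshift")
          .option("url", s"${config.redshiftUrl}?user=${config.redshiftUser}&password=${config.redshiftPassword}")
          .option("query", sql)
          .option("tempdir", s"s3://${config.redshiftTempS3Bucket}/redshift/temp_data/download/$segId")
          .option("forward_spark_s3_credentials", "true")
          .load()
        df.write.parquet(outputPath) // an action is needed to actually trigger the unload
      } catch {
        case e: Exception =>
          // Surface the failure instead of letting the job hang silently.
          System.err.println(s"Redshift read failed: ${e.getMessage}")
          throw e
      } finally {
        spark.stop() // close the session as suggested above (sc.stop() if you hold a SparkContext)
      }

Note that the read itself is lazy, so an exception from the aborted query typically surfaces only once an action (like the write above) runs.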
