Add Sentry failure callback to custom python operator #479
What

Airflow was not passing failed task status to Sentry correctly for `del_ins_` tasks in the `history_table_export` DAG. The team was not getting notified when these tasks failed, requiring manual inspection in Airflow to know whether they had failed.

Why
I think the callback was defined too deep inside the `del_ins` operator. There is a custom Python wrapper operator, called `build_del_ins_operator`, that calls the actual operators that delete and insert data in BigQuery. The callback was set on the delete/insert operators, but not on the Python wrapper. This change adds the callback to the wrapper to see if errors are then sent to Sentry.

Known limitations
Needs testing in the test environment to confirm that it works correctly.
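To illustrate the fix, here is a minimal, dependency-free sketch of the pattern: the failure callback must be attached to the outermost wrapper, not only to the inner operators it builds. The names (`build_del_ins_operator`, `sentry_failure_callback`) and the wrapper body are hypothetical stand-ins; in the real DAG this would be Airflow's `on_failure_callback` argument on the wrapper operator, with the callback reporting to Sentry.

```python
# Minimal sketch (no Airflow dependency): a wrapper that forwards a
# failure callback to itself, so failures surface even when the inner
# delete/insert steps do not report them.
captured = []

def sentry_failure_callback(context):
    # In the real DAG this would report the exception to Sentry.
    captured.append(context["exception"])

def build_del_ins_operator(python_callable, on_failure_callback=None):
    """Hypothetical stand-in for the custom Python wrapper operator."""
    def run():
        try:
            python_callable()
        except Exception as exc:
            if on_failure_callback:
                # Airflow passes a context dict to failure callbacks.
                on_failure_callback({"exception": exc})
            raise
    return run

def failing_task():
    raise RuntimeError("delete/insert failed")

task = build_del_ins_operator(failing_task,
                              on_failure_callback=sentry_failure_callback)
try:
    task()
except RuntimeError:
    pass

print(len(captured))  # the callback fired once with the task's exception
```

The design point is the same as in the PR: wiring the callback only on the inner delete/insert operators misses failures raised at the wrapper level, so the wrapper itself must receive `on_failure_callback`.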