
Issue #563: Use weights in Permutation Feature Importance calculation #564

Merged · 7 commits · May 5, 2024

Conversation

danielarifmurphy
Contributor

@danielarifmurphy danielarifmurphy commented May 1, 2024

Issue #563

In this PR:

  • Use weights in loss_after_permutation()
  • Handle loss functions both with and without sample_weight parameter to maintain backwards compatibility with existing loss functions

Notes:

  • One approach would have been to extend all existing loss functions to accept weights, but we considered a single calculate_loss() helper that handles both cases more appropriate, since it also works with any user-defined custom loss function that does not take weights
  • The explicit check for a parameter named sample_weight follows the scikit-learn signature convention; any custom loss function that wants to use weights must name its parameter this way too
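To illustrate the idea behind the helper, here is a minimal sketch of dispatching on whether a loss function declares a sample_weight parameter, using inspect.signature. The function and loss names below are illustrative, not the actual dalex implementation:

```python
import inspect

import numpy as np


def calculate_loss(loss_function, observed, predicted, sample_weights=None):
    # Pass weights only if the loss function accepts a parameter named
    # 'sample_weight' (matching the scikit-learn convention); otherwise
    # fall back to the unweighted call for backwards compatibility.
    params = inspect.signature(loss_function).parameters
    if sample_weights is not None and "sample_weight" in params:
        return loss_function(observed, predicted, sample_weight=sample_weights)
    return loss_function(observed, predicted)


# Example losses: one weight-aware, one not.
def rmse(y_true, y_pred):
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))


def weighted_rmse(y_true, y_pred, sample_weight=None):
    return float(np.sqrt(np.average((y_true - y_pred) ** 2, weights=sample_weight)))


y_true = np.array([1.0, 2.0])
y_pred = np.array([1.0, 3.0])
weights = np.array([1.0, 0.0])

# weighted_rmse uses the weights; rmse silently ignores them.
loss_w = calculate_loss(weighted_rmse, y_true, y_pred, sample_weights=weights)
loss_u = calculate_loss(rmse, y_true, y_pred, sample_weights=weights)
```

With the weights above, the second observation's error is zeroed out for the weight-aware loss, while the plain loss is computed as if no weights were supplied.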

@danielarifmurphy danielarifmurphy marked this pull request as ready for review May 1, 2024 11:59
@hbaniecki
Member

Thanks!

@hbaniecki hbaniecki merged commit 9884571 into ModelOriented:master May 5, 2024
0 of 12 checks passed