
Fix size attribute error for precision/recall/f1 #656

Merged
2 commits merged into huggingface:main on Jan 10, 2025

Conversation

Maxwell-Jia
Contributor

What does this PR do?

Fix #655.
Fix AttributeError in precision/recall/f1 metrics when handling scalar outputs from scikit-learn 1.6.0.

Description

With the release of scikit-learn 1.6.0, some metric functions (e.g., precision_score, recall_score, f1_score) may return float values instead of numpy arrays for single-value results. The current implementation in evaluate assumes the presence of a size attribute for all outputs, which causes an AttributeError when handling scalar outputs.

This PR modifies the return statement to safely handle both numpy arrays and scalar outputs using getattr(score, 'size', 1), making the metrics compatible with both scikit-learn 1.6.0 and earlier versions.
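For context, a minimal reproduction sketch of the failure (assuming scikit-learn 1.6.0, where scalar metric results are plain Python floats rather than numpy scalars):

```python
# Minimal reproduction sketch, assuming scikit-learn 1.6.0.
from sklearn.metrics import precision_score

score = precision_score([0, 1, 1], [0, 1, 0])  # scalar result for binary averaging
print(type(score))  # <class 'float'> in 1.6.0, a numpy scalar in earlier releases
score.size          # AttributeError: 'float' object has no attribute 'size'
```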

Changes

Modified return statements in three metrics:

  • metrics/precision/precision.py
  • metrics/recall/recall.py
  • metrics/f1/f1.py

Changed from:

return {"metric_name": float(score) if score.size == 1 else score}

to:

return {"metric_name": score if getattr(score, 'size', 1) > 1 else float(score)}

@Maxwell-Jia
Contributor Author

Hi @albertvillanova! I noticed you've recently reviewed some PRs in this area. I hope it's okay to bring this to your attention - this PR addresses the scikit-learn 1.6.0 compatibility issue (from #655) that's affecting several users. Would really appreciate your insights when you have a moment. Thank you for your time!

@SiddharthSingi

SiddharthSingi commented Jan 9, 2025

Someone please approve this PR; it's causing all basic metrics to fail. Even the basic examples given on the evaluate metric pages are not working.


@SiddharthSingi left a comment


This checks the type of the output and fixes the error caused by a float not having a size attribute.

@tomaarsen tomaarsen merged commit 5aa3982 into huggingface:main Jan 10, 2025
4 of 6 checks passed
@tomaarsen
Member

Thanks for raising this! I'm afraid evaluate is no longer actively being maintained, but this is such a small fix that I figured I'd merge it.
I believe the commit should over time automatically be propagated to the Hugging Face spaces, after which the problem should be resolved fully.

  • Tom Aarsen

@Maxwell-Jia Maxwell-Jia deleted the fix-metric-size-attribute branch January 13, 2025 02:41
Development

Successfully merging this pull request may close these issues.

Precision and recall metrics doesn't work with scikit-learn==1.6.0
4 participants