
New @metric decorator #505

Open · ecomodeller wants to merge 5 commits into main
Conversation

ecomodeller (Member)

No description provided.

@ecomodeller ecomodeller marked this pull request as ready for review February 27, 2025 06:47
@ecomodeller ecomodeller changed the title Move large/small to metrics New @metric decorator Feb 27, 2025


def metric(best: str | None = None, has_units: bool = False):
"""Decorator to attach a 'best' attribute to metric functions."""
Member (review comment):

Update the docstring to cover all three pieces of functionality.

@jsmariegaard (Member)

I wonder if we could use the same decorator concept to handle aliasing in metrics 🤔 Right now it is confusing that we have e.g. mae and mean_absolute_error, which are the same thing. In a list of available metrics they should not appear as two separate entries, and in practice one would never use mean_absolute_error, as it would be clumsy in a table. Maybe we could instead have a long_name attribute on mae, or a display_name on the long one, or something similar.

I guess it would be better to postpone this to a future PR though...

@ecomodeller (Member, Author) commented Feb 27, 2025

TODO:

  • Move the handling of bias and lin_slope from skill.py to metrics.py:
    _one_is_best_metrics = ["lin_slope"]
    _zero_is_best_metrics = ["bias"]
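One way the decorator could absorb those two hard-coded lists is to let `best` take values like `"zero"` and `"one"`. The value names and both metric implementations below are assumptions for illustration, not the PR's code:

```python
import numpy as np


def metric(best=None, has_units=False):
    """Attach 'best' metadata so per-metric lists in skill.py become unnecessary."""

    def decorator(func):
        func.best = best  # hypothetical values: "small", "large", "zero", "one"
        func.has_units = has_units
        return func

    return decorator


@metric(best="zero")
def bias(obs, model):
    """Mean error; a perfect model has bias == 0."""
    return float(np.mean(np.asarray(model) - np.asarray(obs)))


@metric(best="one")
def lin_slope(obs, model):
    """Slope of the least-squares fit; a perfect model has slope == 1."""
    slope, _intercept = np.polyfit(np.asarray(obs), np.asarray(model), 1)
    return float(slope)
```

Code that previously checked membership in `_zero_is_best_metrics` or `_one_is_best_metrics` could instead read `func.best`, keeping the information next to each metric's definition.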
