
Log the validation loss when using CLI while training an omnipose model #86

Open
lacan opened this issue Apr 8, 2024 · 1 comment
lacan commented Apr 8, 2024

Looking through the log and the documentation at https://omnipose.readthedocs.io/training.html, I do not see how I can access the validation loss when training a new Omnipose model. I need to be able to judge my model's performance by tracking how both the training loss and the validation loss change per epoch.

Could this please be added to the verbose output of the log when training an Omnipose model using the CLI?

Otherwise, how do I access it via Python?

This is somewhat linked to #16
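Regarding the Python route, here is a rough sketch, assuming Omnipose keeps the cellpose-style training API inherited through cellpose_omni (io.load_train_test_data, and CellposeModel.train accepting test_data/test_labels). The argument names, the example model_type, the paths, and the logging behavior are assumptions rather than confirmed behavior of the Omnipose source:

```python
# Rough sketch only: assumes the cellpose-style training API that
# cellpose_omni (Omnipose) inherits; argument names are assumptions.
import logging

from cellpose_omni import io, models

# Print INFO-level messages so any per-epoch loss lines emitted by the
# library's logger become visible.
logging.basicConfig(level=logging.INFO)

# Hypothetical directories; mask_filter follows the "_masks" convention.
data = io.load_train_test_data("path/to/train", "path/to/test",
                               mask_filter="_masks")
train_images, train_labels, _, test_images, test_labels, _ = data

# Example pretrained starting point; any omni model name would do here.
model = models.CellposeModel(gpu=True, model_type="bact_phase_omni")

# Passing test_data/test_labels (the Python analogue of --test_dir on the
# CLI) is what should trigger a per-epoch validation evaluation; the loss
# would then be reported through the logger configured above.
model.train(train_images, train_labels,
            test_data=test_images, test_labels=test_labels,
            n_epochs=100, save_path="path/to/train")
```

If the trainer already computes a test loss internally, raising the log level as above may be enough to surface it; otherwise it would require the change requested in this issue.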

kevinjohncutler (Owner) commented

I agree that the validation loss, if computed, should be added to the logs.
