
choice of optimizer #2

Open
andreashlarsen opened this issue Sep 25, 2024 · 0 comments
Labels: enhancement (New feature or request), question (Further information is requested)

Comments

@andreashlarsen (Owner)
I thank reviewer 3 for raising this issue (rephrased by AHL):
scipy.optimize.curve_fit is used to carry out most of the minimisations.

Why isn't a more general minimiser used, e.g. scipy.optimize.minimize? Furthermore, why not use a global minimiser (e.g. scipy.optimize.differential_evolution), which is more powerful in real-life contexts because it is less likely to get stuck in local minima?

These general minimisers would let you minimise the negative log-likelihood or negative log-posterior, instead of restricting you to a least-squares setting where all the log-priors must be recast as a sum of squares. Indeed, it is simpler to add the log-probabilities of all the prior distributions, and it does not restrict you to a Gaussian prior: you could use whatever probability distribution you like.
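As a minimal sketch of what the reviewer suggests (the model, data, and Laplace prior below are hypothetical, not taken from the repository): minimising a negative log-posterior with scipy.optimize.differential_evolution, where the log-prior is simply added to the log-likelihood and need not be Gaussian.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical example: fit y = a*x + b to noisy data by minimising the
# negative log-posterior rather than a sum of squares.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
sigma = 0.5
y = 2.0 * x + 1.0 + rng.normal(0, sigma, x.size)

def log_prior(theta):
    a, b = theta
    # Any distribution works here: a flat prior on a and an
    # (unnormalised) Laplace prior on b, i.e. a non-Gaussian prior.
    if not (0.0 < a < 10.0):
        return -np.inf
    return -abs(b - 1.0)

def neg_log_posterior(theta):
    a, b = theta
    residuals = (y - (a * x + b)) / sigma
    log_likelihood = -0.5 * np.sum(residuals**2)
    # log-posterior = log-likelihood + log-prior (up to a constant)
    return -(log_likelihood + log_prior(theta))

# A global optimiser is less likely to get trapped in local minima.
result = differential_evolution(neg_log_posterior,
                                bounds=[(0.0, 10.0), (-5.0, 5.0)],
                                seed=1)
a_fit, b_fit = result.x
print(a_fit, b_fit)
```

Swapping in scipy.optimize.minimize (a local optimiser) only requires replacing the last call with `minimize(neg_log_posterior, x0=[1.0, 0.0])`; the objective function stays the same.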

@andreashlarsen added the enhancement (New feature or request) and question (Further information is requested) labels on Sep 25, 2024