-
Hello, I've run into some unexpected behavior with copying (or pickling/unpickling) lmfit Parameters:

import lmfit
import numpy as np

x = np.linspace(0, 1)
y = 2 * x + 1

model = lmfit.models.LinearModel()
model.set_param_hint("total", max=4)
model.set_param_hint("slope", expr="total - intercept")
guess = model.make_params(total=1.5)
fit = model.fit(y, x=x, params=guess)

print("Equal to copy:", fit.params == fit.params.copy())
print("Before:", fit.params.valuesdict())
print("After:", fit.params.copy().valuesdict())
print("total - intercept:", fit.params["total"].value - fit.params["intercept"].value)
# In case it's helpful:
print(fit.fit_report())

Output (lmfit=1.2.2):
This does not seem to happen on version 1.2.1. Of course, it would be possible to check for approximate equality of the parameters one-by-one instead. Thanks in advance for any thoughts you may have.
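For reference, a minimal sketch of the kind of one-by-one approximate check I mean; the helper name params_approx_equal and the use of numpy.isclose are just illustrative choices here, not anything provided by lmfit itself:

import numpy as np

def params_approx_equal(p1, p2, rtol=1e-9, atol=0.0):
    # Compare two lmfit Parameters objects value-by-value with a tolerance,
    # instead of the strict equality used by ==.
    v1, v2 = p1.valuesdict(), p2.valuesdict()
    if v1.keys() != v2.keys():
        return False
    return all(np.isclose(v1[name], v2[name], rtol=rtol, atol=atol) for name in v1)

# Usage, reusing the fit from the snippet above:
# params_approx_equal(fit.params, fit.params.copy(), rtol=1e-6)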
-
@TMacioce Hm, yeah, that's kind of weird. I'm not entirely sure what's causing that. It looks to me like the first values -- those in ...
-
@TMacioce There is a PR (#907) that will fix this. Basically, the function added in 1.2.2 to create uncertainty ufloats for each parameter was not correctly restoring values to the best-fit value, leaving the constrained expressions slightly out-of-sync with the variable parameters.
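To see the out-of-sync state described above, here is a small sketch that reuses fit and np from the snippet in the question; it only illustrates the reported mismatch, not the actual fix in the PR:

# Compare the stored value of the constrained parameter with the value implied
# by its expression, evaluated from the current variable parameter values.
stored = fit.params["slope"].value
implied = fit.params["total"].value - fit.params["intercept"].value
print("stored slope     :", stored)
print("total - intercept:", implied)
print("in sync:", np.isclose(stored, implied, rtol=0, atol=1e-12))

# A small mismatch here would explain the copy behavior in the question:
# if copying re-evaluates the constraint expression, the copied "slope"
# ends up at the implied value rather than the stored one.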