The generate_training_data method of LatinHypercubeSampler requires as input a forward model that has been added to the problem definition and has an experiment associated with it. This results in an InverseProblem with two forward models: one physical (e.g. FE) model and one surrogate. To avoid evaluating the FE model during inference, we specify a likelihood model only for the surrogate model. This results in the following warning:
| WARNING | The globally defined experiment 'TestSeriesFemModel' does not appear in any of the likelihood models!
This warning makes it unclear whether this is the intended way of using the LatinHypercubeSampler. You can find an example based on a modified version of test_surrogate_model.py below:
import numpy as np

from probeye.definition.inverse_problem import InverseProblem
from probeye.definition.forward_model import ForwardModelBase
from probeye.definition.surrogate_model import SurrogateModelBase
from probeye.definition.sensor import Sensor
from probeye.definition.likelihood_model import GaussianLikelihoodModel
from probeye.surrogate.initial_sampling import LatinHypercubeSampler
from probeye.inference.emcee.solver import EmceeSolver

# ============================================================================ #
#                              Set numeric values                             #
# ============================================================================ #

n_walkers = 20
n_steps = 200
n_init_steps = 100
N_train = 100

# 'true' value of m, and its normal prior parameters
m_true = 2.5
mean_m = 2.0
std_m = 1.0

# 'true' value of b, and its normal prior parameters
b_true = 1.7
mean_b = 1.0
std_b = 1.0

# 'true' value of additive error sd, and its uniform prior parameters
sigma = 0.5
low_sigma = 0.0
high_sigma = 0.8

# the number of generated experiment_names and seed for random numbers
n_tests = 50
seed = 1

# ============================================================================ #
#                           Define the Forward Model                          #
# ============================================================================ #

class ExpensiveModel(ForwardModelBase):
    def interface(self):
        self.parameters = ["m", "b"]
        self.input_sensors = Sensor("x")
        self.output_sensors = Sensor("y", std_model="sigma")

    def response(self, inp: dict) -> dict:
        x = inp["x"]
        m = inp["m"]
        b = inp["b"]
        return {"y": m * x + b}

# ============================================================================ #
#                          Define the Surrogate Model                         #
# ============================================================================ #

class SurrogateModel(ExpensiveModel, SurrogateModelBase):
    """
    The inheritance from ExpensiveModel 'copies' the interface-method from
    ExpensiveModel (the surrogate model should have the same interface as the
    forward model). The inheritance from SurrogateModelBase is required to
    assign a forward model to the surrogate model, see surrogate_model.py.
    """

    def response(self, inp: dict) -> dict:
        x = inp["x"]
        m = inp["m"]
        b = inp["b"]
        return {"y": m * x + b}

# ============================================================================ #
#                         Define the Inference Problem                        #
# ============================================================================ #

# initialize the inverse problem with a useful name
problem = InverseProblem("Using a surrogate model")

# add all parameters to the problem
problem.add_parameter(
    "m",
    "model",
    tex="$m$",
    info="Slope of the graph",
    prior=("normal", {"mean": mean_m, "std": std_m}),
)
problem.add_parameter(
    "b",
    "model",
    info="Intersection of graph with y-axis",
    tex="$b$",
    prior=("normal", {"mean": mean_b, "std": std_b}),
)
problem.add_parameter(
    "sigma",
    "likelihood",
    domain="(0, +oo)",
    tex=r"$\sigma$",
    info="Standard deviation, of zero-mean additive model error",
    prior=("uniform", {"low": low_sigma, "high": high_sigma}),
)

# add the (expensive) forward model used to generate the training data
forward_model = ExpensiveModel("ExpensiveModel")
problem.add_forward_model(forward_model)

# ============================================================================ #
#                    Add test data to the Inference Problem                   #
# ============================================================================ #

# data-generation; normal likelihood with constant variance around each point
np.random.seed(seed)
x_test = np.linspace(0.0, 1.0, n_tests)
y_true = forward_model.response(
    {forward_model.input_sensor.name: x_test, "m": m_true, "b": b_true}
)[forward_model.output_sensor.name]
y_test = np.random.normal(loc=y_true, scale=sigma)

# add the experimental data
problem.add_experiment(
    "TestSeries_FE",
    fwd_model_name="ExpensiveModel",
    sensor_values={
        forward_model.input_sensor.name: x_test,
        forward_model.output_sensor.name: y_test,
    },
)

# generate training data for the surrogate model
sampler = LatinHypercubeSampler(problem=problem)
train_samples, train_data = sampler.generate_training_data(forward_model, N_train)

# add the surrogate model to the InverseProblem
surrogate_model = SurrogateModel("FastModel", forward_model=forward_model)
problem.add_forward_model(surrogate_model)

# add an experiment associated with the surrogate model
problem.add_experiment(
    "TestSeries_Surrogate",
    fwd_model_name="FastModel",
    sensor_values={
        forward_model.input_sensor.name: x_test,
        forward_model.output_sensor.name: y_test,
    },
)

# ============================================================================ #
#                           Add likelihood model(s)                           #
# ============================================================================ #

# add the likelihood model to the problem (only for the surrogate model)
problem.add_likelihood_model(
    GaussianLikelihoodModel(
        prms_def="sigma",
        experiment_name="TestSeries_Surrogate",
        model_error="additive",
        name="SimpleLikelihoodModel",
    )
)

# ============================================================================ #
#                                  Run solver                                 #
# ============================================================================ #

emcee_solver = EmceeSolver(
    problem,
    show_progress=True,
)
inference_data = emcee_solver.run_mcmc(
    n_walkers=n_walkers,
    n_steps=n_steps,
    n_initial_steps=n_init_steps,
    vectorize=False,
)
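For illustration only: the train_samples and train_data returned by generate_training_data are not used anywhere in the example above, since SurrogateModel.response simply re-implements the forward model. The plain-numpy sketch below shows where such training data could plug in when building an actual cheap surrogate. It assumes train_samples is an (N_train, 2) array of (m, b) samples and train_data an (N_train, n_tests) array of the corresponding "y" outputs at x_test; the actual return structure of generate_training_data may differ.

# Hypothetical sketch (plain numpy, not probeye API): fit a linear map from the
# sampled parameters to the model output and evaluate it instead of the
# expensive forward model.
# Assumption: train_samples has shape (N_train, 2) with columns (m, b), and
# train_data has shape (N_train, n_tests) with the "y" outputs at x_test.
X = np.column_stack([train_samples, np.ones(len(train_samples))])  # design matrix
coeffs, *_ = np.linalg.lstsq(X, train_data, rcond=None)            # least-squares fit

def fitted_surrogate_response(m, b):
    # evaluate the fitted surrogate for a given (m, b) sample
    return np.array([m, b, 1.0]) @ coeffs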