
Upgrades to forward modeling util #299

Merged
merged 2 commits into develop from forwardModel-improvements on Dec 6, 2024
Conversation

jfcrenshaw (Collaborator)
Just a few small upgrades to the forward modeling util that were useful for tests I was running.

@jfcrenshaw jfcrenshaw requested a review from jbkalmbach December 4, 2024 21:57
jbkalmbach (Member) left a comment

Looks good. Minor comments.

@@ -97,17 +103,37 @@ def forwardModelPair(
only specified for the intra or extrafocal donut, both donuts use
that angle. If neither is specified, the same random angle is used
for both. (the default is None)
miscenterIntra : tuple, optional
jbkalmbach (Member):

Why is it useful to have the default not be 0? Is it worth the bit of confusion it may cause, given that somebody may not expect that leaving this optional parameter unset will still produce a non-zero value?

jfcrenshaw (Collaborator, Author):


I added it because using perfectly centered donuts causes some asymmetry in errors for Zernikes that are azimuthal pairs. I assume it's just a pixel aliasing thing. Adding small, random dithers fixes that issue. This amount of miscentering doesn't cause problems for the TIE or Danish, and I don't think it's bad to assume that stamps aren't perfectly centered. I'd be fine to set the default to 0 though if you think that's preferable.

jbkalmbach (Member):


Since the end product is more robust, I think it's good to leave it. Maybe add a version of this explanation somewhere in the docstring or code so we understand it when we look at this later.
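For later readers, here is a hypothetical sketch of the behavior being discussed. The function name, the `max_shift` value, and the signature below are illustrative assumptions, not the actual `forwardModelPair` API: when no miscentering is supplied, a small random sub-pixel dither is drawn so the stamp is never perfectly centered, which avoids the pixel-aliasing asymmetry seen for azimuthal Zernike pairs.

```python
import numpy as np


def resolve_miscenter(miscenter, rng, max_shift=0.5):
    """Return an (x, y) pixel offset, drawing a small random dither when none is given.

    NOTE: illustrative only; the name and max_shift are assumptions,
    not taken from the actual forward modeling util.
    """
    if miscenter is None:
        # Small, random, non-zero dither by default (sub-pixel scale)
        return tuple(rng.uniform(-max_shift, max_shift, size=2))
    return tuple(miscenter)


rng = np.random.default_rng(0)
dx, dy = resolve_miscenter(None, rng)          # random default: stamp not perfectly centered
dx0, dy0 = resolve_miscenter((0.0, 0.0), rng)  # explicit zero: perfectly centered
```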

intraStamp.image += background
intraStamp.image = np.clip(intraStamp.image, 0, None)
intraStamp.image = rng.poisson(intraStamp.image).astype(float)
intraStamp.image += rng.normal(size=intraStamp.image.shape) * np.sqrt(
jbkalmbach (Member):


Just trying to understand: why this change?

jfcrenshaw (Collaborator, Author), Dec 5, 2024:


With the Poisson, numpy sometimes complains "lam value too large". It's supposedly an issue with the float type in the array. I tried casting to different float types but didn't get it working. I decided it wasn't worth spending more time on, so I just changed it to pseudo-Poissonian (a Gaussian approximation), which doesn't hit that error ¯\\_(ツ)_/¯
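For reference, a minimal sketch of what a pseudo-Poissonian (Gaussian-approximation) noise step can look like, assuming a numpy `Generator` named `rng` and a float image of expected counts. This is illustrative only, not the exact code in the util:

```python
import numpy as np

rng = np.random.default_rng(42)

# Expected counts per pixel; very large values can trigger numpy's
# "lam value too large" error in rng.poisson.
image = rng.uniform(1e3, 1e12, size=(160, 160))

# Gaussian approximation to Poisson noise: for large lam,
# Poisson(lam) ≈ Normal(mean=lam, std=sqrt(lam)), and the normal draw
# never hits the "lam value too large" limit.
noisy = image + rng.normal(size=image.shape) * np.sqrt(image)
noisy = np.clip(noisy, 0, None)  # avoid negative pixels from the Gaussian tail
```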

jfcrenshaw force-pushed the forwardModel-improvements branch from a1a9f43 to cd07fbd on December 6, 2024 01:54
jmeyers314 mentioned this pull request Dec 6, 2024
jfcrenshaw merged commit 29560ee into develop Dec 6, 2024
4 checks passed
jfcrenshaw deleted the forwardModel-improvements branch December 6, 2024 22:40