Replies: 9 comments 8 replies
-
Wait, are the first two images cleaned and the remaining images dirty? The dirty images don't look particularly over-subtracted to me (are they the correct images?). The large holes you are seeing are much clearer in the deconvolved images, which makes it very difficult to tell where they are coming from. Could you please include images of the unaveraged data as a point of comparison? Also, which imager are you using? Could you also post the model images, or point me to the FITS files? I would like to see the magnitude of the negative components near those sources. How good is the model for the DDE-affected sources? Apologies for the million questions.
-
1. Yes, the first two images are deep cleaned images, and the UV-cutouts were done for the dirty images only, to save time. They do not look as over-subtracted as the deep images (the deep images are about a magnitude more subtracted), but the holes are still there nevertheless, just not as deep as in the cleaned images. Is this not because they are only dirty images?
2. I can do the un-averaged ones now. I was using WSClean.
3. For the above cleaned image, the model image with the corresponding cleaned image for the regular average is here. Or you can access it over here.
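For reference, a UV-cutout of this kind just selects rows and channels by projected baseline length in wavelengths. A minimal numpy sketch (the function name and toy values are mine, not any imager's API):

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def uv_cut_mask(uvw, chan_freq, min_uv_wavelengths):
    """Return a (row, chan) boolean mask keeping baselines whose
    projected |uv| length, in wavelengths, exceeds the cut."""
    uv_dist_m = np.sqrt(uvw[:, 0] ** 2 + uvw[:, 1] ** 2)  # metres, per row
    # Length in wavelengths scales with channel frequency.
    uv_dist_wl = uv_dist_m[:, None] * (chan_freq[None, :] / C)
    return uv_dist_wl >= min_uv_wavelengths

# Toy example: a 100 m and a 500 m baseline, two channels.
uvw = np.array([[100.0, 0.0, 0.0], [500.0, 0.0, 0.0]])
chan_freq = np.array([1.0e9, 1.4e9])  # Hz
mask = uv_cut_mask(uvw, chan_freq, 750.0)
# The 100 m baseline falls below a 750-wavelength cut at both
# frequencies; the 500 m baseline survives at both.
```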
-
I used this:

```yaml
quartical_2:
  cab: quartical
  info: "Peel off axis source"
  params:
    input_ms.path: '{recipe.ms}'
    input_ms.is_bda: true
    input_ms.weight_column: WEIGHT_SPECTRUM
    input_ms.time_chunk: 0
    # input_ms.select_uv_range: [1400, 0]
    solver.terms: [K, de]
    K.type: phase
    K.direction_dependent: false
    K.freq_interval: '0'
    K.time_interval: '4'
    de.time_interval: '10'
    K.initial_estimate: false
    de.type: complex
    de.freq_interval: '64'
    solver.iter_recipe: [100, 50]
    input_model.recipe: MODEL_DATA~DIR1_DATA~DIR2_DATA~DIR3_DATA:DIR1_DATA:DIR2_DATA:DIR3_DATA
    output.overwrite: 'true'
    output.products: [corrected_data, corrected_residual]
    output.columns: [CORRECTED_DATA, CORRECTED_RESIDUAL]
    output.subtract_directions: [1, 2, 3]
```
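As I read the `input_model.recipe` above, direction 0 is `MODEL_DATA` with the three peeled-source columns subtracted out, and directions 1-3 are the per-source columns, which `output.subtract_directions` then removes from the data. A toy numpy sketch of those semantics (column values are made up, and I apply no gain corruption for simplicity, whereas the real residuals subtract the gain-corrupted per-direction models):

```python
import numpy as np

# Toy per-row visibilities standing in for each model column.
model_data = np.array([10.0 + 0j, 8.0 + 0j])
dir_cols = [np.array([3.0 + 0j, 2.0 + 0j]),   # DIR1_DATA
            np.array([2.0 + 0j, 1.0 + 0j]),   # DIR2_DATA
            np.array([1.0 + 0j, 1.0 + 0j])]   # DIR3_DATA

# Direction 0: everything left after removing the peeled sources,
# i.e. MODEL_DATA~DIR1_DATA~DIR2_DATA~DIR3_DATA.
directions = [model_data - sum(dir_cols)] + dir_cols

# subtract_directions: [1, 2, 3] -> remove those directions' models
# from the observed data to form the DD-subtracted residual.
data = np.array([11.0 + 0j, 9.0 + 0j])
residual = data - sum(directions[d] for d in (1, 2, 3))
```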
-
Just a note: I think we should also clarify here that K is not a delay "K"; it is just a phase-only gain, if I understand K.type correctly (the equivalent of "phase-diag" in CubiCal-speak). I can't immediately see which gain types are available from the quartical/config/helpstrings.yaml helpstrings. I am still a bit puzzled, but yes, it could be a global amplitude bias, quite correct. I think we can reasonably confidently do K + G(amp) + dE with the model we have.
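To illustrate the distinction being drawn here: a phase-only term has unit modulus, so it cannot absorb an amplitude bias, while a full complex term can. A toy sketch of the generic per-baseline gain corruption V' = g_p V g_q* (values are made up; this is not QuartiCal code):

```python
import numpy as np

def apply_gain(vis, g_p, g_q):
    """Corrupt a baseline (p, q) visibility: V' = g_p * V * conj(g_q)."""
    return g_p * vis * np.conj(g_q)

vis = 1.0 + 0j

# Phase-only term (like K.type: phase): unit-amplitude complex gain.
phase_gain = np.exp(1j * np.deg2rad(30.0))
v_phase = apply_gain(vis, phase_gain, phase_gain)
# Unit-modulus gains leave the visibility amplitude unchanged.
amp_preserved = np.isclose(abs(v_phase), abs(vis))

# Full complex term (like de.type: complex): can rescale the amplitude,
# e.g. soaking up a global amplitude bias.
complex_gain = 0.9 * np.exp(1j * np.deg2rad(10.0))
v_complex = apply_gain(vis, complex_gain, complex_gain)
# |V'| = 0.9 * 0.9 * |V| = 0.81 here.
```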
-
BDA is now consistent with regular averaging. Final DD-subtracted images:
-
Great work.
I think we can start working on the paper with this in hand.
…On Mon, 28 Nov 2022, 11:53, Oleg Smirnov wrote:
Yes! 🍾
-
Apologies for not replying immediately @Kincaidr! Your results look awesome - thanks for all the work you have done on this.
-
I am thinking of doing this for one more dataset; I already have it at hand. If it's too much trouble I can just leave it, and I will start the draft paper simultaneously regardless. What do you think of showcasing this for two fields?
-
Hi Robert
I think one field is enough to demonstrate (I don't want to unnecessarily draw out the paper). Remember that we have to recap the theory, cover the qualification and quantification testing of the compression implementation, discuss the calibration routine implementation details, and then demonstrate calibration on real data along with a reasonably detailed discussion of the pipeline.
Thoughts @sjperkins @JSKenyon?
--
Benjamin Hugo, PhD student, Centre for Radio Astronomy Techniques and Technologies, Rhodes University; junior software developer, Radio Astronomy Research Group, South African Radio Astronomy Observatory, Cape Town
-
We are trying to solve the over-subtraction problem we are getting with both BDA and regular averaging. To recap, the final K + de solve images looked like this (regular time/channel averaging left, BDA right):
We suspected that the shorter spacings are picking up a lot of the low-level unmodelled flux, biasing the solutions towards the sources and over-subtracting as a result. To check this, I redid the K + de solve with 750, 1000, and 1450 UV-cuts. Below I show these dirty images along with the no-UV-cut dirty image, ordered as no UV-cut (top left), 750 (top right), 1000 (bottom left), 1450 (bottom right), for the two problematic sources. I've also found the minimum for these two sources, to see whether over-subtraction results and whether there are any trends. I got:
Source 1 (top): no uvcut: -1.74; 750 uvcut: -1.90; 1000 uvcut: -1.58; 1400 uvcut: -1.75
Source 2 (bottom): no uvcut: -1.94; 750 uvcut: -1.55; 1000 uvcut: -1.75; 1400 uvcut: -1.88
It seems like increasing the UV-cut results in the same over-subtraction as no UV-cut. Is there any other way I can quantify this over-subtraction?
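One way to make the minimum measurement reproducible, plus a complementary statistic (summed negative flux in the same box, which captures the total "negative bowl" rather than a single pixel), sketched with numpy on a toy image. The positions, box size, and pixel values here are made up; in practice the image array would be read from the FITS file:

```python
import numpy as np

def local_minimum(image, y, x, half_box):
    """Minimum pixel value in a (2*half_box+1)^2 box around (y, x):
    a simple proxy for the depth of an over-subtraction hole."""
    box = image[max(0, y - half_box): y + half_box + 1,
                max(0, x - half_box): x + half_box + 1]
    return float(box.min())

def summed_negative(image, y, x, half_box):
    """Sum of all negative pixel values in the same box: a crude
    'negative flux' measure, complementary to the single-pixel minimum."""
    box = image[max(0, y - half_box): y + half_box + 1,
                max(0, x - half_box): x + half_box + 1]
    return float(box[box < 0].sum())

# Toy "dirty image" with a single negative pixel at the source position.
img = np.zeros((100, 100))
img[50, 50] = -1.74  # hypothetical hole depth
hole_depth = local_minimum(img, 50, 50, 10)
neg_flux = summed_negative(img, 50, 50, 10)
```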