
Sector integration using AzimuthalIntegrator.integrate1d doesn't work #2246

Open

y-hhung opened this issue Aug 7, 2024 · 1 comment
y-hhung commented Aug 7, 2024

Dear all,

I have been using the azimuthal integrator module on some GIWAXS data, where I am interested in the integrated intensity within a limited azimuthal range rather than the full range. However, the output is identical no matter which azimuthal range I use, which suggests the sector selection is not actually being applied.

Below are the functions I use. I would like to be able to integrate the data properly for certain values of chi_pos and chi_width.

If anyone has any advice this would be much appreciated!

Thanks,

Esthy

import glob
import os

import fabio
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from matplotlib.ticker import ScalarFormatter
import pyFAI.azimuthalIntegrator
import pyFAI.detectors
import pygix


def pyfai_1D(df, filename, qbins, chi_pos=0, chi_width=180, diagnostics=False, mask=False, maskfile='', rotate=False):
    filedf = df.loc[[filename]]

    if diagnostics:
        print(filedf)

    detector = pyFAI.detectors.Detector(
        filedf.pixel_size[0],  # pixel dimension x in m
        filedf.pixel_size[0],  # pixel dimension y in m
        max_shape=(filedf.shape_x[0], filedf.shape_y[0])  # array size in pixels
    )
    
    ai = pyFAI.azimuthalIntegrator.AzimuthalIntegrator(
        poni1=(filedf.shape_y[0] - filedf.beam_center_y[0]) * filedf.pixel_size[0],  # factoring in rotation
        poni2=(filedf.shape_x[0] - filedf.beam_center_x[0]) * filedf.pixel_size[0],  # factoring in rotation
        detector=detector,
        rot2=np.deg2rad(filedf.twothet[0]),
        wavelength=filedf.wavelength[0],
        dist=filedf.distance[0]
    )
           
    pg = pygix.Transform()
    pg.load(ai)
    pg.incident_angle = filedf.incident_angle[0]

    if mask:
        pg.maskfile = maskfile

    if diagnostics:
        print(ai)

    pg.sample_orientation = 3
    img = np.rot90(np.rot90(fabio.open(filedf.fileloc[0]).data))

    qmin = 0  # minimum Q # 0.01 to 2.5
    qmax = 2  # maximum Q

    q, intensity = ai.integrate1d(
        img,
        npt=qbins,
        unit="q_A^-1",
        radial_range=(qmin, qmax),
        azimuth_range=(chi_pos - chi_width / 2, chi_pos + chi_width / 2),
        correctSolidAngle=True,
    )

    return intensity, q
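
As a quick sanity check of what azimuth_range is expected to do, here is a minimal standalone sketch (hypothetical detector geometry and a synthetic asymmetric image, not the real GIWAXS setup): two different sectors should produce visibly different curves if the selection is being applied.

# sanity check: integrate two different sectors of an asymmetric synthetic image
import numpy as np
import pyFAI.azimuthalIntegrator
import pyFAI.detectors

det = pyFAI.detectors.Detector(100e-6, 100e-6, max_shape=(1000, 1000))  # hypothetical 1000x1000 detector, 100 um pixels
ai_test = pyFAI.azimuthalIntegrator.AzimuthalIntegrator(
    dist=0.2, poni1=0.05, poni2=0.05, detector=det, wavelength=1e-10  # beam centred on the detector
)

# intensity increases from left to right, so different sectors see different signal
test_img = np.tile(np.linspace(1.0, 100.0, 1000), (1000, 1))

q_a, i_a = ai_test.integrate1d(test_img, 500, unit="q_A^-1", azimuth_range=(-10, 10))
q_b, i_b = ai_test.integrate1d(test_img, 500, unit="q_A^-1", azimuth_range=(80, 100))
print(np.allclose(i_a, i_b))  # expected: False if the sector selection is applied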

def mass_convert_1D_combine_pyfai(filedir, label='', chi_pos=0, chi_width=180, qbins=800, vmin=0, vmax=100, ymax=2, sqrt=False, combining=False, plotting=False):
    dtrek_files = sorted(glob.glob(os.path.join(filedir, '*.img')))

    # build the metadata table: take the columns from the first file, then add one row per file
    vals = dtrek_vals(dtrek_files[0])
    df = pd.DataFrame.from_dict(vals)
    df = df.drop(0)

    for dtrek_file in dtrek_files:
        vals = dtrek_vals(dtrek_file)
        df = pd.concat([df, pd.DataFrame.from_dict(vals)])
    df = df.set_index('filename')
    df_1D = pd.DataFrame()
    if combining:
        intensities = []

    if not combining:
        for dtrek_file in dtrek_files:
            filename = os.path.split(dtrek_file)[1]
            print(f'Processing: {filename}\r', end="")
            intensity, q = pyfai_1D(df, filename, qbins, diagnostics=False, mask=False, maskfile='', rotate=False)

            df_1D["Q_" + filename] = q  # /10 to angstrom if not in function
            df_1D["I_" + filename] = intensity
            plt.plot(q, intensity)
            plt.xlabel('q (A$^{-1}$)')
            plt.xlim(0.1, 2.5)
            plt.xticks(np.arange(0, 2.5, step=0.25))
            plt.ylabel('Intensity')
            plt.title(f'1D Integration for {filename}')
            plt.show()

        df_1D.to_csv(os.path.join(filedir, '1Dintegrations.csv'), index=False)

    else:
        for dtrek_file in dtrek_files:
            filename = os.path.split(dtrek_file)[1]
            print(f'Processing: {filename}\r', end="")
            intensity, q = pyfai_1D(df, filename, qbins, diagnostics=False, mask=False, maskfile='', rotate=False)

            df_1D["Q"] = q  # /10 to angstrom if not in function
            intensities.append(intensity)
        GIarray = np.sum(intensities, axis=0)
        df_1D["I"] = GIarray

        fig, ax = plt.subplots(figsize=(10, 6))
        ax.plot(q, GIarray)
        # ax.set_xscale('log')
        ax.xaxis.set_minor_formatter(ScalarFormatter())
        ax.xaxis.set_major_formatter(ScalarFormatter())
        plt.xlim(0.1, 2)
        # plt.xticks(np.arange(0, 2.5, step=0.25))
        plt.xlabel('q (A$^{-1}$)')
        plt.ylabel('Intensity')
        plt.title(f'1D Integration for {filename}')
        plt.show()

        df_1D.to_csv(os.path.join(filedir, label + '1Dintegrations_combined.csv'), index=False)
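
One detail worth flagging in the code above: in both branches, pyfai_1D is called without chi_pos and chi_width, so the function always falls back to its defaults (chi_pos=0, chi_width=180, i.e. azimuth_range=(-90, 90)) regardless of what is passed to mass_convert_1D_combine_pyfai. If the sector parameters are meant to flow through this wrapper, the call would need to forward them, along these lines (a sketch):

# forward the sector parameters so the azimuth_range inside pyfai_1D actually changes
intensity, q = pyfai_1D(df, filename, qbins, chi_pos=chi_pos, chi_width=chi_width,
                        diagnostics=False, mask=False, maskfile='', rotate=False)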
kif (Member) commented Aug 19, 2024

Hi Esthy,
pygix has not received many updates in recent years, so Edgar ported most of its features directly into pyFAI.
Besides this, I see no obvious issue in your code.
Could you please print your "azimuth_range" and validate that it is correct?
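
For that check, a small sketch using the variables already defined inside pyfai_1D (same names as in the function above):

# print the sector actually handed to integrate1d, in degrees
azimuth_range = (chi_pos - chi_width / 2, chi_pos + chi_width / 2)
print("azimuth_range (deg):", azimuth_range)
q, intensity = ai.integrate1d(img, npt=qbins, unit="q_A^-1",
                              radial_range=(qmin, qmax),
                              azimuth_range=azimuth_range,
                              correctSolidAngle=True)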
