Timestamp:
Jul 17, 2018 1:58:12 PM (4 years ago)
Author:
srkline
Message:

two significant changes:

1- in the averaging, before the data is finally written out, duplicate q-values (matching within 0.1%) are averaged together so that duplicate q-values do not appear in the data file. Duplicate q-values have a very bad effect on the calculation of the smearing matrix (they produce dq=0).
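The de-duplication step can be sketched as follows. This is an illustrative Python version of the algorithm described above (merge q-values that agree within a 0.1% relative tolerance on a sorted list), not the Igor code in `V_RemoveDuplicateQvals`; the function name and signature here are hypothetical.

```python
def average_duplicate_q(q, iq, tol=1e-3):
    """Merge points whose q-values agree within a relative tolerance.

    q must be sorted ascending (as it is after the concatenate/sort step).
    Returns new (q, iq) lists where each run of near-duplicate q-values is
    replaced by the average of its members. tol=1e-3 is the 0.1% window
    from the change description.
    """
    out_q, out_i = [], []
    group_q, group_i = [q[0]], [iq[0]]
    for qv, iv in zip(q[1:], iq[1:]):
        if abs(qv - group_q[0]) <= tol * group_q[0]:
            # still within 0.1% of the group's first q-value: accumulate
            group_q.append(qv)
            group_i.append(iv)
        else:
            # close out the current group as a single averaged point
            out_q.append(sum(group_q) / len(group_q))
            out_i.append(sum(group_i) / len(group_i))
            group_q, group_i = [qv], [iv]
    out_q.append(sum(group_q) / len(group_q))
    out_i.append(sum(group_i) / len(group_i))
    return out_q, out_i
```

Because each surviving point has a distinct q, the resolution entries written alongside it no longer collapse to dq=0.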

2-for the data from the back (high res) detector, two steps have been added to the processing at the point where the data is converted to WORK. First, a constant read noise value of 200 cts/pixel is subtracted (found from the average of multiple runs with the beam off). Second, a 3x3 median filter is applied to the whole image to eliminate the stray bright pixels. Some are saturated, some are simply outliers. Very effective.
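The two back-detector steps can be sketched as below. This is a plain-Python illustration of the described processing (constant read-noise subtraction followed by a 3x3 median filter), not the Igor implementation in `V_Protocol_Reduction.ipf`; the function name and the edge handling (shrunken window at the borders) are assumptions.

```python
READ_NOISE = 200  # cts/pixel, from the average of beam-off runs (per the log message)

def clean_back_detector(img):
    """Subtract a constant read noise, then apply a 3x3 median filter.

    img is a list of rows (lists) of pixel counts. Border pixels take the
    median of the window pixels that fall inside the image, so the output
    has the same shape as the input.
    """
    nr, nc = len(img), len(img[0])
    # Step 1: constant read-noise subtraction
    sub = [[v - READ_NOISE for v in row] for row in img]
    # Step 2: 3x3 median filter to suppress stray bright pixels
    out = [[0] * nc for _ in range(nr)]
    for r in range(nr):
        for c in range(nc):
            window = [sub[rr][cc]
                      for rr in range(max(0, r - 1), min(nr, r + 2))
                      for cc in range(max(0, c - 1), min(nc, c + 2))]
            window.sort()
            out[r][c] = window[len(window) // 2]
    return out
```

A single hot pixel (saturated or outlier) never dominates any 3x3 window's median, so it is replaced by a value consistent with its neighbors, which is why the filter is effective here.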

File:
1 edited

  • sans/Dev/trunk/NCNR_User_Procedures/Reduction/VSANS/V_Protocol_Reduction.ipf

    r1109 → r1111

    3208  3208        V_ConcatenateForSave("root:Packages:NIST:VSANS:",activeType,"",binType)    // this removes q=0 point, concatenates, sorts
    3209  3209
          3210    +   V_RemoveDuplicateQvals("root:Packages:NIST:VSANS:",activeType)    // works with the "tmp_x" waves from concatenateForSave
    3210  3211    //  prot[9] = collimationStr
    3211  3212