Opened 9 years ago
Last modified 9 years ago
#398 new task
Processing of very large files
Reported by: | srkline | Owned by: | srkline
---|---|---|---
Priority: | major | Milestone: | TISANE
Component: | SANS Reduction | Keywords: |
Cc: | | Blocking: |
Task: | |
Description
Still need to define the problem. Sit down with Paul, maybe Jeff Lynn, and ?? to find out what the question really is, before looking for an answer.
Look into a quick way of decimating the data 10x, maybe 100x, so that the whole time set can be viewed at once: both in 2D, to see where the detector image changes, and in the count rate (counts per bin), to see where the intensity changes. The full set can then be processed accordingly.
Do this in a way that requires a minimum of changes to the XOP -- decimation is built into Igor (see the Resample operation).
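As a rough illustration of the decimation step (not the Igor/XOP code itself -- Igor's Resample operation would play this role in practice), the idea is just to collapse groups of 10 or 100 time bins into one coarser bin so the full time series fits in a single view. A minimal sketch in Python, with a hypothetical helper name `decimate_counts`:

```python
import numpy as np

def decimate_counts(counts, factor=10):
    """Reduce a 1-D array of per-bin counts by an integer factor,
    summing each group of `factor` consecutive bins into one
    coarser bin. Trailing bins that do not fill a complete group
    are dropped. Summing (rather than averaging) keeps the result
    in counts, so Poisson statistics still apply to each coarse bin.
    """
    counts = np.asarray(counts)
    n = (len(counts) // factor) * factor   # largest multiple of factor
    return counts[:n].reshape(-1, factor).sum(axis=1)

# Example: 1000 time bins decimated 10x -> 100 coarse bins
raw = np.ones(1000)
coarse = decimate_counts(raw, factor=10)
print(coarse.shape)  # (100,)
print(coarse[0])     # 10.0
```

The same reshape-and-reduce pattern extends to the 2D detector frames (sum groups of frames along the time axis) to get a quickly viewable overview before committing to a full reduction.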