Opened 14 years ago
Closed 13 years ago
#176 closed feature (fixed)
Determine sensitivity of fits to parameters
Reported by: | ajj | Owned by: | ajj
---|---|---|---
Priority: | major | Milestone: | Analysis Wish List
Component: | Analysis | Keywords: |
Cc: | | Blocking: |
Task: | | |
Description
Frank Heinrich has been doing this with reflectometry data:
Randomly move the data points within their errors and re-fit. Repeat some number of times to determine the true sensitivity of the fit to each parameter.
Change History (1)
comment:1 Changed 13 years ago by srkline
- Resolution set to fixed
- Status changed from new to closed
I'll write this down so I can give the same answer every time. "Bootstrapping," or resampling of the data within its known errors to generate a "new" data set and then refitting (repeated N times), will generate a set of answers. For a well-behaved function and normally distributed errors in the data, this gives you the errors on each parameter. Alternatively, you can fit once with Levenberg-Marquardt and estimate the errors using the standard formulas, and you'll get the same answer as resampling. This works for us (SANS) since our models converge to a unique solution (given an appropriate model).

Reflectometry is a different case. The data errors are still Poisson statistics, but the SLD profile that comes out of the fit is not necessarily a unique solution; rather, it is one of a family of solutions. In this case, normal error propagation from "the" global minimum solution doesn't apply. To get a real error estimate on the SLD profile, resampling and refitting is necessary.
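A minimal sketch of the resampling scheme described above, assuming a simple linear model fit by closed-form least squares (the function and helper names here are illustrative, not part of any SANS or reflectometry package): each data point is perturbed by a Gaussian draw of width equal to its error bar, the model is refit, and the spread of the refit parameters over N trials gives the parameter uncertainties.

```python
import random
import statistics

def linear_fit(xs, ys):
    """Closed-form unweighted least-squares fit of y = a*x + b."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def resample_and_refit(xs, ys, dys, n_trials=1000, seed=0):
    """Perturb each point within its error, refit, repeat n_trials times.

    Returns (mean, stdev) for each fit parameter; the stdev is the
    resampling estimate of that parameter's uncertainty.
    """
    rng = random.Random(seed)
    fits = []
    for _ in range(n_trials):
        # Move each y_i by a Gaussian draw of width dy_i, then refit.
        ys_new = [y + rng.gauss(0.0, dy) for y, dy in zip(ys, dys)]
        fits.append(linear_fit(xs, ys_new))
    a_vals = [f[0] for f in fits]
    b_vals = [f[1] for f in fits]
    return (statistics.mean(a_vals), statistics.stdev(a_vals),
            statistics.mean(b_vals), statistics.stdev(b_vals))

# Synthetic example: y = 2x + 1 with a constant error bar of 0.1.
xs = [0.1 * i for i in range(20)]
ys = [2.0 * x + 1.0 for x in xs]
dys = [0.1] * len(xs)
a_mean, a_err, b_mean, b_err = resample_and_refit(xs, ys, dys)
```

For a well-behaved model like this one, `a_err` and `b_err` agree with the standard analytic error formulas, which is the point made above: resampling only earns its cost when the fit landscape has a family of solutions, as in reflectometry.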