== Discussion Group A2 ==
=== The Future of Data Analysis: Error Propagation and Simultaneous Fitting across Techniques ===

From canSAS

== Discussion Leaders ==
* Rex Hjelm

== Presentations ==
* Introduction: [[media:Abstract Hjelm.pdf | Information Content, Error Propagation and Systematics]] - <b>Rex Hjelm</b>

== Discussion Notes ==
Discussion session on the information content of SAS and combining length scales.
The introduction to this session raised three issues:
1. What is the information content of a SAS measurement?
2. What are the channel resolution and information content of single-event data acquisition?
3. How can SAS be combined with other measurements as part of multi-length-scale investigations?
 
Discussion
1. The information content of a SAS measurement can be encoded as a figure of merit (FOM).
An example for SAS is <VI/σQ^2> ln(Qmax/Qmin), where VI and σQ^2 are the Q-channel intensity uncertainty (expressed as a variance in the error propagation from the data reduction) and the Q-distribution variance, respectively, with the average taken over the number of independent Q-channels, nQ. The term ln(Qmax/Qmin) encodes the bandwidth of the instrument in Q-space and is proportional to nQ in the reduced-data domain, assuming the instrument resolution in Q scales as δQ ~ Q. In this case the number of independent bins is nQ = ln(Qmax/Qmin)/f, where f = δQ/Q is assumed constant. The FOM drives instrument design, the choice of SAS instrument configuration, and data-reduction schemes.
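As a concrete illustration, the channel count and FOM above can be evaluated numerically. This is a minimal sketch: the instrument parameters and per-channel variances below are invented values, not figures from the discussion.

```python
import numpy as np

# Hypothetical instrument parameters (illustrative only)
Q_min, Q_max = 0.001, 0.5      # Q-range, inverse Angstroms
f = 0.05                       # fractional resolution f = dQ/Q, assumed constant

# Number of independent Q-bins when dQ ~ Q: n_Q = ln(Q_max/Q_min) / f
n_Q = int(np.log(Q_max / Q_min) / f)

# Stand-in per-channel variances for reduced data (made-up values)
V_I = np.full(n_Q, 1e-4)       # intensity variance per Q-channel
sigma_Q2 = np.full(n_Q, 1e-6)  # Q-distribution variance per Q-channel

# FOM = <V_I / sigma_Q^2> ln(Q_max/Q_min), averaged over the n_Q channels
fom = np.mean(V_I / sigma_Q2) * np.log(Q_max / Q_min)
print(n_Q, fom)
```

Widening the bandwidth (larger Qmax/Qmin) or tightening the resolution (smaller f) raises nQ, which is how the FOM drives configuration choices.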
In data reduction, do we calculate the uncertainty, in terms of the variance VI, correctly when we assume the central limit theorem and propagate the estimated uncertainties as RMS values that add in quadrature from Poisson statistics for each Q-channel of the normalized foreground and background intensities? In terms of the information content of each Q-channel, theory suggests that this is not the correct approach, and the correct approach may in some cases change the measurement protocol. Are we correctly assessing the Q-distribution in each Q-channel, and thus do we understand the binning scheme for optimal information content? Probably not.
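The conventional quadrature propagation that the question above challenges can be sketched as follows; the counts and monitor values are invented for illustration.

```python
import numpy as np

# Raw per-Q-channel counts for foreground (sample) and background runs;
# Poisson statistics imply variance = counts.  Values are made up.
fg_counts = np.array([400.0, 250.0, 120.0])
bg_counts = np.array([100.0, 80.0, 60.0])
fg_mon = bg_mon = 1.0e6        # monitor counts used for normalization

# Normalized, background-subtracted intensity per channel
I = fg_counts / fg_mon - bg_counts / bg_mon

# Standard propagation: Poisson variances scaled by the normalization,
# added in quadrature for the subtraction
V_I = fg_counts / fg_mon**2 + bg_counts / bg_mon**2
sigma_I = np.sqrt(V_I)
```

This is the central-limit-theorem recipe the discussion questions: it treats each Q-channel as an independent Gaussian, which may not reflect the channel's true information content.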
 
2. The advent of single-event data acquisition in TOF techniques produces multi-dimensional histograms. The resolution and information content of the channels contributing to these histograms have not been addressed. This question clearly depends on the time structure and geometry of the instrument and the sample environment, as well as on the data-acquisition scheme. For an instrument at a pulsed source, part of the question is to determine how the time-wavelength moderator spectrum is sampled by a given chopper configuration. This relates back to the points in issue 1 and remains an open issue.
3. The uniqueness problem of structural solutions from SAS and the limited length-scale domain of SAS measurement techniques limit confidence in structural solutions, and in their usefulness in multi-scale problems, when SAS is used on its own. Combining SAS measurements with imaging techniques usually involves computing a squared Fourier transform of the image to extend the Q-domain; information is lost in this process. X-ray diffraction results are commonly used as a starting point for comparison with SAS data, again by computing the SAS intensity from the XRD coordinates. Methods are being developed to combine PDFs from powder diffraction and SAS data of nanoparticles to draw mutual inferences about the order-disorder pair-wise correlations of atoms in the particle.
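The squared-Fourier-transform step mentioned above can be sketched for a toy two-dimensional image. The "micrograph" here is synthetic, and the azimuthal average at the end is where the phase, and hence information, is discarded.

```python
import numpy as np

# Synthetic "micrograph": a few disk-shaped particles on a 2D grid
n = 128
y, x = np.mgrid[:n, :n]
img = np.zeros((n, n))
for cx, cy in [(32, 40), (80, 70), (60, 100)]:
    img[(x - cx)**2 + (y - cy)**2 < 8**2] = 1.0

# Squared modulus of the 2D Fourier transform -> SAS-like intensity map
F = np.fft.fftshift(np.fft.fft2(img))
I_q = np.abs(F)**2

# Azimuthal average to a 1D I(Q); the phase of F is discarded here,
# which is the information loss noted in the discussion
r = np.hypot(x - n // 2, y - n // 2).astype(int)
I_1d = np.bincount(r.ravel(), weights=I_q.ravel()) / np.bincount(r.ravel())
```

Many different particle arrangements produce the same I_1d, which is one face of the uniqueness problem described above.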
Although imaging data do not provide a large field of view of the ensemble, they could nonetheless serve as a good starting point for structure solutions from SAS using reverse Monte Carlo techniques. The final MC images resulting from this starting point could be compared with those obtained by starting from a random distribution. AFM images might serve well as a starting point for GISAS. Likewise, with parametric, non-linear search algorithms, parameters derived from other techniques could be used as a starting point, and the resulting parameters compared both with the starting point and with those derived from a neutral field of values. Such approaches might increase confidence in the structures or parameters resulting from SAS and also determine whether the field of view of the microscopy techniques accurately describes the average structure. Such approaches, which could include the results of NMR, for example, will be valuable in providing a self-consistent model in which there is a high degree of confidence.
Comparison of results from SAS, microscopy, and other techniques could also serve to uncover measurement artifacts.
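The comparison of starting points suggested above can be sketched in one dimension. This is a minimal illustration, not the actual code discussed: a binary occupancy profile is refined against a target intensity by single-site flips with a greedy acceptance rule (a full reverse Monte Carlo would use a Metropolis criterion), starting either from a random profile or from a noisy "image" of the true structure. All structures and parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64

# "True" structure: a periodic binary occupancy profile (stand-in for data)
true = (np.arange(n) % 8 < 3).astype(float)

def intensity(rho):
    """SAS-like 1D intensity: squared modulus of the Fourier transform."""
    return np.abs(np.fft.rfft(rho))**2

I_target = intensity(true)

def rmc(start, n_steps=4000):
    """Refine a profile against I_target by single-site flips (greedy)."""
    rho = start.copy()
    chi2 = np.sum((intensity(rho) - I_target)**2)
    for _ in range(n_steps):
        i = rng.integers(n)
        rho[i] = 1.0 - rho[i]                 # trial move: flip one site
        trial = np.sum((intensity(rho) - I_target)**2)
        if trial <= chi2:
            chi2 = trial                      # accept the improvement
        else:
            rho[i] = 1.0 - rho[i]             # reject: undo the flip
    return rho, chi2

# Start 1: random occupancy.  Start 2: the true structure with a few
# corrupted sites, mimicking a noisy but informative microscopy image.
random_start = rng.integers(0, 2, n).astype(float)
imaged_start = true.copy()
imaged_start[:4] = 1.0 - imaged_start[:4]

rho_r, chi2_random = rmc(random_start)
rho_i, chi2_imaged = rmc(imaged_start)
```

Comparing the two refined profiles and residuals mirrors the comparison proposed in the discussion: because I(Q) discards phase, the random start can converge to a different structure with a similar intensity, while the image-derived start biases the search toward the imaged configuration.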
 
Outcome:
Alex Hexemer will work with Rex Hjelm, using Alex's MC code and test data from Rex (including microscopy images), to study the effect of MC modeling of the SAS data starting from a random distribution of material versus a distribution imposed by microscopy images.

Latest revision as of 23:50, 26 May 2015
