What is the difference between the GUM and Monte Carlo uncertainty analysis methods?
Both methods require the identification and evaluation of error distributions for the measurement process being analyzed. Consequently, appropriate error distributions must be applied to achieve realistic results from either analysis method. The main difference between the GUM and Monte Carlo methods is the way that the uncertainty in the combined error is computed. The Monte Carlo method employs repeated computation of random or pseudo-random numbers to simulate and combine deviates for each error distribution; the combined uncertainty is the computed standard deviation of the combined error distribution. The GUM method employs mathematical and statistical functions to compute and combine the variances of each error distribution; the combined uncertainty is computed by taking the square root of the combined variance. A Taylor series approximation is employed for analyzing multivariate measurements. Both the GUM and Monte Carlo methods can accommodate correlated errors.
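The contrast above can be sketched numerically. The example below is a minimal illustration, not from the source: it assumes a simple hypothetical measurement model y = x1 + x2 with two independent, normally distributed error sources, and the uncertainty values are made up for demonstration. The Monte Carlo estimate is the standard deviation of simulated combined deviates; the GUM estimate is the square root of the combined variances.

```python
import numpy as np

# Hypothetical measurement model: y = x1 + x2, with two independent,
# normally distributed error sources (illustrative uncertainty values).
rng = np.random.default_rng(0)
n = 200_000            # number of Monte Carlo trials
u1, u2 = 0.3, 0.4      # standard uncertainties of the two error sources

# Monte Carlo: simulate deviates for each error distribution, combine
# them through the measurement equation, and take the standard
# deviation of the resulting combined error distribution.
e1 = rng.normal(0.0, u1, n)
e2 = rng.normal(0.0, u2, n)
u_mc = np.std(e1 + e2)

# GUM: combine the variances analytically (the sensitivity coefficients
# are both 1 for y = x1 + x2) and take the square root.
u_gum = np.sqrt(u1**2 + u2**2)

print(f"Monte Carlo: {u_mc:.3f}, GUM: {u_gum:.3f}")
```

For this simple additive model the two estimates agree closely (both near 0.5); the methods diverge more noticeably for strongly nonlinear measurement equations, where the GUM's Taylor series approximation can break down.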