Error in Numerical Models Fitted to Data

presented at

DSRC/DARPA Study

Numerical Simulation of Physical Systems:
The State of the Art, and Opportunities for Further Advances

Kick-Off Meeting

19-20 January 1999

P.B. Stark
Department of Statistics
University of California, Berkeley

 


Outline

  1. Some sources of uncertainty

  2. Lessons I've learned

  3. Tools to measure uncertainty

  4. Recommendations


Some Sources of Uncertainty

Data uncertainties
Model (physical) uncertainty
Approximation of the solution
Approximate calculation/estimation

 


My (Limited) Experience

I've worked on problems in cosmology, demography, earthquake prediction, electroencephalography, fishery management, geochemistry, geomagnetism, gravimetry, helioseismology, IC mask manufacture, seismic imaging, spectrum estimation, tomography, and water treatment.

In my experience, the data errors are the most difficult to assess, but the approximation of the solution is the source of error most often ignored completely.

In my experience, coding errors can evade years of testing and debugging, even in fairly straightforward problems.

In my experience, very little can be inferred from data without a priori constraints---knowledge of the subject matter is crucial.

In my experience, what scientists instinctively try to estimate, and what they can estimate, are quite different.

In my experience, scientists often think they need to estimate an entire function (e.g., a distributed parameter of a PDE) to answer the scientific question.

Unfortunately, in every problem I've seen, the bias in function estimates is unbounded unless there are extremely restrictive a priori constraints on the function. 

I have never seen a priori constraints sufficiently stringent to reduce the bias in estimating a function to practical levels.

In my experience, many or most scientific questions can be reduced to inference about scalars, rather than inference about functions---but it is rarely done.

I have seen examples in cosmology, geomagnetism, geochemistry, gravimetry, helioseismology, and seismic tomography, in which physically motivated constraints (together with the data) suffice to make inferences about interesting functionals of the model, properly accounting for discretization error and systematic data error.

Scientists and statisticians need each other to hone the scientific questions, to uncover useful constraints, to devise useful, practical experiments, and to understand the limitations of the data.

My greatest feelings of success: geochemistry at SDR and helioseismology with the GONG project.

 

Some Caveats

Assuming that the world is low-dimensional does not make it so.

False heuristics:

  1. The world is linear, and data have Gaussian errors. (Even if that were true, data processing tends to corrupt things.)

  2. Experimenters know the size of the random observational errors in their experiments.

  3. Experimenters know the size of the systematic errors in their experiments. (There are countless examples of incommensurable experiments.)

  4. If the data collection / forward model has limited resolution, the solution needs only limited resolution.

  5. If you don't know something, set it to zero and proceed (the ostrich axiom).

  6. If there is something you would like to estimate, it is estimable.

  7. Ignorance is equivalent to a flat (uniform) prior probability distribution.

 

More things that mislead:

  1. Reproducing results using the same or similar data and qualitatively similar algorithms (example in seismic tomography of the core-mantle boundary)

  2. Taking the curvature of the objective functional at the putative optimizer as a measure of the overall uncertainty (examples in cosmology and helioseismology; a toy illustration follows this list)

  3. Reproducing a model from synthetic data, when the synthetic data derive from the same parametrization used in the estimator (examples everywhere)

  4. Reflexive use of standard statistical tests, or use of inappropriate tests (examples in the census, seismic tomography, earthquake prediction, ... )

  5. "Formal" errors (examples everywhere)

  6. Using likelihoods as posterior probabilities.
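
A toy illustration of items 2 and 5 (a hedged sketch; the rank-deficient operator and the functional are invented for the purpose): when the data do not constrain some direction in model space, the curvature-based ("formal") variance of a functional along that direction can be essentially zero, even though models that differ wildly in that functional fit the data identically.

import numpy as np

rng = np.random.default_rng(0)
n_data, n_model = 10, 4
B = rng.normal(size=(n_data, n_model - 1))
A = np.hstack([B, B[:, [0]]])                   # columns 0 and 3 identical: rank-deficient
sigma = 0.1
g = np.array([1.0, 0.0, 0.0, -1.0])             # functional g'm; g lies in the null space of A

m_a = np.array([1.0, 2.0, 3.0, 1.0])
m_b = m_a + 5.0 * g                             # fits the data exactly as well as m_a
assert np.allclose(A @ m_a, A @ m_b)

d = A @ m_a + sigma * rng.normal(size=n_data)

# Least-squares estimate, and the "formal" variance of g'm from the curvature
# (Gauss-Newton Hessian) of the objective, via a pseudo-inverse since the
# Hessian is singular.
m_hat, *_ = np.linalg.lstsq(A, d, rcond=None)
var_formal = sigma**2 * g @ np.linalg.pinv(A.T @ A) @ g

print("estimate of g'm:", g @ m_hat)
print("formal variance of the estimate:", var_formal)   # essentially zero
print("g'm for two models with identical fits:", g @ m_a, g @ m_b)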

 


Tools to Measure Uncertainty

Optimization, optimization, optimization.

Easiest approach: falsification of hypotheses using finite-dimensional optimization.
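
A minimal sketch of that idea in Python, on an invented toy linear problem (the forward operator A, the noise level sigma, the nonnegativity constraint, and the hypothesized value of the average are illustrative assumptions, not taken from any real application): minimize the data misfit over all models that satisfy the a priori constraints and the hypothesis; if even the best such model misfits the data by more than a quantile calibrated to the noise level, the hypothesis is falsified at (conservatively) that significance level.

import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
n_data, n_model = 20, 5
A = rng.normal(size=(n_data, n_model))          # toy forward operator
m_true = np.array([1.0, 0.5, 0.0, 2.0, 1.5])    # toy "true" model (nonnegative)
sigma = 0.1                                     # assumed standard error of each datum
d = A @ m_true + sigma * rng.normal(size=n_data)

def chi2_misfit(m):
    """Normalized sum of squared residuals."""
    r = (A @ m - d) / sigma
    return r @ r

# A priori constraint: the model is nonnegative.
# Hypothesis to falsify: the average of the model parameters is at least 3.
best = optimize.minimize(
    chi2_misfit,
    x0=np.full(n_model, 3.0),
    method="SLSQP",
    bounds=[(0.0, None)] * n_model,
    constraints=[{"type": "ineq", "fun": lambda m: np.mean(m) - 3.0}],
)

# If even the best-fitting model allowed by the hypothesis and the constraints
# misfits the data by more than the chi-square quantile, reject the hypothesis.
threshold = stats.chi2.ppf(0.95, df=n_data)     # conservative calibration
print("minimum misfit:", best.fun)
print("threshold:", threshold)
print("hypothesis falsified at the 5% level:", best.fun > threshold)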

Math tools to probe infinite-dimensional problems:

A general-purpose approach gives conservative bounds.

Some statistically optimal procedures for finding confidence intervals are in the class of estimators this approach yields.

The misfit measure can be tailored for the geometry of the observation functionals, the systematic errors, the constraints, and the functional to be estimated.

The l2 measure of misfit (as in least squares) is often not optimal, and it can even be statistically inconsistent (see the sketch below).

This approach can be quite demanding computationally.
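
To make the approach concrete, here is a hedged sketch in Python on the same kind of invented toy problem (the operator, noise level, constraint, and functional are illustrative assumptions): to bound a linear functional g'm, minimize and maximize g'm over all models that are nonnegative and fit every datum to within kappa standard errors, with kappa chosen so that the true model lies in the feasible set with probability at least 95%. With that l-infinity-type misfit, both optimizations are linear programs.

import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
n_data, n_model = 20, 5
A = rng.normal(size=(n_data, n_model))          # toy forward operator
m_true = np.array([1.0, 0.5, 0.0, 2.0, 1.5])    # toy "true" model (nonnegative)
sigma = 0.1
d = A @ m_true + sigma * rng.normal(size=n_data)

g = np.ones(n_model) / n_model                  # functional of interest: the mean of m

# Calibrate kappa so that, with probability at least 95%, every residual of the
# true model is within kappa standard errors (Bonferroni), making the bounds conservative.
kappa = stats.norm.ppf(1.0 - 0.05 / (2 * n_data))

# Feasible set: |(A m - d)_i| <= kappa*sigma for every datum, and m >= 0.
A_ub = np.vstack([A, -A])
b_ub = np.concatenate([d + kappa * sigma, kappa * sigma - d])

lower = optimize.linprog(g, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
upper = optimize.linprog(-g, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))

print("conservative 95% bounds on g'm:", lower.fun, -upper.fun)
print("true value of g'm:", g @ m_true)

Swapping in an l1 or weighted misfit changes only the form of the optimization; that is the sense in which the misfit measure can be tailored to the problem, and why l2 is a choice rather than a necessity.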


Recommendations

 

P.B. Stark. 21 January 1999