A solution of the completion problem of symmetric positive definite matrices.

Symmetric positive definite matrices play an important role in statistics. Typical examples are covariance and correlation matrices, which capture the second-order statistics between random variables. In modelling studies these matrices are usually estimated from available data. Unfortunately, the data are not always sufficient to estimate all coefficients of the matrix. Statistical modelling requires a complete matrix, so sensible estimates must be supplied for the undefined coefficients. Entropy is a well-accepted measure of the information content of a probability density function (pdf). The values assigned to the undefined coefficients can be chosen such that the information content of the corresponding multi-normal distribution is minimal. In this way a pdf is obtained that honours the available data without imposing unnecessary extra constraints. Computing the missing coefficients amounts to solving a constrained cross-entropy minimisation problem. This optimisation problem is easily derived from first principles and leads to a system of equations for the Lagrange multipliers, which is solved with a Newton iteration. A generalisation is possible in which a complete covariance matrix is computed from second-order statistics between linear combinations of random variables. The method has, for instance, been used to construct a prior pdf for reservoir models and to compute a cross variogram between two correlated, laterally continuous random variables.
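As a hypothetical illustration of the idea (not the authors' implementation), consider the smallest non-trivial case: a 3x3 correlation matrix with one unknown off-diagonal coefficient. Minimising the information content of the multi-normal pdf amounts to maximising log det of the matrix over the free coefficient, and at the optimum the corresponding entry of the inverse matrix vanishes. The sketch below enforces that stationarity condition with a Newton iteration; all function and variable names are invented for the example.

```python
import numpy as np

def maxent_completion_3x3(r12, r23, tol=1e-10, max_iter=50):
    """Fill in the unknown (1,3) coefficient x of the correlation matrix
        [[1,  r12, x ],
         [r12, 1,  r23],
         [x,  r23, 1 ]]
    so that the entropy of the corresponding multi-normal pdf is maximal,
    i.e. log det S(x) is maximal. At the optimum (S^{-1})[0, 2] = 0."""
    x = 0.0  # start from an uncorrelated (interior, positive definite) guess
    for _ in range(max_iter):
        S = np.array([[1.0, r12, x],
                      [r12, 1.0, r23],
                      [x,   r23, 1.0]])
        P = np.linalg.inv(S)
        if abs(P[0, 2]) < tol:          # stationarity condition reached
            break
        g = 2.0 * P[0, 2]               # d/dx  log det S
        h = -2.0 * (P[0, 2]**2 + P[0, 0] * P[2, 2])  # d2/dx2 log det S (< 0)
        step = -g / h                   # Newton step on the concave objective
        # Damp the step if it would leave the positive definite region.
        while True:
            S_try = S.copy()
            S_try[0, 2] = S_try[2, 0] = x + step
            if np.all(np.linalg.eigvalsh(S_try) > 0.0):
                break
            step *= 0.5
        x += step
    return x
```

For this particular sparsity pattern the maximum-determinant completion has the closed form r13 = r12 * r23, which provides a convenient check on the iteration. The full method described in the abstract solves the analogous Newton system simultaneously for all Lagrange multipliers rather than for a single coefficient.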

