
Summary

Numerous modelling tasks require the solution of ill-posed inverse problems in which we seek a distribution of earth models that match observed data, such as reflected acoustic waveforms or produced hydrocarbon volumes. We present a deep learning framework that creates stochastic samples of posterior property distributions for ill-posed inverse problems using a gradient-based approach. The spatial distribution of petrophysical properties is created by a deep generative model and controlled by a set of latent variables. A generative adversarial network (GAN) is used to represent a prior distribution of geological models based on a training set of object-based models. We then minimize the mismatch between observed ground-truth data and numerical forward models of the generator output by first computing gradients of the objective function with respect to grid-block properties and then using neural-network backpropagation to obtain gradients with respect to the latent variables. Synthetic test cases of acoustic waveform inversion and reservoir history matching are presented. In seismic inversion, we use a Metropolis-adjusted Langevin algorithm (MALA) to obtain posterior samples. For both synthetic cases, we show that deep generative models such as GANs can be combined with numerical forward models in an end-to-end framework to obtain stochastic solutions to geophysical inverse problems.
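The gradient chain described above (forward model → gradient with respect to grid-block properties → backpropagation through the generator → gradient with respect to the latent variables) and the MALA sampler can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the GAN generator and the physics simulator are replaced by small linear maps `W` and `F`, and the latent dimension, grid size, data size, and noise level `sigma` are arbitrary assumptions chosen so the example runs quickly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: in the paper the generator is a deep GAN and the
# forward model a physics simulator; here both are linear maps so the sketch runs.
n_latent, n_grid, n_data = 4, 16, 8
W = rng.standard_normal((n_grid, n_latent))   # "generator": latent z -> properties m
F = rng.standard_normal((n_data, n_grid))     # "forward model": m -> simulated data d
sigma = 0.5                                   # assumed data-noise standard deviation

z_true = rng.standard_normal(n_latent)
d_obs = F @ (W @ z_true) + sigma * rng.standard_normal(n_data)

def log_post(z):
    # Unnormalised log-posterior: Gaussian likelihood + standard-normal latent prior.
    r = F @ (W @ z) - d_obs
    return -0.5 * (r @ r) / sigma**2 - 0.5 * (z @ z)

def grad_log_post(z):
    # The paper's chain rule: gradient w.r.t. grid-block properties first,
    # then "backpropagation" through the generator (here simply W.T).
    r = F @ (W @ z) - d_obs
    dL_dm = F.T @ r / sigma**2          # gradient w.r.t. properties m
    return -(W.T @ dL_dm) - z           # gradient w.r.t. latent variables z

def mala(z0, eps=0.01, n_steps=2000):
    # Metropolis-adjusted Langevin algorithm in the latent space.
    z, n_acc, samples = z0.copy(), 0, []
    lp, g = log_post(z0), grad_log_post(z0)
    for _ in range(n_steps):
        prop = z + 0.5 * eps**2 * g + eps * rng.standard_normal(z.size)
        lp_p, g_p = log_post(prop), grad_log_post(prop)
        # Forward/backward Langevin proposal terms for the Metropolis correction.
        fwd = prop - z - 0.5 * eps**2 * g
        bwd = z - prop - 0.5 * eps**2 * g_p
        log_alpha = lp_p - lp + (fwd @ fwd - bwd @ bwd) / (2 * eps**2)
        if np.log(rng.random()) < log_alpha:
            z, lp, g, n_acc = prop, lp_p, g_p, n_acc + 1
        samples.append(z.copy())
    return np.array(samples), n_acc / n_steps

# Start near the mode (least-squares estimate) so the short chain mixes quickly.
z0, *_ = np.linalg.lstsq(F @ W, d_obs, rcond=None)
samples, acc_rate = mala(z0)
```

With a deep generator the `W.T @ dL_dm` step becomes an ordinary backward pass through the network, and `dL_dm` would come from an adjoint/gradient computation of the simulator; the step size `eps` must then be tuned to keep the Metropolis acceptance rate reasonable.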

DOI: 10.3997/2214-4609.201901609
2019-06-03
2020-07-05
