Abstract
Normalizing flows are a powerful tool for generative modelling, density estimation, and posterior reconstruction in Bayesian inverse problems. In this paper, we introduce proximal residual flows, a new normalizing flow architecture. Based on the fact that proximal neural networks are by definition averaged operators, we ensure the invertibility of certain residual blocks. Moreover, we extend the architecture to conditional proximal residual flows for posterior reconstruction within Bayesian inverse problems. We demonstrate the performance of proximal residual flows on numerical examples.
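The invertibility claim in the abstract can be made concrete with a small numerical illustration. The following is a minimal NumPy sketch, not the authors' implementation: it assumes a residual block of the form x ↦ x + γΦ(x), where Φ is a t-averaged operator (here built from soft thresholding, a proximity operator, standing in for a proximal neural network [22]), and inverts the block by a Banach fixed-point iteration in the spirit of invertible residual networks [5, 8]. All names (soft_threshold, averaged_phi, forward, inverse) and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Sketch of an invertible residual block x -> x + gamma * Phi(x), where
# Phi = (1 - t) * Id + t * R is t-averaged with R nonexpansive. Rewriting
#   x + gamma * Phi(x) = (1 + gamma * (1 - t)) * x + gamma * t * R(x),
# the block is invertible by a Banach fixed-point iteration whenever
#   gamma * t / (1 + gamma * (1 - t)) < 1,  i.e.  gamma * (2t - 1) < 1.

def soft_threshold(x, lam=0.1):
    # Proximity operator of lam * ||.||_1; nonexpansive, plays the role of R.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def averaged_phi(x, t=0.5, R=soft_threshold):
    # t-averaged operator Phi = (1 - t) * Id + t * R.
    return (1.0 - t) * x + t * R(x)

def forward(x, gamma=1.5, t=0.5):
    # Residual block; note gamma > 1 is admissible here.
    return x + gamma * averaged_phi(x, t)

def inverse(y, gamma=1.5, t=0.5, R=soft_threshold, iters=100):
    # Solve y = x + gamma * Phi(x) via the contraction
    #   x_{k+1} = (y - gamma * t * R(x_k)) / (1 + gamma * (1 - t)).
    x = y.copy()
    for _ in range(iters):
        x = (y - gamma * t * R(x)) / (1.0 + gamma * (1.0 - t))
    return x

x = np.random.randn(5)
y = forward(x)
x_rec = inverse(y)
print(np.max(np.abs(x - x_rec)))  # ~0 up to fixed-point tolerance
```

Note that the contraction factor γt/(1 + γ(1 − t)) stays below 1 here even though γ = 1.5, which illustrates how averagedness can relax the Lip(Φ) < 1 requirement of plain invertible residual networks [5].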
Notes
- For the data generation and evaluation of this example, we use the code of [44], available at https://github.com/noegroup/stochastic_normalizing_flows.
References
Altekrüger, F., Hertrich, J.: WPPNets and WPPFlows: The power of Wasserstein patch priors for superresolution. arXiv preprint arXiv:2201.08157 (2022)
Arbel, M., Matthews, A., Doucet, A.: Annealed flow transport Monte Carlo. In: International Conference on Machine Learning, pp. 318–330. PMLR (2021)
Ardizzone, L., Kruse, J., Rother, C., Köthe, U.: Analyzing inverse problems with invertible neural networks. In: International Conference on Learning Representations (2018)
Ardizzone, L., Lüth, C., Kruse, J., Rother, C., Köthe, U.: Guided image generation with conditional invertible neural networks. arXiv preprint arXiv:1907.02392 (2019)
Behrmann, J., Grathwohl, W., Chen, R.T., Duvenaud, D., Jacobsen, J.H.: Invertible residual networks. In: International Conference on Machine Learning, pp. 573–582 (2019)
Bolte, J., Sabach, S., Teboulle, M.: Proximal alternating linearized minimization for nonconvex and nonsmooth problems. Math. Program. 146(1), 459–494 (2014)
Boyd, S., Parikh, N., Chu, E., Peleato, B., Eckstein, J.: Distributed optimization and statistical learning via the alternating direction method of multipliers. Found. Trends Mach. Learn. 3(1), 1–122 (2011)
Chen, R.T.Q., Behrmann, J., Duvenaud, D.K., Jacobsen, J.H.: Residual flows for invertible generative modeling. In: Advances in Neural Information Processing Systems, vol. 32. Curran Associates, Inc. (2019)
Chen, R.T., Rubanova, Y., Bettencourt, J., Duvenaud, D.K.: Neural ordinary differential equations. In: Advances in Neural Information Processing Systems, vol. 31 (2018)
Combettes, P.L., Pesquet, J.C.: Proximal splitting methods in signal processing. In: Bauschke, H., Burachik, R., Combettes, P., Elser, V., Luke, D., Wolkowicz, H. (eds.) Fixed-point algorithms for inverse problems in science and engineering. Springer Optimization and Its Applications, vol. 49, pp. 185–212. Springer, New York (2011). https://doi.org/10.1007/978-1-4419-9569-8_10
Combettes, P.L., Pesquet, J.C.: Deep neural network structures solving variational inequalities. Set-Valued Variational Anal. 28(3), 491–518 (2020)
Denker, A., Schmidt, M., Leuschner, J., Maass, P.: Conditional invertible neural networks for medical imaging. J. Imaging 7(11), 243 (2021)
Dinh, L., Krueger, D., Bengio, Y.: NICE: non-linear independent components estimation. In: Bengio, Y., LeCun, Y. (eds.) 3rd International Conference on Learning Representations, Workshop Track Proceedings (2015)
Dinh, L., Sohl-Dickstein, J., Bengio, S.: Density estimation using real NVP. In: International Conference on Learning Representations (2017)
Durkan, C., Bekasov, A., Murray, I., Papamakarios, G.: Neural spline flows. In: Advances in Neural Information Processing Systems (2019)
Glowinski, R., Osher, S.J., Yin, W.: Splitting Methods in Communication, Imaging, Science, and Engineering. Springer, Cham (2017)
Gouk, H., Frank, E., Pfahringer, B., Cree, M.J.: Regularisation of neural networks by enforcing Lipschitz continuity. Mach. Learn. 110(2), 393–416 (2021)
Grathwohl, W., Chen, R.T., Bettencourt, J., Sutskever, I., Duvenaud, D.: FFJORD: free-form continuous dynamics for scalable reversible generative models. In: International Conference on Learning Representations (2018)
Hagemann, P., Hertrich, J., Steidl, G.: Generalized normalizing flows via Markov Chains. arXiv preprint arXiv:2111.12506 (2021)
Hagemann, P., Hertrich, J., Steidl, G.: Stochastic normalizing flows for inverse problems: a Markov Chains viewpoint. SIAM/ASA J. Uncertainty Quantification 10(3), 1162–1190 (2022)
Hagemann, P., Neumayer, S.: Stabilizing invertible neural networks using mixture models. Inverse Prob. 37(8), 085002 (2021)
Hasannasab, M., Hertrich, J., Neumayer, S., Plonka, G., Setzer, S., Steidl, G.: Parseval proximal neural networks. J. Fourier Anal. Appl. 26, 59 (2020)
He, K., Zhang, X., Ren, S., Sun, J.: Deep residual learning for image recognition. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 770–778 (2016)
Heidenreich, S., Gross, H., Bär, M.: Bayesian approach to the statistical inverse problem of scatterometry: comparison of three surrogate models. Int. J. Uncertainty Quantification 5(6) (2015)
Heidenreich, S., Gross, H., Bär, M.: Bayesian approach to determine critical dimensions from scatterometric measurements. Metrologia 55(6), S201 (2018)
Hertrich, J., Neumayer, S., Steidl, G.: Convolutional proximal neural networks and plug-and-play algorithms. Linear Algebra Appl. 631, 203–234 (2021)
Hertrich, J., Steidl, G.: Inertial stochastic PALM and applications in machine learning. Sampling Theory Signal Process. Data Anal. 20(1), 4 (2022)
Higham, N.J.: Functions of Matrices: Theory and Computation. SIAM, Philadelphia (2008)
Horn, R.A., Johnson, C.R.: Matrix Analysis. Oxford University Press, Oxford (2013)
Huang, C.W., Chen, R.T., Tsirigotis, C., Courville, A.: Convex potential flows: universal probability distributions with optimal transport and convex optimization. In: International Conference on Learning Representations (2020)
Huang, C.W., Krueger, D., Lacoste, A., Courville, A.: Neural autoregressive flows. In: International Conference on Machine Learning, pp. 2078–2087 (2018)
Jaini, P., Kobyzev, I., Yu, Y., Brubaker, M.: Tails of Lipschitz triangular flows. In: International Conference on Machine Learning, pp. 4673–4681. PMLR (2020)
Kingma, D.P., Dhariwal, P.: Glow: generative flow with invertible 1×1 convolutions. In: Advances in Neural Information Processing Systems, vol. 31 (2018)
Mirza, M., Osindero, S.: Conditional generative adversarial nets. arXiv preprint arXiv:1411.1784 (2014)
Miyato, T., Kataoka, T., Koyama, M., Yoshida, Y.: Spectral normalization for generative adversarial networks. In: International Conference on Learning Representations (2018)
Noé, F., Olsson, S., Köhler, J., Wu, H.: Boltzmann generators: sampling equilibrium states of many-body systems with deep learning. Science 365(6457), 1147 (2019)
Papamakarios, G., Pavlakou, T., Murray, I.: Masked autoregressive flow for density estimation. In: Advances in Neural Information Processing Systems, pp. 2338–2347 (2017)
Pesquet, J.C., Repetti, A., Terris, M., Wiaux, Y.: Learning maximally monotone operators for image recovery. SIAM J. Imaging Sci. 14(3), 1206–1237 (2021)
Pock, T., Sabach, S.: Inertial proximal alternating linearized minimization (iPALM) for nonconvex and nonsmooth problems. SIAM J. Imaging Sci. 9(4), 1756–1787 (2016)
Rezende, D., Mohamed, S.: Variational inference with normalizing flows. In: International Conference on Machine Learning, pp. 1530–1538. PMLR (2015)
Salmona, A., De Bortoli, V., Delon, J., Desolneux, A.: Can push-forward generative models fit multimodal distributions? In: Advances in Neural Information Processing Systems (2022)
Sedghi, H., Gupta, V., Long, P.M.: The singular values of convolutional layers. In: International Conference on Learning Representations (2018)
Sohn, K., Lee, H., Yan, X.: Learning structured output representation using deep conditional generative models. In: Advances in Neural Information Processing Systems, vol. 28 (2015)
Wu, H., Köhler, J., Noé, F.: Stochastic normalizing flows. Adv. Neural Inf. Process. Syst. 33, 5933–5944 (2020)
Acknowledgements
Funding by the German Research Foundation (DFG) within the project STE 571/16-1 is gratefully acknowledged.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Hertrich, J. (2023). Proximal Residual Flows for Bayesian Inverse Problems. In: Calatroni, L., Donatelli, M., Morigi, S., Prato, M., Santacesaria, M. (eds) Scale Space and Variational Methods in Computer Vision. SSVM 2023. Lecture Notes in Computer Science, vol 14009. Springer, Cham. https://doi.org/10.1007/978-3-031-31975-4_16
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-31974-7
Online ISBN: 978-3-031-31975-4
eBook Packages: Computer Science, Computer Science (R0)