Implicit Probabilistic Integrators in NeurIPS

The paper “Implicit probabilistic integrators for ODEs” by Onur Teymur, Han Cheng Lie, Ben Calderhead, and myself has now appeared in Advances in Neural Information Processing Systems 31 (NeurIPS 2018). This paper forms part of an expanding body of work providing mathematical convergence analysis of probabilistic solvers for initial value problems, in this case implicit methods such as probabilistic versions of the multistep Adams–Moulton methods.

O. Teymur, H. C. Lie, T. J. Sullivan, and B. Calderhead. “Implicit probabilistic integrators for ODEs” in Advances in Neural Information Processing Systems 31 (NeurIPS 2018), ed. S. Bengio, H. Wallach, H. Larochelle, K. Grauman, N. Cesa-Bianchi, and R. Garnett. 7244–7253, 2018. http://papers.nips.cc/paper/7955-implicit-probabilistic-integrators-for-odes

Abstract. We introduce a family of implicit probabilistic integrators for initial value problems (IVPs), taking as a starting point the multistep Adams–Moulton method. The implicit construction allows for dynamic feedback from the forthcoming time-step, in contrast to previous probabilistic integrators, all of which are based on explicit methods. We begin with a concise survey of the rapidly-expanding field of probabilistic ODE solvers. We then introduce our method, which builds on and adapts the work of Conrad et al. (2016) and Teymur et al. (2016), and provide a rigorous proof of its well-definedness and convergence. We discuss the problem of the calibration of such integrators and suggest one approach. We give an illustrative example highlighting the effect of the use of probabilistic integrators — including our new method — in the setting of parameter inference within an inverse problem.
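To give a flavour of the general idea (and only that), here is a minimal Python sketch of a randomised implicit integrator in the additive-noise spirit of Conrad et al. (2016). It is not the construction of the paper: it uses implicit Euler, the simplest member of the Adams–Moulton family, rather than the general multistep scheme, and the function names, the fixed-point solver, and the noise scale `sigma` are illustrative assumptions of the sketch.

```python
# Illustrative sketch only -- not the paper's construction.
# A randomised implicit Euler solver: each deterministic implicit step is
# perturbed by Gaussian noise whose standard deviation scales like h**1.5,
# i.e. variance ~ h**(2p+1) with p = 1, matching the scaling used by
# Conrad et al. (2016) for a first-order base method.

import numpy as np


def implicit_euler_step(f, t, y, h, n_fixed_point=50, tol=1e-12):
    """Solve y_next = y + h * f(t + h, y_next) by fixed-point iteration."""
    y_next = y + h * f(t, y)  # explicit Euler predictor as initial guess
    for _ in range(n_fixed_point):
        y_new = y + h * f(t + h, y_next)
        if np.max(np.abs(y_new - y_next)) < tol:
            return y_new
        y_next = y_new
    return y_next


def randomised_implicit_euler(f, y0, t0, t1, h, sigma=0.1, rng=None):
    """Return one sample path (ts, ys) of the noise-perturbed solver."""
    rng = np.random.default_rng() if rng is None else rng
    t = t0
    y = np.atleast_1d(np.asarray(y0, dtype=float))
    ts, ys = [t], [y]
    while t < t1 - 1e-12:
        h_step = min(h, t1 - t)
        y = implicit_euler_step(f, t, y, h_step)
        # Additive Gaussian perturbation representing discretisation uncertainty.
        y = y + sigma * h_step**1.5 * rng.standard_normal(y.shape)
        t += h_step
        ts.append(t)
        ys.append(y)
    return np.array(ts), np.array(ys)


if __name__ == "__main__":
    # Example: an ensemble of sample paths for the logistic ODE y' = y(1 - y).
    f = lambda t, y: y * (1.0 - y)
    paths = [randomised_implicit_euler(f, 0.1, 0.0, 5.0, 0.1)[1] for _ in range(20)]
    print("spread at final time:", np.std([p[-1, 0] for p in paths]))
```

Running the ensemble repeatedly with smaller step sizes shrinks the spread of the sample paths, which is the qualitative behaviour such randomised solvers are designed to exhibit; the calibration question discussed in the paper concerns how to choose the noise scale so that this spread faithfully reflects the true discretisation error.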

Published on Thursday 13 December 2018 at 12:00 UTC #publication #nips #neurips #prob-num #lie #teymur #calderhead