Tim Sullivan

Junior Professor in Applied Mathematics:
Risk and Uncertainty Quantification


UQ Talks: Hans Kersting

Next week Hans Kersting (MPI Tübingen) will give a talk in the UQ research seminar about “UQ in probabilistic ODE solvers”.

Time and Place. Tuesday 14 June 2016, 12:15–13:15, ZIB Seminar Room 2006, Zuse Institute Berlin, Takustrasse 7, 14195 Berlin

Abstract. In an ongoing push to construct probabilistic extensions of classic ODE solvers for application in statistics and machine learning, two recent papers have provided distinct methods that return probability measures instead of point estimates, based on sampling and filtering respectively. While both approaches leverage classical numerical analysis by building on well-studied existing solvers, their different constructions of probability measures strike different balances between a formal quantification of epistemic uncertainty and a low computational overhead.

On the one hand, Conrad et al. proposed to randomise existing non-probabilistic one-step solvers by adding suitably scaled Gaussian noise after every step, thereby inducing a probability measure over the solution space of the ODE which contracts to a Dirac measure on the true (unknown) solution at the order of convergence of the underlying classic numerical method. The computational cost of these methods is, however, significantly higher than that of classic solvers.
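For readers who want a concrete picture, here is a minimal sketch (not Conrad et al.'s code) of this randomisation applied to forward Euler. The noise scale sigma is illustrative, and the variance scaling \(h^{2p+1}\) with \(p = 1\) is the kind of choice that preserves the classical order of convergence in mean square; all function names are my own.

```python
import numpy as np

def randomised_euler(f, u0, t0, t1, h, sigma=1.0, rng=None):
    """One sample path of a randomised forward Euler solver.

    After each deterministic Euler step, zero-mean Gaussian noise with
    standard deviation sigma * h**1.5 (variance O(h**(2p+1)), p = 1 for
    Euler) is added, a scaling that preserves the classical order of
    convergence in mean square.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = int(round((t1 - t0) / h))
    ts = np.linspace(t0, t1, n + 1)
    us = np.empty((n + 1, np.size(u0)))
    us[0] = u0
    for k in range(n):
        drift = us[k] + h * f(ts[k], us[k])            # classic Euler step
        noise = sigma * h**1.5 * rng.standard_normal(us[k].shape)
        us[k + 1] = drift + noise                      # randomisation
    return ts, us

# An ensemble of sample paths approximates the induced measure, e.g. for
# the logistic ODE u' = u (1 - u):
f = lambda t, u: u * (1.0 - u)
paths = [randomised_euler(f, np.array([0.1]), 0.0, 5.0, 0.05)[1]
         for _ in range(100)]
```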

On the other hand, Schober et al. recast the estimation of the solution as state estimation by a Gaussian (Kalman) filter and proved that employing an integrated Wiener process prior returns a posterior Gaussian process whose maximum likelihood (ML) estimate matches the solution of classic Runge–Kutta methods. In an attempt to amend this method's rough uncertainty calibration while sustaining its negligible cost overhead, we propose a novel way to quantify uncertainty in this filtering framework by probing the gradient using Bayesian quadrature.
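Again purely as an illustration, the following sketch shows the predict/update cycle of such a filtering solver with a once-integrated Wiener process prior on the solution: the state is \((u, u')\), and each gradient evaluation at the predicted mean is treated as an observation of the derivative component of the state. The transition and process-noise matrices follow from the integrated Wiener process; the parameter names, zero initial covariance, and zero measurement variance are illustrative assumptions, not Schober et al.'s implementation.

```python
import numpy as np

def iwp_filter_ode(f, u0, du0, t0, t1, h, sigma2=1.0, R=0.0):
    """Gaussian (Kalman) filtering ODE solver sketch with a
    once-integrated Wiener process prior on the solution.

    State x = (u, u'); the gradient observation y_k = f(t_k, m_pred)
    is linked to the state by H = [0, 1]; R is the (here zero)
    variance attached to the gradient evaluations.
    """
    A = np.array([[1.0, h], [0.0, 1.0]])                 # transition
    Q = sigma2 * np.array([[h**3 / 3, h**2 / 2],
                           [h**2 / 2, h]])               # process noise
    H = np.array([[0.0, 1.0]])
    m = np.array([u0, du0])
    P = np.zeros((2, 2))
    ts, ms = [t0], [m.copy()]
    t = t0
    while t < t1 - 1e-12:
        m = A @ m                                        # predict
        P = A @ P @ A.T + Q
        y = f(t + h, m[0])                               # probe the gradient
        S = H @ P @ H.T + R                              # update
        K = P @ H.T / S
        m = m + (K * (y - m[1])).ravel()
        P = P - K @ H @ P
        t += h
        ts.append(t), ms.append(m.copy())
    return np.array(ts), np.array(ms)

# Example: u' = -u with u(0) = 1, so u'(0) = -1.
ts, ms = iwp_filter_ode(lambda t, u: -u, 1.0, -1.0, 0.0, 5.0, 0.1)
```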

Published on Monday 6 June 2016 at 11:00 UTC #event #uq-talk


Preprint: Probabilistic meshless methods for PDEs and BIPs

Jon Cockayne, Chris Oates, Mark Girolami and I have just uploaded a preprint of our latest paper, “Probabilistic meshless methods for partial differential equations and Bayesian inverse problems”, to the arXiv. This paper forms part of the push for probabilistic numerics in scientific computing.

Abstract. This paper develops a class of meshless methods that are well-suited to statistical inverse problems involving partial differential equations (PDEs). The methods discussed in this paper view the forcing term in the PDE as a random field that induces a probability distribution over the residual error of a symmetric collocation method. This construction enables the solution of challenging inverse problems while accounting, in a rigorous way, for the impact of the discretisation of the forward problem. In particular, this confers robustness to failure of meshless methods, with statistical inferences driven to be more conservative in the presence of significant solver error. In addition, (i) a principled learning-theoretic approach to minimise the impact of solver error is developed, and (ii) the challenging setting of inverse problems with a non-linear forward model is considered. The method is applied to parameter inference problems in which non-negligible solver error must be accounted for in order to draw valid statistical conclusions.
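The paper works in a much more general setting, but the core mechanism of symmetric collocation with a Gaussian process prior can be sketched in one dimension: place a GP prior on the solution \(u\) of \(-u'' = f\) on \([0, 1]\), condition on the differential operator at interior collocation points and on homogeneous Dirichlet boundary values, and read off a posterior mean together with a posterior variance that quantifies the discretisation error. The squared-exponential kernel, length-scale, and design below are my illustrative choices, not those of the paper.

```python
import numpy as np

ell = 0.2                                     # kernel length-scale (assumed)

def k(x, y):                                  # squared-exponential kernel
    r = x[:, None] - y[None, :]
    return np.exp(-r**2 / (2 * ell**2))

def Lk(x, y):                                 # L = -d^2/dy^2 applied to k;
    r = x[:, None] - y[None, :]               # by symmetry in (x - y)**2 the
    return (1/ell**2 - r**2/ell**4) * np.exp(-r**2 / (2 * ell**2))  # same
                                              # formula serves for L in x
def LLk(x, y):                                # L applied to both arguments
    r = x[:, None] - y[None, :]
    return (3/ell**4 - 6*r**2/ell**6 + r**4/ell**8) * np.exp(-r**2/(2*ell**2))

# Collocation design: Dirichlet boundary values plus interior operator
# observations of -u'' = f; with f below the exact solution is sin(pi x).
Xb = np.array([0.0, 1.0])
Xi = np.linspace(0.05, 0.95, 12)
f = lambda x: np.pi**2 * np.sin(np.pi * x)

G = np.block([[k(Xb, Xb),  Lk(Xb, Xi)],
              [Lk(Xi, Xb), LLk(Xi, Xi)]])
rhs = np.concatenate([np.zeros(2), f(Xi)])    # u = 0 on boundary, -u'' = f
jitter = 1e-10 * np.eye(len(rhs))
w = np.linalg.solve(G + jitter, rhs)

# Posterior mean and pointwise variance of the unknown PDE solution.
xs = np.linspace(0.0, 1.0, 200)
cross = np.hstack([k(xs, Xb), Lk(xs, Xi)])
mean = cross @ w
var = np.maximum(1.0 - np.einsum('ij,ji->i', cross,
                                 np.linalg.solve(G + jitter, cross.T)), 0.0)
```

The posterior variance is what drives the "conservative" statistical inferences mentioned in the abstract: where collocation points are sparse, it remains large, flagging non-negligible solver error.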

Published on Thursday 26 May 2016 at 10:00 UTC #publication #preprint


Preprint: Bayesian inversion with heavy-tailed stable priors

Just uploaded to the arXiv: “Well-posed Bayesian inverse problems and heavy-tailed stable Banach space priors”. This article builds on the function-space formulation of Bayesian inverse problems advocated by Stuart et al. to allow the prior to be heavy-tailed: not only may it not be exponentially integrable, as is the case for a Gaussian or Besov measure, it might not even have a well-defined mean, as in the case of the famous Cauchy distribution on \(\mathbb{R}\).

Abstract. This article extends the framework of Bayesian inverse problems in infinite-dimensional parameter spaces, as advocated by Stuart (Acta Numer. 19:451–559, 2010) and others, to the case of a heavy-tailed prior measure in the family of stable distributions, such as an infinite-dimensional Cauchy distribution, for which polynomial moments are infinite or undefined. It is shown that analogues of the Karhunen–Loève expansion for square-integrable random variables can be used to sample such measures. Furthermore, under weaker regularity assumptions than those used to date, the Bayesian posterior measure is shown to depend Lipschitz continuously in the Hellinger metric upon perturbations of the misfit function and observed data.
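As a toy illustration of the sampling result (and not the paper's construction in full generality), one can draw approximate samples from an "infinite-dimensional Cauchy distribution" by truncating a Karhunen–Loève-type series with i.i.d. standard Cauchy coefficients. The sine basis, decay exponent \(s\), and truncation level \(K\) below are illustrative assumptions; for 1-stable (Cauchy) coefficients, summable weights \(\sum_k \gamma_k < \infty\) are what make such a series converge.

```python
import numpy as np

def sample_cauchy_field(xs, K=200, s=2.0, rng=None):
    """Draw one sample of a heavy-tailed 'Cauchy' random function via a
    Karhunen-Loeve-type series u(x) = sum_k gamma_k xi_k phi_k(x), with
    xi_k i.i.d. standard Cauchy and gamma_k = k**(-s).

    Unlike the Gaussian case, the xi_k are not square-integrable and the
    resulting measure has no well-defined mean.
    """
    rng = np.random.default_rng() if rng is None else rng
    ks = np.arange(1, K + 1)
    xi = rng.standard_cauchy(K)                # heavy-tailed coefficients
    gamma = ks**(-s)                           # summable decay weights
    phi = np.sin(np.pi * np.outer(ks, xs))     # sine basis on [0, 1]
    return (gamma * xi) @ phi

xs = np.linspace(0.0, 1.0, 500)
u = sample_cauchy_field(xs)                    # one draw from the prior
```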

Published on Friday 20 May 2016 at 10:00 UTC #publication #preprint


The Digital Future: 75th Anniversary of the Zuse Z3

11 May 2016 marks the seventy-fifth anniversary of the unveiling of Konrad Zuse's Z3 computer. The Z3 was the world's first working programmable, fully automatic digital computer.

In celebration of this landmark achievement in computational science, the Zuse Institute, the Berlin–Brandenburg Academy of Sciences, and Der Tagesspiegel are organising a conference on “The Digital Future: 75 Years Zuse Z3 and the Digital Revolution”. For further information, see www.zib.de/zuse75.

Published on Monday 2 May 2016 at 12:00 UTC #event


UQ Talks: Steven Niederer

This week Steven Niederer (King's College London) will talk about “Linking physiology and cardiology through mathematical models”.

Time and Place. Thursday 28 April 2016, 11:00–12:00, Room 4027 of the Zuse Institute Berlin, Takustrasse 7, 14195 Berlin

Abstract. Much effort has gone into the analysis of cardiac function using mathematical and computational models. To fully realise the potential of these studies requires the translation of these models into clinical applications to aid in diagnosis and clinical planning.

Achieving this goal requires the integration of multiple disparate clinical data sets into a common modelling framework. To this end we have developed a coupled electro-mechanics model of the human heart. This model combines patient-specific anatomical geometry, active contraction, electrophysiology, tissue heterogeneities, and boundary conditions fitted to comprehensive imaging and catheter clinical measurements.

This multi-scale computational model allows us to link subcellular mechanisms to whole-organ function. It provides a novel tool for determining the mechanisms that underpin treatment outcomes, and offers the ability to infer hidden variables that provide new metrics of cardiac function. Specifically, we report on the application of these methods in patients receiving cardiac resynchronisation therapy and ablation for atrial fibrillation.

Published on Sunday 24 April 2016 at 08:00 UTC #event #uq-talk
