Tim Sullivan

Welcome!

I am Junior Professor in Applied Mathematics with Specialism in Risk and Uncertainty Quantification at the Free University of Berlin and Research Group Leader for Uncertainty Quantification at the Zuse Institute Berlin. I have wide interests in uncertainty quantification in the broad sense, understood as the meeting point of numerical analysis, applied probability and statistics, and scientific computation. On this site you will find information about how to contact me, my research, publications, and teaching activities.

SFB 1294

Kalman Lecture by Andrew Stuart at SFB1294

Andrew Stuart (Caltech) will give the inaugural Kalman Lecture of SFB 1294 Data Assimilation on the topic of “Large Graph Limits of Learning Algorithms”.

Time and Place. Friday 24 August 2018, 10:15–11:45, University of Potsdam, Campus Golm, Building 27, Lecture Hall 0.01

Abstract. Many problems in machine learning require the classification of high-dimensional data. One methodology to approach such problems is to construct a graph whose vertices are identified with data points, with edges weighted according to some measure of affinity between the data points. Algorithms such as spectral clustering, probit classification and the Bayesian level set method can all be applied in this setting. The goal of the talk is to describe these algorithms for classification, and analyze them in the limit of large data sets. Doing so leads to interesting problems in the calculus of variations, in stochastic partial differential equations and in Markov chain Monte Carlo, all of which will be highlighted in the talk. These limiting problems give insight into the structure of the classification problem, and into algorithms for it.
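
As a rough illustration (not necessarily the construction used in the talk), the affinity weights are often taken to be a kernel of the pairwise distances, and algorithms such as spectral clustering then act on the resulting graph Laplacian:

\[
W_{ij} = \exp\!\left( - \frac{\lVert x_{i} - x_{j} \rVert^{2}}{2 \varepsilon^{2}} \right), \qquad D_{ii} = \sum_{j=1}^{n} W_{ij}, \qquad L = D - W,
\]

where \(\varepsilon > 0\) is a bandwidth parameter. The large-graph limits of the title concern the behaviour of such operators, and of the classification algorithms built upon them, as the number of data points \(n \to \infty\).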

Published on Friday 3 August 2018 at 11:00 UTC #event #sfb1294

University of Oulu

Summer School / Workshop on Computational Mathematics and Data Science

Next month, 22–24 August 2018, along with Matt Dunlop (Helsinki), Tapio Helin (Helsinki), and Simo Särkkä (Aalto), I will be giving guest lectures at a Summer School / Workshop on Computational Mathematics and Data Science at the University of Oulu, Finland.

While the other lecturers will treat aspects such as machine learning using deep Gaussian processes, filtering, and MAP estimation, my lectures will tackle the fundamentals of the Bayesian approach to inverse problems in the function-space context, as increasingly demanded by modern applications.

“Well-posedness of Bayesian inverse problems in function spaces: analysis and algorithms”

The basic formalism of the Bayesian method is easily stated, and appears in every introductory probability and statistics course: the posterior probability is proportional to the prior probability times the likelihood. However, for inference problems in high or even infinite dimension, the Bayesian formula must be carefully formulated and its stability properties mathematically analysed. The paradigm advocated by Andrew Stuart and collaborators since 2010 is that one should study the infinite-dimensional Bayesian inverse problem directly and delay discretisation until the last moment. These lectures will study the role of various choices of prior distribution and likelihood and how they lead to well-posed or ill-posed Bayesian inverse problems. If time permits, we will also consider the implications for algorithms, and how Bayesian posteriors are summarised (e.g. by maximum a posteriori estimators).
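
For orientation, here is a minimal sketch of the function-space formulation advocated by Stuart (2010): the posterior \(\mu^{y}\) is specified not by a Lebesgue density but by its Radon–Nikodym derivative with respect to the prior \(\mu_{0}\),

\[
\frac{\mathrm{d} \mu^{y}}{\mathrm{d} \mu_{0}} (u) = \frac{\exp ( - \Phi(u; y) )}{Z(y)}, \qquad Z(y) = \int_{X} \exp ( - \Phi(u; y) ) \, \mu_{0} (\mathrm{d} u),
\]

where \(\Phi(u; y)\) is the negative log-likelihood (data misfit) of the observations \(y\) given the unknown \(u\). Well-posedness is then, roughly, the statement that \(\mu^{y}\) depends continuously on \(y\) (e.g. in Hellinger distance) under suitable conditions on \(\Phi\) and \(\mu_{0}\).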

Published on Saturday 21 July 2018 at 07:30 UTC #event #inverse-problems

Equivalence of weak and strong modes of measures on topological vector spaces

Preprint: Weak and strong modes

Han Cheng Lie and I have just uploaded a revised preprint of our paper, “Equivalence of weak and strong modes of measures on topological vector spaces”, to the arXiv. This addresses a natural question in the theory of modes (or maximum a posteriori estimators, in the case of the posterior measure of a Bayesian inverse problem) on an infinite-dimensional space \(X\). Such modes can be defined either strongly (à la Dashti et al. (2013), via a global maximisation) or weakly (à la Helin & Burger (2015), via comparisons along a dense subspace \(E \subset X\)). The question is: when are strong and weak modes equivalent? The answer turns out to be rather subtle: under reasonable uniformity conditions, the two kinds of modes are indeed equivalent, but finite-dimensional counterexamples exist when the uniformity conditions fail.

Abstract. A strong mode of a probability measure on a normed space \(X\) can be defined as a point \(u \in X\) such that the mass of the ball centred at \(u\) uniformly dominates the mass of all other balls in the small-radius limit. Helin and Burger weakened this definition by considering only pairwise comparisons with balls whose centres differ by vectors in a dense, proper linear subspace \(E\) of \(X\), and posed the question of when these two types of modes coincide. We show that, in a more general setting of metrisable vector spaces equipped with non-atomic measures that are finite on bounded sets, the density of \(E\) and a uniformity condition suffice for the equivalence of these two types of modes. We accomplish this by introducing a new, intermediate type of mode. We also show that these modes can be inequivalent if the uniformity condition fails. Our results shed light on the relationships among various notions of maximum a posteriori estimator in non-parametric Bayesian inference.
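
For concreteness, here is a sketch of the two definitions in the normed-space setting, with \(B_{r}(u)\) the open ball of radius \(r > 0\) centred at \(u\). Following Dashti et al. (2013), \(u \in X\) is a strong mode of \(\mu\) if

\[
\lim_{r \to 0} \frac{\mu(B_{r}(u))}{\sup_{v \in X} \mu(B_{r}(v))} = 1,
\]

while, following Helin & Burger (2015), \(u\) is a weak mode with respect to the dense subspace \(E \subset X\) if, for every \(h \in E\),

\[
\limsup_{r \to 0} \frac{\mu(B_{r}(u + h))}{\mu(B_{r}(u))} \leq 1.
\]

The equivalence question is precisely when these two notions select the same points \(u\).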

Published on Monday 9 July 2018 at 08:00 UTC #publication #preprint #inverse-problems #modes #map-estimator

ECMath Colloquium

ECMath Colloquium

This week's colloquium at the Einstein Center for Mathematics Berlin will be on the topic of “Stochastics meets PDE.” The speakers will be:

  • Antoine Gloria (Sorbonne): Stochastic homogenization: regularity, oscillations, and fluctuations
  • Peter Friz (TU Berlin and WIAS Berlin): Rough Paths, Stochastics and PDEs
  • Nicholas Dirr (Cardiff): Interacting Particle Systems and Gradient Flows

Time and Place. Friday 6 July 2018, 14:00–17:00, Humboldt-Universität zu Berlin, Main Building Room 2094, Unter den Linden 6, 10099 Berlin.

Published on Monday 2 July 2018 at 12:00 UTC #event

SIAM UQ18

SIAM UQ18 in Garden Grove

The fourth SIAM Conference on Uncertainty Quantification (SIAM UQ18) will take place at the Hyatt Regency Orange County, Garden Grove, California, this week, 16–19 April 2018.

As part of this conference, Mark Girolami, Philipp Hennig, Chris Oates and I will organise a mini-symposium on “Probabilistic Numerical Methods for Quantification of Discretisation Error” (MS4, MS17 and MS32).

Published on Saturday 14 April 2018 at 08:00 UTC #event