Tim Sullivan


Online Probabilistic Numerics Minisymposia

Like many international conferences, the SIAM Conference on Uncertainty Quantification planned for 24–27 March 2020 had to be postponed indefinitely in view of the Covid-19 pandemic. Undeterred by this, the speakers of four minisymposia on the theme of Probabilistic Numerical Methods have generously taken the time to adapt their talks for a new medium and record them for general distribution. The talks can be found at http://probabilistic-numerics.org/meetings/SIAMUQ2020/.

We hope that these talks will be of general interest. The speakers have also kindly agreed to answer questions in writing: if you would like to ask a question or contribute to the discussion, please submit it via this form by 10 May 2020.

Organised jointly by Alex Diaz, Alex Geßner, Philipp Hennig, Toni Karvonen, Chris Oates, and myself

Published on Friday 24 April 2020 at 10:00 UTC #event #siam #prob-num #diaz #gessner #hennig #karvonen #oates

SIAM UQ20 in Munich

The 2020 SIAM Conference on Uncertainty Quantification (UQ20) will take place from 24 to 27 March 2020 on the Garching campus (near Munich) of the Technical University of Munich (TUM), Germany. UQ20 is being organised in cooperation with the GAMM Activity Group on UQ.

The website for UQ20 is now live and the call for submissions is open.

More information about the scientific programme will be added in due course, but a number of plenary speakers are already confirmed.

Published on Monday 8 July 2019 at 12:00 UTC #event #siam

BMS Summer School on Mathematics of Deep Learning

This summer the Berlin Mathematical School will be offering the BMS Summer School 2019 on “Mathematics of Deep Learning”, 19–30 August 2019, at the Zuse Institute Berlin.

This summer school is aimed at graduate students in mathematics; postdocs are also encouraged to attend. It will offer lectures on the theory of deep neural networks, including related questions such as generalization, expressivity, and explainability, as well as on their applications (e.g. to PDEs, inverse problems, and specific real-world problems).

The first week will be devoted to the theory of deep neural networks, while the second week focuses on applications. The format centres on 1.5-hour lectures by international experts; there will also be a poster session for the participants.

Speakers include: Taco Cohen (Qualcomm), François Fleuret (IDIAP | EPF, Lausanne), Eldad Haber (University of British Columbia), Robert Jenssen (Tromsø), Andreas Krause (ETH Zurich), Gitta Kutyniok (TU Berlin), Ben Leimkuhler (U Edinburgh), Klaus-Robert Müller (TU Berlin), Frank Noé (FU Berlin), Christof Schütte (FU Berlin | ZIB), Vladimir Spokoiny (HU Berlin | WIAS), René Vidal (Johns Hopkins University).

For more information, see www.mathplus.de and www.math-berlin.de/academics/summer-schools. The deadline for application is 8 April 2019.

Published on Sunday 24 March 2019 at 08:00 UTC #event #bms #deep-learning

UQ at Oberwolfach

Last week, from 11 to 15 March 2019, the Mathematisches Forschungsinstitut Oberwolfach hosted its first full-size workshop on Uncertainty Quantification, organised by Oliver Ernst, Fabio Nobile, Claudia Schillings, and myself. This intensive and invigorating workshop brought together over fifty researchers in mathematics, statistics, and computational science from around the globe.

Photographs from the workshop can be found in the Oberwolfach Photo Collection.

Published on Saturday 16 March 2019 at 17:00 UTC #event #mfo #oberwolfach #ernst #nobile #schillings

Kalman Lecture by Andrew Stuart at SFB1294

Andrew Stuart (Caltech) will give the inaugural Kalman Lecture of SFB 1294 Data Assimilation on the topic of “Large Graph Limits of Learning Algorithms”.

Time and Place. Friday 24 August 2018, 10:15–11:45, University of Potsdam, Campus Golm, Building 27, Lecture Hall 0.01

Abstract. Many problems in machine learning require the classification of high-dimensional data. One methodology for approaching such problems is to construct a graph whose vertices are identified with the data points, with edges weighted according to some measure of affinity between the data points. Algorithms such as spectral clustering, probit classification, and the Bayesian level set method can all be applied in this setting. The goal of the talk is to describe these algorithms for classification and to analyze them in the limit of large data sets. Doing so leads to interesting problems in the calculus of variations, in stochastic partial differential equations, and in Markov chain Monte Carlo, all of which will be highlighted in the talk. These limiting problems give insight into the structure of the classification problem and into algorithms for it.
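The graph construction described in the abstract, and one of the algorithms applied to it (spectral clustering), can be illustrated with a minimal NumPy sketch. The Gaussian affinity, the bandwidth `sigma`, and the toy two-cluster data set are illustrative assumptions, not details taken from the talk:

```python
import numpy as np

def affinity_graph(X, sigma=1.0):
    """Weighted graph on data points: Gaussian (RBF) affinity, zero diagonal."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-d2 / (2.0 * sigma**2))
    np.fill_diagonal(W, 0.0)
    return W

def spectral_embedding(W, k=2):
    """Eigenvectors of the unnormalised graph Laplacian L = D - W
    for the k smallest eigenvalues."""
    D = np.diag(W.sum(axis=1))
    L = D - W
    vals, vecs = np.linalg.eigh(L)  # eigenvalues in ascending order
    return vecs[:, :k]

# Toy data: two well-separated Gaussian clusters in the plane.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, size=(20, 2)),
               rng.normal(3.0, 0.1, size=(20, 2))])

U = spectral_embedding(affinity_graph(X))
# The second ("Fiedler") eigenvector is approximately piecewise constant
# on the two clusters, so thresholding its sign recovers the clustering.
labels = (U[:, 1] > 0).astype(int)
```

In the large-data limit discussed in the talk, graph quantities such as this Laplacian converge (in a suitable sense) to continuum differential operators, which is the source of the variational and SPDE problems mentioned in the abstract.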

Published on Friday 3 August 2018 at 11:00 UTC #event #sfb1294 #stuart