INVITED SPEAKERS
Positive harmonic functions on the Heisenberg group
Yves Benoist, yves.benoist@u-psud.fr
Université Paris-Sud, France
A harmonic function on a group is a function which is equal to the average of its translates, the average being taken with respect to a finitely supported measure. First, we will survey the history of this notion.
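In symbols, under one common convention (with the measure acting by right translations; the left-handed convention is analogous), a function f on a group G is harmonic with respect to a finitely supported measure \mu when
\[
f(g) \;=\; \sum_{s \in G} \mu(s)\, f(gs) \qquad \text{for all } g \in G .
\]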
Then we will describe the extremal non-negative harmonic functions on the Heisenberg
group. We will see that the classical partition function occurs as such a function and that this
function is the only one beyond harmonic characters.
From Kähler-Einstein metrics to zeros of zeta functions
Robert Berman, robertb@chalmers.se
Chalmers Univ. of Technology, Sweden
While the existence of a unique Kähler-Einstein metric on a canonically polarized manifold X was already established in the seventies, there are very few explicit formulas available (even in the case of complex curves!). In this talk I will give a non-technical introduction to a probabilistic construction of Kähler-Einstein metrics, which, in particular, yields canonical approximations of the Kähler-Einstein metric on X. The approximating metrics in question are expressed as explicit period integrals, and the conjectural extension to the case of a Fano variety leads to some intriguing connections to zeros of certain Archimedean zeta functions.
Regularization methods in inverse problems and machine learning
Martin Burger, martin.burger@fau.de
FAU Erlangen-Nürnberg, Germany
Regularization methods are at the heart of the solution of inverse problems and are of increasing
importance in modern machine learning. In this talk we will discuss the modern theory of
(nonlinear) regularization methods and some applications. We will put a particular focus on
variational and iterative regularization methods and their connection with learning problems:
we discuss, on the one hand, the use of such regularization methods for learning problems and, on the other, the current route of learning regularization methods from data.
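As an illustration of the variational viewpoint (a generic sketch, not necessarily the formulation used in the talk), a regularized solution of an ill-posed operator equation Au = f is obtained by minimizing a data-fidelity term plus a penalty:
\[
u_\alpha \;\in\; \operatorname*{arg\,min}_{u} \; \tfrac{1}{2}\,\|A u - f\|^{2} \;+\; \alpha\, R(u),
\]
where R is a regularization functional and \alpha > 0 the regularization parameter. Learning-based approaches replace the hand-crafted R (or the entire reconstruction map) by one trained from data.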
From linear to nonlinear n-width: optimality in reduced modelling
Albert Cohen, cohen@ann.jussieu.fr
Sorbonne Université, France
The concept of n-width was introduced by Kolmogorov as a way of measuring the size of compact sets in terms of their approximability by linear spaces. From a numerical perspective it may be thought of as a benchmark for the performance of algorithms based on linear approximation. In recent years this concept has proved to be highly meaningful in the analysis of
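For reference, the Kolmogorov n-width of a compact set K in a normed space X, alluded to above, is
\[
d_n(K)_X \;=\; \inf_{\dim V \le n} \; \sup_{f \in K} \; \inf_{g \in V} \|f - g\|_X ,
\]
the outer infimum being taken over all linear subspaces V of X of dimension at most n.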