PLENARY SPEAKERS
purpose of this lecture is to illustrate this theory by focusing on the special case of Bernoulli
random matrices. Such matrices are particularly interesting as they arise as the adjacency
matrices of Erdős–Rényi graphs.
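To make this correspondence concrete (a brief illustrative aside; the notation $G(n,p)$ and the edge probability $p$ are assumptions, not taken from the abstract): the adjacency matrix $A = (A_{ij})$ of an Erdős–Rényi graph $G(n,p)$ on $n$ vertices is the symmetric random matrix with
\[
A_{ii} = 0, \qquad A_{ij} = A_{ji} \sim \mathrm{Bernoulli}(p) \quad \text{independently for } 1 \le i < j \le n,
\]
so each off-diagonal entry simply records whether the edge $\{i, j\}$ is present.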
The Mathematics of Deep Learning
Gitta Kutyniok, kutyniok@math.tu-berlin.de
Technische Universität Berlin, Germany
Despite the outstanding success of deep neural networks in real-world applications, ranging from science to public life, most of the related research is empirically driven and a comprehensive mathematical foundation is still missing. At the same time, these methods have already shown their impressive potential in mathematical research areas such as imaging sciences, inverse problems, or numerical analysis of partial differential equations, sometimes by far outperforming classical mathematical approaches for particular problem classes.
The goal of this lecture is to first provide an introduction to this vibrant new research
area. We will then survey recent advances in two directions, namely the development of a
mathematical foundation of deep learning and the introduction of novel deep learning-based
approaches to solve inverse problems and partial differential equations.
Geometric Valuation Theory
Monika Ludwig, monika.ludwig@tuwien.ac.at
TU Wien, Austria
Valuations on compact convex sets in $\mathbb{R}^n$ play an active and prominent role in geometry. They were critical in Dehn’s solution to Hilbert’s Third Problem in 1901. They are defined as follows. A function $Z$ whose domain is a collection of sets $\mathcal{S}$ and whose co-domain is an Abelian semigroup is called a valuation if
\[
Z(K) + Z(L) = Z(K \cup L) + Z(K \cap L),
\]
whenever $K, L, K \cup L, K \cap L \in \mathcal{S}$.
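A standard example, included here purely for illustration and not part of the abstract: the $n$-dimensional volume $V_n$ is such a valuation on the compact convex sets in $\mathbb{R}^n$, since Lebesgue measure satisfies inclusion–exclusion,
\[
V_n(K) + V_n(L) = V_n(K \cup L) + V_n(K \cap L),
\]
whenever all four sets involved are compact and convex. Likewise the Euler characteristic $V_0$, which equals $1$ on every nonempty compact convex set, is a valuation; both functionals reappear in Theorem 1 below.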
The first classification result for valuations on the space $\mathcal{K}^n$ of compact convex sets in $\mathbb{R}^n$ (where $\mathcal{K}^n$ is equipped with the topology induced by the Hausdorff metric) was established by Blaschke.

Theorem 1 (Blaschke). A functional $Z : \mathcal{K}^n \to \mathbb{R}$ is a continuous, translation and $\mathrm{SL}(n)$ invariant valuation if and only if there are $c_0, c_n \in \mathbb{R}$ such that
\[
Z(K) = c_0 V_0(K) + c_n V_n(K)
\]
for every $K \in \mathcal{K}^n$.
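As an explanatory aside on the easy direction of Theorem 1 (not part of the abstract): the volume $V_n$ is translation invariant and, since $|\det A| = 1$ for every $A \in \mathrm{SL}(n)$,
\[
V_n(AK + t) = |\det A|\, V_n(K) = V_n(K) \qquad \text{for all } A \in \mathrm{SL}(n),\; t \in \mathbb{R}^n,
\]
while $V_0$ is invariant under every affine map. Hence each combination $c_0 V_0 + c_n V_n$ is indeed a continuous, translation and $\mathrm{SL}(n)$ invariant valuation; the substance of the theorem is that there are no others.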
Probably the most famous result in the geometric theory of valuations is the Hadwiger characterization theorem.
Theorem 2 (Hadwiger). A functional $Z : \mathcal{K}^n \to \mathbb{R}$ is a continuous and rigid motion invariant valuation if and only if there are $c_0, \dots, c_n \in \mathbb{R}$ such that