Department of Mathematics

Talks in the week of 12.01.2026 to 18.01.2026


Wednesday, 14.01.2026: Tropical phylogenetics

Shelby Cox (MPI Leipzig)

Phylogenetic reconstruction often yields many competing trees, due to both biological and methodological variation. The space of phylogenetic trees is not convex, which complicates attempts to compare these reconstructions. By embedding trees in the tropically convex space trop M_0,n, we obtain a geometric framework for analyzing collections of trees on the same set of leaves. In this talk, I’ll explain how tropical convexity enables the computation of weighted median trees, and outline future directions in tropical phylogenetics.
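
As a rough illustration of the tropical convexity mentioned in the abstract (a minimal sketch in the max-plus convention, not code from the talk), the following Python snippet computes the tropical distance between two points and the projection of a point onto the tropical span of a set of points, via the classical Develin-Sturmfels formula. The matrix V of "trees" is invented toy data, not an actual embedding into trop M_0,n.

import numpy as np

def tropical_distance(x, y):
    # Tropical projective distance: max_j (x_j - y_j) - min_j (x_j - y_j).
    d = x - y
    return d.max() - d.min()

def tropical_projection(x, V):
    # Projection of x onto the tropical (max-plus) span of the rows of V:
    # pi(x)_j = max_i (lam_i + V[i, j]), with lam_i = min_j (x_j - V[i, j]).
    lam = (x[None, :] - V).min(axis=1)
    return (lam[:, None] + V).max(axis=0)

# Toy points standing in for trees embedded on the same leaf set.
V = np.array([[0.0, 3.0, 5.0, 5.0],
              [0.0, 5.0, 3.0, 5.0],
              [0.0, 5.0, 5.0, 3.0]])
x = np.array([0.0, 4.0, 4.0, 5.0])
p = tropical_projection(x, V)
print(p, tropical_distance(x, p))

The projection attains the tropical distance from x to the polytope spanned by the rows of V, which is what makes nearest-point and median-type constructions computable inside a tropically convex ambient space.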

Time: 10:15 - 11:15
Location: S10
Group: Oberseminar kombinatorische algebraische Geometrie
Host: Daniele Agostini, Hannah Markwig

Thursday, 15.01.2026: TBA

Ariadna León Quirós (Universität Tübingen)

TBA

Time: 14:00
Location: Seminar room C4H33 and virtually via Zoom; for the Zoom link, please contact Martina Neu
Group: Oberseminar Geometrische Analysis, Differentialgeometrie und Relativitätstheorie
Host: Carla Cederbaum, Gerhard Huisken, together with Jan Metzger (Potsdam)

Thursday, 15.01.2026: Stabilization of neural ODEs

Prof. Nicola Guglielmi (Gran Sasso Science Institute)

We propose a method to enhance the stability of a neural ordinary differential equation (neural ODE) by controlling the maximum error growth following a perturbation of the initial value. Since the bound is known to depend on the logarithmic norm of the Jacobian matrix associated with the neural ODE, we tune this quantity by applying the smallest possible perturbation (in Frobenius norm) to the weight matrix of the neural ODE. We achieve this by solving an eigenvalue optimization problem, for which we propose a nested algorithm. For a given perturbation size of the weight matrix, the inner level computes an optimal perturbation of that size, while at the outer level we tune the perturbation amplitude until we reach the desired uniform stability bound. We embed the proposed algorithm in the training of the neural ODE to improve its robustness to perturbations of the initial value, which may stem simply from noisy data but also from adversarial attacks on the neural classifier. Numerical experiments on the MNIST and FashionMNIST datasets show that an image classifier including a neural ODE in its architecture, trained according to our strategy, is more stable than the same classifier trained in the classical way, and is therefore more robust and less vulnerable to adversarial attacks. This is inspired by joint projects with A. De Marinis, A. Savostianov, S. Sicilia, and F. Tudisco.
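
To make the two-level structure concrete, here is a heavily simplified Python sketch (not the algorithm of the talk): it perturbs a plain matrix A rather than the weight matrix inside a neural ODE, so the inner level, which in the real method is a genuine eigenvalue optimization, reduces to clipping the top eigenvalues of the symmetric part of A; the outer level tunes the perturbation amplitude by bisection until the logarithmic 2-norm meets a desired bound. All function names and the target value are illustrative.

import numpy as np

def log_norm_2(A):
    # Logarithmic 2-norm mu2(A): largest eigenvalue of (A + A^T)/2.
    # For x' = A x it bounds error growth: |dx(t)| <= exp(mu2 t) |dx(0)|.
    return np.linalg.eigvalsh(0.5 * (A + A.T))[-1]

def inner_level(A, eps):
    # Best mu2 reachable with a Frobenius perturbation of size eps.
    # In this unstructured toy setting the optimum clips the eigenvalues of
    # the symmetric part above a level t with sum_i (lam_i - t)_+^2 = eps^2.
    lam, U = np.linalg.eigh(0.5 * (A + A.T))   # eigenvalues in ascending order
    lo, hi = lam[0] - eps, lam[-1]
    for _ in range(100):                       # bisection for the level t
        t = 0.5 * (lo + hi)
        if np.sum(np.maximum(lam - t, 0.0) ** 2) > eps ** 2:
            lo = t
        else:
            hi = t
    E = U @ np.diag(np.minimum(lam, hi) - lam) @ U.T
    return hi, E                               # achieved mu2 and perturbation

def outer_level(A, target, tol=1e-10):
    # Outer level: tune the amplitude eps by bisection until the inner
    # level reaches the desired uniform bound mu2 <= target.
    lam = np.linalg.eigvalsh(0.5 * (A + A.T))
    eps_hi = np.sqrt(np.sum(np.maximum(lam - target, 0.0) ** 2))  # a priori bracket
    lo, hi = 0.0, eps_hi
    while hi - lo > tol:
        eps = 0.5 * (lo + hi)
        mu, _ = inner_level(A, eps)
        if mu <= target:
            hi = eps
        else:
            lo = eps
    return A + inner_level(A, hi)[1]

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
A_stab = outer_level(A, target=-0.5)
print(log_norm_2(A), log_norm_2(A_stab))       # second value is ~ -0.5

Since mu2 depends only on the symmetric part, the minimal Frobenius perturbation in this toy problem is symmetric and essentially closed-form; the structured problem treated in the talk, where the perturbation must act through the weight matrix of the neural ODE, admits no such shortcut, which is why a nested eigenvalue-optimization algorithm is needed there.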

Time: 14:00
Location: 7E02
Group: Numerische Mathematik
Host: Christian Lubich

Friday, 16.01.2026: The Pontryagin Maximum Principle

Joachim Steck (Universität Tübingen)

Time: 14:15
Location: C4H33
Group: Oberseminar Differentialgeometrie und Topologie
Host: Loose, Bohle