Upcoming seminars of the Department of Mathematics
July 09, 2025 (Wednesday)
Stefano Massei
Numerical analysis seminar
at 09:00
July 09, 2025 (Wednesday)
Michela Redivo-Zaglia
Numerical analysis seminar
at 10:00
July 09, 2025 (Wednesday)
Teresa Laudadio
Numerical analysis seminar
at 15:00
July 10, 2025 (Thursday)
Lorenzo Piccinini
Numerical analysis seminar
at 09:00
July 10, 2025 (Thursday)
Ivan Markovsky
Numerical analysis seminar
at 09:00
July 11, 2025 (Friday)
Felipe Espreafico Guelerman
Algebra and geometry seminar
at 14:00, in Seminario II
One of the most famous results in enumerative geometry is the fact that, over an algebraically closed field, there are exactly 27 lines on a smooth cubic surface. One may ask, however, what happens if the field is not algebraically closed. Is there a way to get an "invariant count", i.e., a count that does not depend on the cubic? Over the reals, if one counts lines with signs, there are exactly 3 real lines. In general, using tools from the A^1-homotopy theory of Morel and Voevodsky, we can assign to each of the lines a local index in the set of square classes of the field, such that the sum of the indices is invariant. In our work, we consider general hypersurfaces and give a geometric interpretation of the local indices of lines, following ideas of Finashin and Kharlamov, who worked on the real case. This is joint work with Stephen McKean and Sabrina Pauli.
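The classical count mentioned above can be checked by hand on one explicit surface. The following sketch (an illustration added here, not part of the abstract) enumerates the 27 lines on the Fermat cubic x^3 + y^3 + z^3 + w^3 = 0: for each of the 3 ways to split the coordinates into two pairs and each pair of cube roots of unity (a, b), the line {u + a*v = 0, s + b*t = 0} lies on the surface, giving 3 × 3 × 3 = 27 lines.

```python
# Numerically verify the 27 lines on the Fermat cubic (illustrative sketch).
import itertools
import cmath

roots = [cmath.exp(2j * cmath.pi * k / 3) for k in range(3)]  # cube roots of 1
pairings = [((0, 1), (2, 3)), ((0, 2), (1, 3)), ((0, 3), (1, 2))]

def cubic(p):
    return sum(c ** 3 for c in p)

lines = []
for (i, j), (k, l) in pairings:
    for a, b in itertools.product(roots, roots):
        # Parametrize the line: coordinate j = s, i = -a*s, l = t, k = -b*t;
        # then the cubic restricts to (-a*s)^3 + s^3 + (-b*t)^3 + t^3 = 0
        # because a^3 = b^3 = 1.
        def point(s, t, i=i, j=j, k=k, l=l, a=a, b=b):
            p = [0j] * 4
            p[i], p[j], p[k], p[l] = -a * s, s, -b * t, t
            return p
        # Sanity check: two generic points of the line lie on the surface.
        assert all(abs(cubic(point(s, t))) < 1e-9
                   for s, t in [(1.0, 2.0), (0.3, -1.7)])
        lines.append(((i, j, k, l), a, b))

print(len(lines))  # 27 distinct lines
```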
July 23, 2025 (Wednesday)
Urte Adomaityte
in the seminar series SEMINARS IN MATHEMATICAL PHYSICS AND BEYOND
Mathematical physics seminar
at 12:00, in Seminario I
Understanding why some neural network minima generalize better than others is a fundamental challenge in deep learning. To analyze this question, we bridge two perspectives: the analysis of the geometric complexity of decision boundaries in input space and the spectral properties of the Hessian of the training loss in parameter space. We show that the top eigenvectors of the Hessian encode the decision boundary, with the number of spectral outliers correlating with its complexity, a finding consistent across datasets and architectures. This insight leads to a formulation of a proxy generalization measure based on alignment between training gradients and Hessian eigenvectors. Additionally, as the measure is blind to simplicity bias, we develop a novel margin estimation technique that, in combination with the generalization measure, helps analyze the generalization capabilities of neural networks trained on toy and real datasets.
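The Hessian spectrum discussed in the abstract can be inspected directly in a toy setting. The sketch below (added for illustration; the model, data, and all parameter choices are assumptions, not the speaker's setup) trains a logistic regression, for which the Hessian of the loss has the closed form X^T diag(p(1-p)) X / n, and extracts its top eigenvectors.

```python
# Toy sketch: compute and diagonalize the exact Hessian of a logistic loss.
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 10
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:2] = [3.0, -2.0]                  # only two informative directions
y = (X @ w_true + 0.1 * rng.normal(size=n) > 0).astype(float)

w = np.zeros(d)
for _ in range(200):                      # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    w -= 0.5 * X.T @ (p - y) / n

p = 1.0 / (1.0 + np.exp(-(X @ w)))
H = (X * (p * (1 - p))[:, None]).T @ X / n   # exact Hessian at w
eigvals, eigvecs = np.linalg.eigh(H)          # ascending eigenvalue order
top = eigvecs[:, -1]                          # leading curvature direction
print(eigvals[-3:])                           # inspect the largest eigenvalues
```

In this toy case the few large ("outlier") eigenvalues reflect the directions that shape the decision boundary, which is the kind of correspondence the talk studies in far more general models.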
July 24, 2025 (Thursday)
Gian Paolo Leonardi
Mathematical physics and probability seminar
at 14:30, in Aula III, via Selmi 2
Quantization is a key technique for reducing deep learning models' memory footprint and computational cost. However, traditional quantization methods are not supported by general theoretical results. Moreover, they typically overlook the role of the geometry induced on the parameter space by the structure of the model and by the training dynamics.
Our main theoretical finding consists of identifying an appropriate metric to be used when projecting weights onto a quantization grid after training. More precisely, we consider suitably scaled, over-parametrized deep neural networks with L layers, whose parameters are initialized as i.i.d. normal variables with zero mean and unit variance, and subsequently trained with gradient descent. Then, we rigorously prove that the natural quantization metric is defined by the Gauss-Newton seminorm whenever the final point of the training dynamics satisfies suitable sparsity assumptions. Specifically, we quantify the contribution of this seminorm with high probability over the initialization as the dimension of the parameter space becomes large enough.
Based on this theoretical result, we propose a novel, post-training quantization algorithm called GeoPTQ, which is shown to outperform classical quantization schemes in some preliminary experiments.
This is a joint work with Massimiliano Datres (LMU Munich) and Andrea Agazzi (Univ. Bern).
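For readers unfamiliar with quantization, the baseline step the abstract refines, projecting trained weights onto a quantization grid, looks as follows. This is a generic uniform round-to-nearest sketch added for illustration; it is not GeoPTQ, which replaces the plain Euclidean projection metric with the Gauss-Newton seminorm.

```python
# Generic post-training quantization: project weights onto a uniform grid.
import numpy as np

def quantize_uniform(w, bits=4):
    """Round each weight to the nearest of 2**bits uniformly spaced levels."""
    lo, hi = w.min(), w.max()
    scale = (hi - lo) / (2 ** bits - 1)       # grid spacing
    q = np.round((w - lo) / scale) * scale + lo
    return q, scale

rng = np.random.default_rng(1)
w = rng.normal(size=1000)                     # stand-in for trained weights
q, scale = quantize_uniform(w, bits=4)
print(np.abs(w - q).max() <= scale / 2 + 1e-12)   # per-weight error bound
```

The point of the seminar's result is that "nearest" should be measured in the right metric: with curvature information, two grid points equidistant in Euclidean norm can have very different effects on the loss.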
August 20 (Wednesday)
August 28 (Thursday)
Guozhen Lu
in the seminar series SEMINARI DI ANALISI MATEMATICA BRUNO PINI
Mathematical analysis seminar
at 12:00, in Aula Enriques
online seminar • link to the meeting
September 01–05, 2025 (Monday to Friday)
Conference
Summer School sponsored by the CIME Foundation
Local expenses for speakers are paid by CIME
Travel expenses for speakers to be reimbursed using the funds of the ERC project DAT
September 01–05, 2025 (Monday to Friday)
Conference
The bond between Physics and Artificial Intelligence has never been stronger, as underscored by last year's Nobel Prize, awarded to Hinton and Hopfield, whose discoveries were central topics throughout the conference.
Only a few months have passed since our first edition, yet this short time has been enough for major developments to emerge, drastically reshaping our world from political, social, and scientific perspectives.
This rapid progress demands that the scientific community work to explain a wide range of phenomena that lie at the heart of AI’s functioning but remain largely understood only at an empirical level. Despite this, AI's success is so remarkable that it is now embedded in numerous devices, many of which are used daily by most of us. Addressing fundamental theoretical questions and achieving a deeper understanding of the basic building blocks of machine learning models and training algorithms remains crucial.
Given the high-dimensional nature of real-world data and the vast number of tunable parameters in machine learning models, statistical physics and high-dimensional inference provide a natural framework. Since Gardner and Derrida’s seminal works on the perceptron in the 1980s, the interplay between disciplines has led to a wealth of innovative ideas and insights into the functioning of neural networks.
This multidisciplinary conference aims to bring together researchers from statistical physics, mathematical physics, and machine learning. Our goal is to provide diverse perspectives on key topics in contemporary machine learning, including:
- Associative memories, diffusion models, energy-based models.
- Representation learning and structured data modeling.
- Language modeling, self-supervised learning, reasoning, and alternative learning paradigms.
- Mathematical physics approaches to high-dimensional probability, spin glasses, and Boltzmann machines.
- Theoretical aspects of neural networks.
The event will take place from September 1st to 5th, 2025, in Roccella Jonica, Calabria (Italy).
September 07–13, 2025 (Sunday to Saturday)
Conference
September 08, 2025 (Monday)
Mateusz Rzepecki
in the seminar series LOGIC, CATEGORIES, AND APPLICATIONS SEMINAR
Logic seminar
at 14:00, in Seminario II
September 08, 2025 (Monday)
Johannes Rau
Algebra and geometry seminar
at 14:00, in Aula Pincherle
September 09, 2025 (Tuesday)
Nadia Oudjane
in the seminar series STOCHASTICS AND APPLICATIONS - 2025
Mathematical finance and probability seminar
at 11:00, room to be announced
online seminar • link to the meeting
With the massive integration of renewable energies (photovoltaic (PV) and wind power) into the power grid, new uncertainties are affecting the power balance. At the same time, advances in "smart" technologies and batteries offer new flexibility, with the possibility of controlling the consumption of a large number of electrical appliances (electric vehicle recharging, heat pumps, etc.). In this framework, a major technical challenge is to optimize the management of this large number of heterogeneous flexibilities distributed across the network to help balance the system. This constitutes a large-scale optimization problem under uncertainty, which can benefit from mean-field approximations.
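A minimal numerical illustration of why mean-field approximations are natural here (added for context; the device model and all numbers are assumptions, not from the talk): the aggregate consumption of N independent, heterogeneous flexible devices concentrates around its expectation at rate O(1/sqrt(N)), so a planner can optimize against the mean field instead of tracking every appliance.

```python
# Toy concentration experiment: per-device average load vs. fleet size N.
import numpy as np

rng = np.random.default_rng(2)

def aggregate_deviation(n_devices, n_trials=200):
    # Heterogeneous devices: exponential loads with random per-device means.
    means = rng.uniform(0.5, 2.0, size=n_devices)
    loads = rng.exponential(means, size=(n_trials, n_devices))
    per_device_avg = loads.mean(axis=1)      # aggregate consumption / N
    return np.std(per_device_avg)            # fluctuation across trials

small, large = aggregate_deviation(100), aggregate_deviation(10_000)
print(small, large)   # fluctuation shrinks roughly like 1/sqrt(N)
```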
September 12, 2025 (Friday)
Alexander Kuznetsov
Algebra and geometry seminar
at 12:00, room to be announced
Consider a compact degeneration of curves, i.e., a flat family of curves over a disc such that the central fiber is a 1-nodal curve with two irreducible components and all other fibers are smooth. I will explain how the family of derived categories of the smooth fibers is related to a category glued from the derived categories of the components of the central fiber. This is joint work in progress with Valery Alexeev.
September 18 (Thursday)
September 19, 2025 (Friday)
Luca Regis
Mathematical finance, interdisciplinary, and probability seminar
at 14:30, room to be announced
October 09, 2025 (Thursday)
Moritz Egert
in the seminar series SEMINARI DI ANALISI MATEMATICA BRUNO PINI
Mathematical analysis seminar
at 16:00, in Aula Enriques
online seminar • link to the meeting
TBA