Periodic seminars
DEPARTMENT OF MATHEMATICS

SCUBE

A series of semester seminars organized at the Department of Mathematics, University of Bologna. The presentations mainly cover Numerical Linear Algebra problems and their wide range of applications. Broader contributions are very welcome.
Organized by: Davide Palitta and Valeria Simoncini

Past seminars

Peridynamics is a nonlocal reformulation of continuum mechanics that can incorporate singularities, since it involves no spatial partial derivatives. As a consequence, it assumes long-range interactions among material particles and is able to describe the formation and evolution of fractures. The discretization of such a nonlocal model requires sophisticated numerical tools for approximating its solutions. Due to the presence of a convolution product in the definition of the nonlocal operator, we propose a spectral collocation method based on Fourier and Chebyshev polynomials to discretize the model. This choice benefits from the FFT algorithm and allows us to deal efficiently with the imposition of non-periodic boundary conditions through a volume penalization technique. We prove the convergence of such methods in the framework of fractional Sobolev spaces and discuss numerically the stability of the scheme. We also investigate the qualitative aspects of the convolution kernel and of the nonlocality parameters by solving an inverse peridynamic problem with a Physics-Informed Neural Network activated by suitable radial basis functions. Additionally, we propose a virtual element approach to obtain the solution of a nonlocal diffusion problem. The main feature of the proposed technique is that we construct a nonlocal counterpart of the divergence operator in order to obtain a weak formulation of the peridynamic model and exploit the analogies with known results in the context of Galerkin approximation. We prove the convergence of the proposed method and provide several simulations to validate our results.

References:
[1] Lopez, L., Pellegrino, S. F. (2021). A spectral method with volume penalization for a nonlinear peridynamic model. International Journal for Numerical Methods in Engineering, 122(3), 707–725. https://doi.org/10.1002/nme.6555
[2] Lopez, L., Pellegrino, S. F. (2022). A space-time discretization of a nonlinear peridynamic model on a 2D lamina. Computers and Mathematics with Applications, 116, 161–175. https://doi.org/10.1016/j.camwa.2021.07.0041
[3] Lopez, L., Pellegrino, S. F. (2022). A non-periodic Chebyshev spectral method avoiding penalization techniques for a class of nonlinear peridynamic models. International Journal for Numerical Methods in Engineering, 123(20), 4859–4876. https://doi.org/10.1002/nme.7058
[4] Difonzo, F. V., Lopez, L., Pellegrino, S. F. (2024). Physics informed neural networks for an inverse problem in peridynamic models. Engineering with Computers. https://doi.org/10.1007/s00366-024-01957-5
[5] Difonzo, F. V., Lopez, L., Pellegrino, S. F. (2024). Physics informed neural networks for learning the horizon size in bond-based peridynamic models. Computer Methods in Applied Mechanics and Engineering. https://doi.org/10.1016/j.cma.2024.117727
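The convolution structure that makes the FFT attractive here can be sketched on a uniform periodic grid; the box kernel, horizon, and grid size below are purely illustrative choices, not those of the cited papers:

```python
import numpy as np

# Sketch of the FFT-based evaluation of a convolution-type nonlocal operator
#   L[u](x) = \int C(x - y) (u(y) - u(x)) dy
# on a uniform periodic grid. The box kernel and the horizon delta below are
# illustrative, not the ones used in the referenced works.

n = 256
length = 2 * np.pi
x = np.linspace(0.0, length, n, endpoint=False)
h = length / n

delta = 0.5                                                  # horizon of the nonlocal interaction
C = np.where(np.minimum(x, length - x) <= delta, 1.0, 0.0)   # periodic box kernel

u = np.sin(x)

# Convolution theorem: C * u = ifft(fft(C) * fft(u)); cost O(n log n) per evaluation
conv = np.real(np.fft.ifft(np.fft.fft(C) * np.fft.fft(u))) * h
Lu = conv - u * (C.sum() * h)                                # nonlocal operator applied to u
```

Each application of the operator costs O(n log n) instead of the O(n^2) of a direct quadrature of the convolution, which is what makes spectral collocation competitive for these nonlocal models.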
Accurately estimating the depth of a landslide's failure surface is essential for hazard prediction; however, most classical methods rely on overly simplistic assumptions [1]. In this work, we present the landslide thickness estimation problem as an inverse problem Aw = b, obtained from the discretization of the thickness equation [2]:

∂(h_f v_x)/∂x + ∂(h_f v_y)/∂y = −∂ζ/∂t,   (1)

where the forward operator A contains information on the surface velocity (v_x, v_y), the right-hand side b corresponds to the surface elevation change ∂ζ/∂t, and w is the thickness h_f. By employing a regularization approach, the inverse problem is reformulated as an optimization problem. In real-world scenarios, often no information is available on either the noise type or the noise level affecting the data. In this context, the correct choice of the regularization parameter becomes a pressing issue. We propose a method to determine this parameter in a fully automatic way for the thickness inversion problem. Results will be shown on both synthetic data generated by landslide simulation software and data measured from real-world landslides.

References:
[1] Jaboyedoff, M., Carrea, D., Derron, M. H., Oppikofer, T., Penna, I. M., Rudaz, B. (2020). A review of methods used to estimate initial landslide failure surface depths and volumes. Engineering Geology, 267, 105478.
[2] Booth, A. M., Lamb, M. P., Avouac, J. P., Delacourt, C. (2013). Landslide velocity, thickness, and rheology from remote sensing: La Clapière landslide, France. Geophysical Research Letters, 40, 4299–4304.
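The regularized reformulation of Aw = b can be illustrated with a standard Tikhonov solve on a synthetic ill-conditioned system; the matrix, data, and parameter value are made up for the sketch, and the automatic parameter-selection rule that is the subject of the talk is not reproduced:

```python
import numpy as np

# Tikhonov-regularized solve of an ill-conditioned system A w = b, illustrating
# the reformulation  min_w ||A w - b||^2 + lam^2 ||w||^2  as a least-squares
# problem. A, b, and lam are synthetic stand-ins.

rng = np.random.default_rng(0)
n = 50
A = np.vander(np.linspace(0.0, 1.0, n), n, increasing=True)  # severely ill-conditioned
w_true = np.exp(-np.linspace(0.0, 1.0, n))                   # synthetic "thickness" profile
b = A @ w_true + 1e-6 * rng.standard_normal(n)               # noisy data

def tikhonov(A, b, lam):
    """Minimize ||A w - b||^2 + lam^2 ||w||^2 via an augmented least-squares system."""
    m = A.shape[1]
    A_aug = np.vstack([A, lam * np.eye(m)])
    b_aug = np.concatenate([b, np.zeros(m)])
    return np.linalg.lstsq(A_aug, b_aug, rcond=None)[0]

w_reg = tikhonov(A, b, 1e-4)   # moderate lam: stabilized reconstruction
```

Larger values of lam shrink the solution toward zero while smaller ones let the noise dominate; picking lam automatically when neither the noise type nor its level is known is precisely the difficulty the talk addresses.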
In this seminar, I will talk about Objective Function Free Optimization (OFFO) in the context of pruning the parameters of a given model. OFFO algorithms are methods in which the objective function is never computed; they rely only on derivative information, i.e., on the gradient in the first-order case. I will give an overview of the main OFFO methods, focusing on adaptive algorithms such as Adagrad, Adam, RMSprop, and ADADELTA: gradient methods that share the common characteristic of depending only on current and past gradient information to adaptively determine the step size at each iteration. Next, I will briefly discuss the most popular pruning approaches. As the name implies, pruning a model, typically a neural network, refers to the process of reducing its size and complexity, usually by removing parameters that are considered unnecessary for its performance. Pruning emerges as a compression technique for neural networks, an alternative to matrix and tensor factorization or quantization. I will mainly focus on pruning-aware methods that use specific rules to classify parameters as relevant or irrelevant at each iteration, enhancing convergence to a solution of the problem at hand that is robust to pruning irrelevant parameters after training. Finally, I will introduce a novel deterministic algorithm, both adaptive and pruning-aware, based on a modified Adagrad scheme that converges to a pruning-robust solution with complexity $\log(k)/k$. I will illustrate some preliminary results on different applications.
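The ingredients can be sketched in a few lines: an Adagrad iteration (objective-function-free, only gradients are used) followed by a naive magnitude-based pruning mask. Both the quadratic test problem and the pruning rule are illustrative stand-ins, not the algorithm of the talk:

```python
import numpy as np

# Minimal Adagrad iteration: the objective value is never computed, and the
# per-coordinate step size adapts to past gradient information (OFFO setting).
# The sparse quadratic target and the final magnitude-based mask are
# illustrative stand-ins for the pruning-aware rule of the talk.

w_star = np.zeros(20)
w_star[:5] = np.linspace(1.0, 2.0, 5)      # only 5 "relevant" parameters

def grad(w):
    # Gradient of f(w) = 0.5 * ||w - w_star||^2 (f itself is never evaluated)
    return w - w_star

w = np.ones(20)
G = np.zeros(20)                           # accumulated squared gradients
eta, eps = 0.5, 1e-8

for _ in range(500):
    g = grad(w)
    G += g * g                             # Adagrad accumulator
    w -= eta * g / (np.sqrt(G) + eps)      # per-coordinate adaptive step

mask = np.abs(w) > 1e-2                    # prune small-magnitude parameters
w_pruned = w * mask
```

On this toy problem the iterates converge to the sparse target, so pruning the small-magnitude coordinates after training leaves the solution essentially unchanged, which is the kind of robustness a pruning-aware scheme aims for by design.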
3D shape analysis tasks often involve characterizing a 3D object by an invariant, computationally efficient, and discriminative numerical representation, called a shape descriptor. Among these, spectral shape descriptors have become increasingly widespread, since the spectrum is an isometry invariant and is thus independent of the object's representation, including parametrization and spatial position [1]. However, large spectral decompositions and the choice of the most significant eigen-couples become computationally expensive for large sets of data points. We introduce a concise learning-based shape descriptor, computed through a Generalized Graph Neural Network (G-GNN) [2]. The G-GNN is an unsupervised graph neural network leveraging spectral convolutional operators derived from a learnable, energy-driven evolution process. Applied to a 3D polygonal mesh, the G-GNN learns features acting as a global shape descriptor of the 3D object. Using a Dirichlet-like energy on the 3D mesh leads to a spectral, intrinsic shape descriptor tied to the isometry-invariant Laplace-Beltrami operator. Finally, by equipping the G-GNN with a suitable shape retrieval loss, the spectral shape descriptor can be employed in nonlinear dimensionality reduction problems, since it defines an optimal embedding that squeezes the latent information of a 3D model into a compact, low-dimensional representation of the 3D object.

References:
[1] Reuter, M., Wolter, F.-E., Peinecke, N. (2006). Laplace–Beltrami spectra as 'Shape-DNA' of surfaces and solids. Computer-Aided Design, 38(4), 342–366. https://doi.org/10.1016/j.cad.2005.10.011
[2] Lazzaro, D., Morigi, S., Zuzolo, P. (2024). Learning intrinsic shape representations via spectral mesh convolutions. Neurocomputing, 598, 128152. https://doi.org/10.1016/j.neucom.2024.128152
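The spectral idea behind "Shape-DNA"-style descriptors [1] can be sketched with a combinatorial graph Laplacian on toy connectivity; a real pipeline would use the cotangent Laplace-Beltrami operator of the 3D mesh, and the cycle graph below is only a stand-in:

```python
import numpy as np

# Toy version of a spectral shape descriptor: the leading eigenvalues of a
# Laplacian built from mesh connectivity. Here a combinatorial graph Laplacian
# on a cycle (a discretized closed curve) stands in for the cotangent
# Laplace-Beltrami operator used on real 3D meshes.

def graph_laplacian(edges, n):
    """Combinatorial Laplacian L = D - A from an undirected edge list."""
    A = np.zeros((n, n))
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0
    return np.diag(A.sum(axis=1)) - A

n = 8
edges = [(i, (i + 1) % n) for i in range(n)]   # cycle on 8 vertices
L = graph_laplacian(edges, n)

evals = np.linalg.eigvalsh(L)    # ascending, isometry-invariant spectrum
descriptor = evals[1:6]          # skip the trivial zero eigenvalue
```

Because the spectrum depends only on the (intrinsic) operator and not on how the shape sits in space, two isometric shapes yield the same descriptor; the learned G-GNN descriptor of the talk keeps this invariance while avoiding the cost of large eigendecompositions.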