Quaderni MOX
Publications
of the MOX Laboratory for Modeling and Scientific Computing (Laboratorio di Modellistica e Calcolo Scientifico MOX). The reports mainly concern numerical analysis, statistics, and mathematical modeling applied to problems of engineering interest. The MOX Laboratory website is available
at mox.polimi.it
Found 1238 items
-
43/2024 - 15/06/2024
Antonietti, P.F.; Corti, M.; Martinelli, G.
Polytopal mesh agglomeration via geometrical deep learning for three-dimensional heterogeneous domains | Abstract | | Agglomeration techniques are important to reduce the computational costs of numerical simulations and stand at the basis of multilevel algebraic solvers. To automatically perform the agglomeration of polyhedral grids, we propose a novel Geometrical Deep Learning-based algorithm that can exploit the geometrical and physical information of the underlying computational domain to construct the agglomerated grid while simultaneously guaranteeing the agglomerated grid's quality. In particular, we propose a bisection model based on Graph Neural Networks (GNNs) to partition a suitable connectivity graph of computational three-dimensional meshes. The new approach has a high online inference speed and can simultaneously process the graph structure of the mesh, its geometrical information (e.g. element volumes, center coordinates), and the physical information of the domain (e.g. physical parameters). Taking advantage of this new approach, our algorithm can automatically agglomerate meshes of a domain composed of heterogeneous media. The proposed GNN techniques are compared with the k-means algorithm and METIS, standard approaches for graph partitioning that process only the connectivity information of the mesh. We demonstrate that our algorithm outperforms the available approaches in terms of quality metrics and runtimes. Moreover, it shows a good level of generalization when applied to more complex geometries, such as three-dimensional geometries reconstructed from medical images. Finally, the capabilities of the model in agglomerating heterogeneous domains are tested on problems containing microstructures and on a complex geometry such as the human brain. |
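A minimal sketch of the classical baseline this abstract compares against: bisecting a mesh connectivity graph with a purely combinatorial method. Spectral bisection via the Fiedler vector stands in here for METIS; the GNN model itself, and any real mesh, are not reproduced. The toy graph below is invented for illustration.

```python
# Sketch, not the paper's method: spectral bisection of a connectivity
# graph using the Fiedler vector (eigenvector of the 2nd-smallest
# eigenvalue of the graph Laplacian), a METIS-like baseline.
import numpy as np

def spectral_bisection(adjacency):
    """Split graph nodes into two parts using the Fiedler vector."""
    degree = np.diag(adjacency.sum(axis=1))
    laplacian = degree - adjacency
    _, eigvecs = np.linalg.eigh(laplacian)
    fiedler = eigvecs[:, 1]
    return fiedler >= np.median(fiedler)

# Toy connectivity graph: two 3-cliques joined by a single edge.
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
part = spectral_bisection(A)   # separates {0,1,2} from {3,4,5}
```

Note that this uses only connectivity, which is exactly the limitation the abstract points out: element volumes, centroid coordinates, and physical parameters never enter the partition.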
-
42/2024 - 08/06/2024
Fois, M.; Katili, M. A.; de Falco, C.; Larese, A.; Formaggia, L.
Landslide run-out simulations with depth-averaged models and integration with 3D impact analysis using the Material Point Method | Abstract | | Landslides pose a significant threat to human safety and the well-being of communities, making them one of the most challenging natural phenomena. Their potential for catastrophic consequences, both in terms of human lives and economic impact, is a major concern, and their inherent unpredictability adds to the complexity of managing the associated risks. It is therefore crucial to continuously monitor areas susceptible to landslides. In situ detection systems such as piezometers and strain gauges play a vital role in accurately monitoring internal pressures and surface movements in the targeted areas, while satellite surveys contribute detailed topographic and elevation data for the study area. However, relying solely on empirical monitoring is insufficient for the effective management of hazardous situations, especially in terms of preventive measures. This study provides advanced simulations of mudflows and fast landslides using depth-averaged particle methods, specifically the Material Point Method adapted to shallow-water conditions (Depth-Averaged Material Point Method). The numerical method has been parallelized and validated through benchmark tests and real-world cases. Furthermore, the investigation extends to coupling the depth-averaged formulation with a three-dimensional one, in order to obtain a detailed description of the impact phase of the sliding material on barriers and membranes. The multidimensional approach and its validation on real cases provide a robust foundation for a deeper and more accurate understanding of the behavior of mudflows and fast landslides. |
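For orientation, the PDE system underlying any depth-averaged run-out model is the shallow-water system. The sketch below takes one Lax-Friedrichs finite-volume step for its homogeneous 1D form; it is a schematic stand-in only, since the paper discretizes with material points rather than a fixed grid, and the dam-break initial condition is invented.

```python
# Schematic only: one Lax-Friedrichs step for the 1D shallow-water
# (depth-averaged) equations, with h = depth, q = h*u = discharge.
# The Depth-Averaged MPM of the paper solves the same system with particles.
import numpy as np

def shallow_water_step(h, q, dx, dt, g=9.81):
    """One Lax-Friedrichs update on a periodic grid."""
    def flux(h, q):
        return np.array([q, q**2 / np.maximum(h, 1e-12) + 0.5 * g * h**2])
    u = np.array([h, q])
    f = flux(h, q)
    u_new = 0.5 * (np.roll(u, 1, axis=1) + np.roll(u, -1, axis=1)) \
        - 0.5 * dt / dx * (np.roll(f, -1, axis=1) - np.roll(f, 1, axis=1))
    return u_new[0], u_new[1]

# Dam-break-like initial condition on a periodic domain.
x = np.linspace(0.0, 1.0, 200, endpoint=False)
h0 = np.where(x < 0.5, 2.0, 1.0)
q0 = np.zeros_like(x)
h1, q1 = shallow_water_step(h0, q0, dx=x[1] - x[0], dt=1e-3)
```

On a periodic grid this update conserves total water volume exactly, which is the kind of property one checks in the benchmark validations the abstract mentions.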
-
41/2024 - 05/06/2024
Bergonzoli, G.; Rossi, L.; Masci, C.
Ordinal Mixed-Effects Random Forest | Abstract | | We propose an innovative statistical method, called Ordinal Mixed-Effects Random Forest (OMERF), that extends the use of random forests to the analysis of hierarchical data and ordinal responses. The model preserves the flexibility and the ability to model complex patterns of both categorical and continuous variables, typical of tree-based ensemble methods, and, at the same time, accounts for the structure of hierarchical data, modeling the dependence structure induced by the grouping and allowing statistical inference at all data levels. A simulation study is conducted to validate the performance of the proposed method and to compare it with that of other state-of-the-art models. The application of OMERF is exemplified in a case study focusing on predicting student performance using data from the Programme for International Student Assessment (PISA) 2022. The model identifies discriminating student characteristics and estimates the school effect. |
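A dependency-free sketch of the mixed-effects iteration that methods in this family build on: alternate between fitting a fixed-effects predictor on the data with the current random effects removed, and re-estimating groupwise random intercepts from the residuals. The random forest and the ordinal link are deliberately replaced by a plain least-squares slope here; the data are synthetic.

```python
# Hedged sketch of a MERF-style alternating scheme (not OMERF itself):
# the "forest" step is a simple least-squares fit so the example stays
# self-contained; the random-effects step is a groupwise residual mean.
import numpy as np

rng = np.random.default_rng(0)
groups = np.repeat(np.arange(4), 25)        # 4 groups (e.g. schools), 25 obs each
x = rng.normal(size=100)
b_true = np.array([-1.0, 0.0, 1.0, 2.0])    # true random intercepts
y = 3.0 * x + b_true[groups] + 0.1 * rng.normal(size=100)

b = np.zeros(4)                             # random-intercept estimates
for _ in range(20):
    # (1) fixed-effects fit on y minus current random effects
    resid = y - b[groups]
    slope = (x @ resid) / (x @ x)           # stand-in for the forest fit
    # (2) random intercepts from groupwise residual means
    resid2 = y - slope * x
    b = np.array([resid2[groups == g].mean() for g in range(4)])
```

The two steps converge quickly here because the covariate is uncorrelated with group membership; the appeal of the forest version is that step (1) can capture arbitrary nonlinear fixed effects.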
-
40/2024 - 30/05/2024
Carrara, D.; Regazzoni, F.; Pagani, S.
Implicit neural field reconstruction on complex shapes from scattered and noisy data | Abstract | | Reconstructing distributed physical quantities from scattered sensor data is a challenging task due to geometric and measurement uncertainties. We propose a novel machine learning-based framework that can implicitly represent geometries solely from noisy surface points, by training a neural network model in a semi-supervised manner. We investigate different combinations of regularizing terms for the loss function, including a differential one based on the Eikonal equation, thus ensuring that the level-set function approximates a signed distance function without the need for preprocessing the data. This makes the method suitable for realistic, noise-corrupted, and sparse data. Furthermore, our approach leverages neural networks to predict distributed quantities defined on surfaces, while ensuring geometric compatibility with the underlying implicit geometry representation. It also allows for accurate reconstruction of derived quantities, such as surface gradients, by relying on automatic differentiation tools. Comprehensive tests on synthetic data validate the method's efficacy and demonstrate its potential for significant applications in healthcare. |
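A small numerical check of the idea behind the Eikonal regularization term: a signed distance function satisfies |∇f| = 1 everywhere away from the medial axis, so penalizing (|∇f| - 1)² pushes a learned level-set function toward a signed distance function. The sketch verifies this for the exact SDF of a circle via finite differences; the paper's framework uses automatic differentiation on a neural network instead.

```python
# Sketch: the Eikonal property |grad f| = 1 of a signed distance function,
# checked by central finite differences on the exact SDF of a circle.
import numpy as np

def sdf_circle(x, y, r=1.0):
    """Exact signed distance to a circle of radius r centered at the origin."""
    return np.hypot(x, y) - r

# Sample points bounded away from the center, where the SDF is non-smooth.
rng = np.random.default_rng(1)
pts = rng.uniform(0.3, 2.0, size=(100, 2)) * rng.choice([-1, 1], size=(100, 2))
eps = 1e-5
gx = (sdf_circle(pts[:, 0] + eps, pts[:, 1])
      - sdf_circle(pts[:, 0] - eps, pts[:, 1])) / (2 * eps)
gy = (sdf_circle(pts[:, 0], pts[:, 1] + eps)
      - sdf_circle(pts[:, 0], pts[:, 1] - eps)) / (2 * eps)
eikonal_residual = np.abs(np.hypot(gx, gy) - 1.0)   # ~0 for a true SDF
```

In training, this residual evaluated at random collocation points becomes one term of the loss, alongside the data-fidelity terms on the noisy surface points.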
-
39/2024 - 22/05/2024
Bartsch, J.; Buchwald, S.; Ciaramella, G.; Volkwein, S.
Reconstruction of unknown nonlinear operators in semilinear elliptic models using optimal inputs | Abstract | | Physical models often contain unknown functions and relations. The goal of our work is to answer the question of how one should excite or control a system under consideration in an appropriate way to be able to reconstruct an unknown nonlinear relation. To answer this question, we propose a greedy reconstruction algorithm within an offline-online strategy. We apply this strategy to a two-dimensional semilinear elliptic model. Our identification is based on the application of several space-dependent excitations (also called controls). These specific controls are designed by the algorithm in order to obtain a deeper insight into the underlying physical problem and a more precise reconstruction of the unknown relation. We perform numerical simulations that demonstrate the effectiveness of our approach, which is not limited to the current type of equation. Since our algorithm provides not only a way to determine unknown operators from existing data but also protocols for new experiments, it is a holistic concept to tackle the problem of improving physical models. |
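A toy analogue of the greedy flavor, under loud caveats: the paper's algorithm greedily designs *controls* for a PDE model, whereas the sketch below only does greedy basis selection (miniature orthogonal matching pursuit) for a 1D unknown relation f(u). The dictionary, the target function, and the selection rule are all invented for illustration.

```python
# Hedged toy, not the paper's algorithm: greedily add, from a dictionary of
# candidate basis functions, the one most correlated with the current
# residual, then re-fit on all selected functions (OMP-style).
import numpy as np

u = np.linspace(-1.0, 1.0, 50)
f_true = u**3 - 0.5 * u                      # unknown relation to recover
dictionary = [u**k for k in range(6)]        # candidate basis: monomials

selected, residual = [], f_true.copy()
for _ in range(3):
    scores = [abs(phi @ residual) / np.linalg.norm(phi) for phi in dictionary]
    k = int(np.argmax(scores))
    selected.append(k)
    # re-fit on all selected basis functions
    A = np.stack([dictionary[j] for j in selected], axis=1)
    coef, *_ = np.linalg.lstsq(A, f_true, rcond=None)
    residual = f_true - A @ coef
```

Three greedy passes recover exactly the odd monomials that span f_true, driving the residual to zero; the offline-online strategy in the paper plays a similar role at the level of experiment design.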
-
38/2024 - 20/05/2024
Tonini, A.; Regazzoni, F.; Salvador, M.; Dede', L.; Scrofani, R.; Fusini, L.; Cogliati, C.; Pontone, G.; Vergara, C.; Quarteroni, A.
Two new calibration techniques of lumped-parameter mathematical models for the cardiovascular system | Abstract | | Cardiocirculatory mathematical models serve as valuable tools for investigating physiological and pathological conditions of the circulatory system. To investigate the clinical condition of an individual, cardiocirculatory models need to be personalized by means of calibration methods. In this study we propose a new calibration method for a lumped-parameter cardiocirculatory model. This calibration method uses the correlation matrix between parameters and model outputs to calibrate the parameters according to the data. We test this calibration method, and its combination with L-BFGS-B (limited-memory Broyden-Fletcher-Goldfarb-Shanno with bound constraints), against the performance of L-BFGS-B alone. We show that the correlation-matrix calibration method and the combined one effectively reduce the loss function of the associated optimization problem. In the case of in silico generated data, we show that the two new calibration methods are robust with respect to the initial guess of the parameters and to the presence of noise in the data. Notably, the correlation-matrix calibration method achieves the best results in estimating the parameters in the case of noisy data, and it is faster than both the combined calibration method and L-BFGS-B. Finally, we present a real test case where the two new calibration methods yield results comparable to those obtained using L-BFGS-B in terms of minimizing the loss function and matching the clinical data. This highlights the effectiveness of the new calibration methods for clinical applications. |
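A sketch of the baseline step the abstract compares against: calibrating a lumped-parameter model with L-BFGS-B via `scipy.optimize.minimize`. The two-element Windkessel decay used here, and every parameter value, are stand-ins; the paper's cardiocirculatory model and the correlation-matrix method are considerably more elaborate.

```python
# Sketch of the L-BFGS-B baseline on an invented toy model: fit the
# resistance R and compliance C of a two-element Windkessel diastolic
# decay, p(t) = p0 * exp(-t / (R*C)), to synthetic "measured" pressure.
import numpy as np
from scipy.optimize import minimize

t = np.linspace(0.0, 1.0, 100)

def pressure(params):
    R, C = params
    return 100.0 * np.exp(-t / (R * C))

target = pressure([1.2, 1.5])               # synthetic data, R*C = 1.8

def loss(params):
    return np.sum((pressure(params) - target) ** 2)

res = minimize(loss, x0=[1.0, 1.0], method="L-BFGS-B",
               bounds=[(0.1, 5.0), (0.1, 5.0)])
```

This toy also illustrates why parameter-output correlation information matters: only the product R*C is identifiable from this output, so the optimizer can drive the loss to zero without recovering R and C individually.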
-
37/2024 - 30/04/2024
Begu, B.; Panzeri, S.; Arnone, E.; Carey, M.; Sangalli, L.M.
A nonparametric penalized likelihood approach to density estimation of space-time point patterns | Abstract | | In this work, we consider space-time point processes and study their continuous space-time evolution. We propose an innovative nonparametric methodology to estimate the unknown space-time density of the point pattern or, equivalently, the intensity of an inhomogeneous space-time Poisson point process. The presented approach combines maximum likelihood estimation with roughness penalties, based on differential operators, defined over the spatial and temporal domains of interest. We first establish some important theoretical properties of the considered estimator, including its consistency. We then develop an efficient and flexible estimation procedure that leverages advanced numerical and computational techniques. Thanks to a discretization based on finite elements in space and B-splines in time, the proposed method can effectively capture complex multi-modal and strongly anisotropic spatio-temporal point patterns; moreover, these point patterns may be observed over planar or curved domains with non-trivial geometries, due to geographic constraints such as coastal regions with complicated shorelines, or curved regions with complex orography. In addition to providing estimates, the method's functionalities also include appropriate uncertainty quantification tools. We thoroughly validate the proposed method by means of simulation studies and applications to real-world data. The obtained results highlight significant advantages over state-of-the-art competing approaches. |
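The roughness-penalty ingredient in isolation can be shown in a much simpler setting: a Whittaker-style smoother minimizing ||y - z||² + λ||D₂z||², with D₂ a discrete second-derivative operator. The paper embeds the same idea in a penalized *likelihood* for point-process intensities, discretized with finite elements in space and B-splines in time; the 1D least-squares version below, on invented data, only shows how the penalty trades fidelity for smoothness.

```python
# Sketch of a differential-operator roughness penalty (Whittaker smoother),
# a simplified cousin of the penalization used in the paper: solve
# (I + lam * D2^T D2) z = y for the smoothed signal z.
import numpy as np

def whittaker_smooth(y, lam):
    n = len(y)
    D2 = np.diff(np.eye(n), n=2, axis=0)    # (n-2) x n second differences
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 200)
truth = np.sin(2 * np.pi * x)
y = truth + 0.3 * rng.normal(size=200)      # noisy observations
z = whittaker_smooth(y, lam=100.0)          # penalized estimate
```

Larger λ yields a smoother, more biased estimate; the choice of λ plays the same role as the smoothing-parameter selection in the penalized likelihood setting.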
-
36/2024 - 29/04/2024
Torri, V.; Ercolanoni, M.; Bortolan, F.; Leoni, O.; Ieva, F.
A NLP-based semi-automatic identification system for delays in follow-up examinations: an Italian case study on clinical referrals | Abstract | | Background: This study aims to propose a semi-automatic method for monitoring the waiting times of follow-up examinations within the National Health System (NHS) in Italy, which is currently not possible due to the absence of the necessary structured information in the official databases.
Methods: A Natural Language Processing (NLP) based pipeline has been developed to extract the waiting time information from the text of referrals for follow-up examinations in the Lombardy Region. A manually annotated dataset of 10 000 referrals has been used to develop the pipeline and another manually annotated dataset of 10 000 referrals has been used to test its performance. Subsequently, the pipeline has been used to analyze all 12 million referrals prescribed in 2021 and performed by May 2022 in the Lombardy Region.
Results: The NLP-based pipeline exhibited high precision (0.999) and recall (0.973) in identifying waiting time information from referrals’ texts, with high accuracy in normalization (0.948-0.998). The overall reporting of timing indications in referrals’ texts for follow-up examinations was low (2%), showing notable variations across medical disciplines and types of prescribing physicians. Among the referrals reporting waiting times, 16% experienced delays (average delay = 19 days, standard deviation = 34 days), with significant differences observed across medical disciplines and geographical areas.
Conclusions: The use of NLP proved to be a valuable tool for assessing waiting times in follow-up examinations, which are particularly critical for the NHS due to the significant impact of chronic diseases, where follow-up exams are pivotal. Health authorities can exploit this tool to monitor the quality of NHS services and optimize resource allocation. |
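A hypothetical miniature of the extraction step: pull a waiting-time indication out of Italian referral text with a regular expression, then normalize it to days. The paper's pipeline is a full NLP system developed and evaluated on annotated referrals; the phrases and the pattern below are invented for illustration only.

```python
# Invented sketch, not the paper's pipeline: regex extraction of a waiting
# time ("entro N giorni/mesi" = "within N days/months") from referral text,
# normalized to days.
import re

PATTERN = re.compile(r"entro\s+(\d+)\s+(giorn\w+|mes\w+)", re.IGNORECASE)

def waiting_time_days(text):
    """Return the indicated waiting time in days, or None if absent."""
    m = PATTERN.search(text)
    if m is None:
        return None
    value, unit = int(m.group(1)), m.group(2).lower()
    return value * 30 if unit.startswith("mes") else value

print(waiting_time_days("Controllo cardiologico entro 30 giorni"))  # 30
print(waiting_time_days("Visita di follow-up entro 3 mesi"))        # 90
print(waiting_time_days("Visita dermatologica"))                    # None
```

The last call returning None mirrors the paper's central finding: most referrals carry no timing indication at all, which is why only about 2% of texts yield a waiting time.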