
Scientific Machine Learning for Forward and Inverse Problems

Physics-Informed Neural Networks and Machine Learning Algorithms with Applications to Dynamical Systems

Time: Mon 2025-05-26 13.00

Location: Kollegiesalen, Brinellvägen 8, Stockholm

Video link: https://kth-se.zoom.us/j/66482272586

Language: English

Subject area: Computer Science

Doctoral student: Federica Bragone, Computational Science and Technology (CST)

Opponent: Adjunct Professor Marta D'Elia, Stanford University, Stanford CA, USA

Supervisor: Professor Stefano Markidis, Computational Science and Technology (CST); Doctor Kateryna Morozovska, Automatic Control; Tor Laneryd, Hitachi Energy, Västerås, Sweden; Michele Luvisotto, Hitachi Energy, Västerås, Sweden



Abstract

Scientific Machine Learning (SciML) is a promising field that combines data-driven models with physical laws and principles. A novel example is the application of Artificial Neural Networks (ANNs) to solve Ordinary Differential Equations (ODEs) and Partial Differential Equations (PDEs). One of the most recent approaches in this area is Physics-Informed Neural Networks (PINNs), which encode the governing physical equations directly into the neural network architecture. PINNs can solve both forward and inverse problems, learning the solution to differential equations and inferring unknown parameters or even functional forms. Therefore, they are particularly effective when partially known equations or incomplete models describe real-world systems. 
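For illustration only (this is not code from the thesis), a minimal sketch of how a PINN encodes a governing equation in its training loss, here for the 1D heat equation u_t = alpha * u_xx with an assumed diffusivity and an assumed initial condition:

```python
import torch
import torch.nn as nn

alpha = 0.1  # assumed thermal diffusivity (illustrative value, not from the thesis)

# Small fully connected network approximating u(t, x).
net = nn.Sequential(
    nn.Linear(2, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 1),
)

def pde_residual(t, x):
    """Residual of the heat equation u_t - alpha * u_xx at collocation points."""
    u = net(torch.cat([t, x], dim=1))
    u_t = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    u_x = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x, x, torch.ones_like(u_x), create_graph=True)[0]
    return u_t - alpha * u_xx

optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(1000):
    # Random collocation points in the space-time domain [0, 1] x [0, 1].
    t = torch.rand(256, 1, requires_grad=True)
    x = torch.rand(256, 1, requires_grad=True)
    # Initial condition u(0, x) = sin(pi * x) serves as the data term here.
    x0 = torch.rand(256, 1)
    u0 = torch.sin(torch.pi * x0)
    loss_data = ((net(torch.cat([torch.zeros_like(x0), x0], dim=1)) - u0) ** 2).mean()
    loss_pde = (pde_residual(t, x) ** 2).mean()
    loss = loss_data + loss_pde  # physics-informed composite loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In this sketch the forward problem is solved for a fixed alpha; making alpha a learnable parameter (e.g. a torch.nn.Parameter) turns the same setup into an inverse problem in which the diffusivity is inferred from data.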

Differential equations provide a mathematical formulation of various fundamental physical laws. ODEs and PDEs are used to model the behavior of complex and dynamical systems in many fields of science. However, many real-world problems are either too complex to solve exactly or involve equations that are not fully known. In these cases, we rely on numerical methods to approximate solutions. While these methods can be very accurate, they are often computationally expensive, especially for large, nonlinear, or high-dimensional problems. Exploring alternative approaches such as SciML to find more efficient and scalable solutions is therefore essential.

This thesis presents a series of applications of SciML methods to identifying and solving real-world systems. First, we demonstrate how PINNs combined with symbolic regression can recover governing equations from sparse observational data, focusing on cellulose degradation within power transformers. PINNs are then applied to solve forward problems, specifically the 1D and 2D heat diffusion equations, which model the thermal distribution in transformers. We also develop a PINN-based approach for optimal sensor placement that improves data collection efficiency. A third case study examines how dimensionality reduction techniques, such as Principal Component Analysis (PCA), can be applied to explain and visualize high-dimensional data, where each observation comprises a large number of variables describing a physical system. Using datasets on Cellulose Nanofibrils (CNFs) of various materials and concentrations, Machine Learning (ML) techniques are employed to characterize and interpret the system behavior.
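As a generic illustration of the dimensionality-reduction step described above (using synthetic data, not the CNF datasets from the thesis), a short PCA sketch:

```python
# Project high-dimensional observations onto the first two principal components.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))   # 200 observations, 50 variables each (synthetic)
X[:, 0] += 3.0 * X[:, 1]         # introduce correlation so PCA has structure to find

X_scaled = StandardScaler().fit_transform(X)  # standardize variables before PCA
pca = PCA(n_components=2)
scores = pca.fit_transform(X_scaled)

print("Explained variance ratio:", pca.explained_variance_ratio_)
print("Projected shape:", scores.shape)  # (200, 2): each observation reduced to 2 components
```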

The second part of this thesis focuses on improving the scalability and robustness of PINNs. We propose a pretraining strategy that optimizes the initial weights, reducing the variability introduced by random initialization and thereby addressing the training instability and high computational cost that arise when solving multi-dimensional or parametric PDEs. Moreover, we introduce an extension of PINNs, referred to as $PINN, which incorporates Bayesian probability within a domain decomposition framework. This formulation enhances performance, particularly in handling noisy data and multi-scale problems.
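The weight-transfer idea behind such a pretraining strategy can be sketched schematically as below; the helper make_pinn and the two-phase workflow are illustrative assumptions, not the thesis's actual implementation:

```python
# Schematic sketch of weight-transfer pretraining for PINNs (assumed workflow):
# train on a cheap instance of the problem, then reuse the optimized weights as the
# initialization for a harder instance of the same PDE family instead of random weights.
import copy
import torch
import torch.nn as nn

def make_pinn(in_dim: int, width: int = 32) -> nn.Sequential:
    """Small fully connected network mapping (t, x) to the PDE solution u."""
    return nn.Sequential(
        nn.Linear(in_dim, width), nn.Tanh(),
        nn.Linear(width, width), nn.Tanh(),
        nn.Linear(width, 1),
    )

# 1) Pretraining phase: optimize a network on an easy/cheap configuration.
pretrained = make_pinn(in_dim=2)
# ... training loop on the cheap instance would go here ...

# 2) Reuse phase: start the expensive problem from the pretrained weights, which is
#    intended to reduce run-to-run variability compared with random initialization.
finetuned = copy.deepcopy(pretrained)
optimizer = torch.optim.Adam(finetuned.parameters(), lr=1e-4)
# ... training loop on the harder instance would go here ...
```

The training loops are omitted; the point of the sketch is only that the second optimization starts from the pretrained weights rather than from a random initialization.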

urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-363009