Internship and thesis proposals
Generative models by Hamiltonian and non-reversible stochastic flows

Domains
Statistical physics

Type of internship
Theoretical, numerical
Description
Generative modeling aims to capture and sample complex high-dimensional data distributions, and plays a central role in machine learning, Bayesian inference and computational physics. Among generative methods, Normalizing Flows (NF) have gained prominence: a neural network is trained to map a simple prior distribution onto the desired target distribution through a sequence of invertible transformations. However, traditional NF approaches often face significant computational bottlenecks, particularly in high dimensions, where computing Jacobian determinants can be costly. Normalizing Hamiltonian Flows (NHFs) have emerged as a promising alternative: they rely on symplectic integration, so the transformations are volume-preserving by construction, which removes the Jacobian-determinant cost and allows flexible neural network architectures.

The internship aims both at further improving the flexibility of these methods and at investigating an alternative based on non-reversible flow implementations. The position is open to Master's students, including first-year students (Master 1), who are eager to explore advanced topics in generative modeling. For Master 2 students, the internship will lay the foundation for a PhD project focused on the development and analysis of Hamiltonian and stochastic flow-based generative models through quantitative analytical and numerical approaches.
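The key property exploited by NHFs can be illustrated with a minimal sketch (not the proposal's implementation): a leapfrog (symplectic) integrator defines a map on phase space (q, p) whose Jacobian determinant is exactly 1, so no log-determinant term is needed in the flow's likelihood. Here the gradient of the potential, `grad_U`, is a hypothetical stand-in for what would be a learned network; the toy quadratic potential is only for demonstration.

```python
import numpy as np

def leapfrog(q, p, grad_U, step=0.1, n_steps=10):
    """Symplectic leapfrog trajectory: a volume-preserving map (q, p) -> (q', p')."""
    q, p = q.copy(), p.copy()
    p -= 0.5 * step * grad_U(q)       # initial half kick
    for _ in range(n_steps - 1):
        q += step * p                 # drift (unit mass)
        p -= step * grad_U(q)         # full kick
    q += step * p                     # final drift
    p -= 0.5 * step * grad_U(q)       # final half kick
    return q, p

# Toy potential standing in for a learned network: U(q) = 0.5 * |q|^2
grad_U = lambda q: q

def flow(x):
    """Flatten (q, p) into one vector so we can differentiate the full map."""
    d = x.size // 2
    q, p = leapfrog(x[:d], x[d:], grad_U)
    return np.concatenate([q, p])

# Check volume preservation numerically: |det J| of the flow map should be 1.
x0 = np.array([0.3, -0.7, 1.1, 0.2])
eps = 1e-6
J = np.column_stack([(flow(x0 + eps * e) - flow(x0 - eps * e)) / (2 * eps)
                     for e in np.eye(x0.size)])
print(abs(np.linalg.det(J)))  # ≈ 1: the log-det Jacobian term vanishes
```

This is exactly the computational advantage over generic NFs mentioned above: the change-of-variables formula simplifies because the symplectic map contributes no volume change, whatever the (learned) potential is.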

Contact
Manon Michel
Laboratory: LMBP - UMR 6620
Team: LMBP