Bela Arwen


Master's Student


Bela is an excellent and motivated master's student who started working with us in June 2025 and is leading our research on a machine-learning technique known as transfer learning.

What is transfer learning? It is a procedure that lets us start from an emulator that is already well trained on the standard model and generalize it beyond the standard model without a full retraining. This generalization can be done with roughly 30% of the training data needed for a full retraining, saving us hundreds of thousands of CPU-hours. This is critical because, when doing research, we constantly make decisions that would otherwise require retraining the emulator from scratch.
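To make the idea concrete, here is a minimal sketch in PyTorch of what this looks like for an emulator. It shows only the simplest strategy: freeze the layers learned on the standard model and retrain the output head on a much smaller beyond-standard-model training set. The `Emulator` class, layer sizes, file names, and data sizes are illustrative placeholders, not our actual pipeline.

```python
import torch
import torch.nn as nn

class Emulator(nn.Module):
    """Toy MLP emulator mapping cosmological parameters to a spectrum."""
    def __init__(self, n_params=6, n_out=500, width=512):
        super().__init__()
        self.features = nn.Sequential(            # layers pre-trained on the standard model
            nn.Linear(n_params, width), nn.ReLU(),
            nn.Linear(width, width), nn.ReLU(),
            nn.Linear(width, width), nn.ReLU(),
        )
        self.head = nn.Linear(width, n_out)       # layer we retune on the new model

    def forward(self, x):
        return self.head(self.features(x))

def transfer_learn(model, x_new, y_new, epochs=200, lr=1e-4):
    """Fine-tune only the head on a small beyond-standard-model data set."""
    for p in model.features.parameters():         # keep the pre-trained knowledge fixed
        p.requires_grad = False
    opt = torch.optim.Adam(model.head.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x_new), y_new)
        loss.backward()
        opt.step()
    return model

# Illustrative usage with dummy data standing in for the ~30% training set:
model = Emulator()
# model.load_state_dict(torch.load("lcdm_emulator.pt"))  # hypothetical pre-trained weights
x_new = torch.randn(3000, 6)     # beyond-standard-model parameter samples
y_new = torch.randn(3000, 500)   # corresponding spectra
transfer_learn(model, x_new, y_new)
```

In practice there are refinements this sketch leaves out, for example adapting the input layer when the extended model adds new parameters (such as spatial curvature), but the core idea is the same: reuse what the network already learned and retrain only a small part of it on a small amount of new data.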

One example happened when we were studying axion dark energy (arxiv:2510.14957) in collaboration with Prof. Wayne Hu and the superb Ph.D. student Rayne Liu. Partway through, we found that our prior on the optical depth (the tau parameter) was too restrictive, because we wanted to investigate what would happen if the low-ell EE Planck likelihood were removed from the analysis. We also wanted to include a case with spatial curvature to better address the tension between the angular diameter distance at redshift 0.8 from DESI and the CMB angular diameter distance at the redshift of recombination. Both instances required expensive training that consumed a million CPU-hours.

With transfer learning, our initial tests indicate this can be accelerated by roughly an order of magnitude.
