My research focuses on statistical learning theory and machine learning. In particular, I work on generalization bounds (especially of the information-theoretic variety), their application to neural networks and meta learning, as well as online learning and active learning.

I am currently a postdoc at the University College London Centre for Artificial Intelligence, advised by Benjamin Guedj and supported by WASP. Prior to this, I completed my PhD at Chalmers University of Technology as part of the WASP Graduate School AI track, supervised by Giuseppe Durisi, and studied Physics at the University of Gothenburg. Feel free to reach out if you want to chat about any of this!

A more detailed description of INNER: Information Theory of Deep Neural Networks, the WASP project that I participated in during my PhD, is available here.


Preprints
  • Generalization Bounds: Perspectives from Information Theory and PAC-Bayes
    Fredrik Hellström, Giuseppe Durisi, Benjamin Guedj, Maxim Raginsky. Preprint, 2023. [arXiv]

Conference and journal papers

  • Generalization and Informativeness of Conformal Prediction
    Matteo Zecchin, Sangwoo Park, Osvaldo Simeone, Fredrik Hellström. ISIT 2024. [arXiv]
  • Comparing Comparators in Generalization Bounds
    Fredrik Hellström, Benjamin Guedj. AISTATS 2024. [arXiv], [AISTATS]
  • Adaptive Selective Sampling for Online Prediction with Experts
    Rui M. Castro, Fredrik Hellström, Tim van Erven. NeurIPS 2023. [arXiv], [NeurIPS], [Video]
  • Evaluated CMI Bounds for Meta Learning: Tightness and Expressiveness
    Fredrik Hellström, Giuseppe Durisi. NeurIPS 2022. [arXiv], [NeurIPS], [Video]
  • New Family of Generalization Bounds Using Samplewise Evaluated CMI
    Fredrik Hellström, Giuseppe Durisi. NeurIPS 2022. [arXiv], [NeurIPS], [Video]
  • Fast-rate loss bounds via conditional information measures with applications to neural networks
    Fredrik Hellström, Giuseppe Durisi. ISIT 2021. [arXiv], [IEEE]
  • Generalization bounds via information density and conditional information density
    Fredrik Hellström, Giuseppe Durisi. JSAIT 2020. [arXiv], [IEEE]. Note: correction available.
  • Generalization error bounds via mth central moments of the information density
    Fredrik Hellström, Giuseppe Durisi. ISIT 2020. [arXiv], [IEEE]
  • New constraints on inelastic dark matter from IceCube
    Riccardo Catena, Fredrik Hellström. JCAP 2018. [arXiv], [IOP]


Thesis and workshop papers
  • Information-Theoretic Generalization Bounds: Tightness and Expressiveness
    Fredrik Hellström. PhD Thesis, 2023. [Available]
  • Data-Dependent PAC-Bayesian Bounds in the Random-Subset Setting with Applications to Neural Networks
    Fredrik Hellström, Giuseppe Durisi. Workshop on Information-Theoretic Methods for Rigorous, Responsible, and Reliable Machine Learning (ITR3) at ICML 2021. [Available]

Fredrik Hellström
University College London
London, United Kingdom

Google Scholar