Teaching

Cours Fil Noir Fleurance 2023

(33rd Festival d’Astronomie de Fleurance, 7 August 2023)

Last update: 01-08-2023

  • Lecture (07-08-2023): Probability theory: the logic of scientific discovery slides
  • GitHub repository containing the Jupyter notebooks.


STFC Summer School on Data Intensive Science 2021

(Durham University, 13-17 September 2021)

Last update: 16-09-2021

  • Lecture (16-09-2021): Bayesian statistics, and some other aspects of probability theory slides
  • GitHub repository containing the Jupyter notebooks.


ICIC Data Analysis Workshop 2021

(Imperial College, 14-17 September 2021)

Last update: 16-09-2021

  • Homepage
  • Supernova cosmology: data and simulations (pre-workshop exercise) notebook; inference with MCMC and HMC notebook (a minimal HMC sketch follows this list)
  • The 1919 Eclipse: parameter inference and model comparison notebook
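
For orientation, a minimal Hamiltonian Monte Carlo sketch; it is illustrative only, not taken from the workshop notebooks, and the one-dimensional standard-normal target, step size and trajectory length are all assumptions.

    import numpy as np

    rng = np.random.default_rng(3)

    def log_p(x):
        # Log-density of the assumed 1D standard-normal target (up to a constant).
        return -0.5 * x**2

    def grad_log_p(x):
        return -x

    def hmc_step(x, step=0.2, n_leapfrog=20):
        # One HMC transition: refresh the momentum, integrate the Hamiltonian
        # dynamics with the leapfrog scheme, then Metropolis accept/reject.
        p = rng.normal()
        x_new = x
        p_new = p + 0.5 * step * grad_log_p(x)      # initial half kick
        for i in range(n_leapfrog):
            x_new += step * p_new                   # drift
            if i < n_leapfrog - 1:
                p_new += step * grad_log_p(x_new)   # full kick
        p_new += 0.5 * step * grad_log_p(x_new)     # final half kick
        log_accept = (log_p(x_new) - 0.5 * p_new**2) - (log_p(x) - 0.5 * p**2)
        return x_new if np.log(rng.uniform()) < log_accept else x

    x, chain = 0.0, []
    for _ in range(5000):
        x = hmc_step(x)
        chain.append(x)
    print(np.mean(chain), np.std(chain))  # should approach 0 and 1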


Bayesian statistics and information theory

(Imperial College, 2019)

Last update: 28-05-2019

Resources

Programme

Lecture 1: Tuesday 14 May 2019

  • Aspects of probability theory slides
    ... a.k.a. why am I not allowed to "change the prior" or to "cut the data"?
    • Probability theory and Bayesian statistics: reminders
    • Ignorance priors notes, notebook and the maximum entropy principle notebook
    • Gaussian random fields (and a digression on non-Gaussianity) notes, notebook
    • Bayesian signal processing and reconstruction: de-noising notebook 1, notebook 2, de-blending notebook (a minimal de-noising sketch follows this list)
    • Bayesian decision theory notes, notebook and Bayesian experimental design
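
A taste of the de-noising material: a toy Wiener-filter sketch, an assumption-laden example rather than one of the linked notebooks. For data d = s + n with independent Gaussian signal s ~ N(0, S) and noise n ~ N(0, N), the posterior mean is s_hat = S/(S + N) d.

    import numpy as np

    # Toy Bayesian de-noising with known variances: the posterior mean of the
    # signal given d = s + n is the Wiener filter s_hat = S / (S + N) * d.
    rng = np.random.default_rng(1)
    S, N = 1.0, 0.25                             # assumed signal and noise variances
    s = rng.normal(0.0, np.sqrt(S), 1000)        # true signal
    d = s + rng.normal(0.0, np.sqrt(N), 1000)    # noisy data
    s_hat = S / (S + N) * d                      # posterior mean estimate
    print(np.var(s - s_hat))                     # ~ S*N/(S+N) = 0.2 (posterior variance)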

Lecture 2: Tuesday 21 May 2019

  • Aspects of probability theory slides
    ... a.k.a. why am I not allowed to "change the prior" or to "cut the data"?
    • Bayesian networks, Bayesian hierarchical models and Empirical Bayes

  • Probabilistic computations slides
    ... a.k.a. how much do I know about the likelihood?
    • Which inference method to choose?
    • Monte Carlo integration, importance sampling, rejection sampling notebook
    • Markov Chain Monte Carlo: Metropolis-Hastings algorithm & Gelman-Rubin test notebook (a minimal Metropolis-Hastings sketch follows this list)
    • The "test pdf" notes
    • Slice sampling notebook, Gibbs sampling notebook
    • Hamiltonian sampling notebook
    • Approximate Bayesian Computation: Likelihood-free rejection sampling notebook
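
A minimal Metropolis-Hastings sketch for orientation; the target and proposal scale are assumptions, and the linked notebook is more complete.

    import numpy as np

    rng = np.random.default_rng(42)

    def log_target(x):
        # Log-density of an assumed 1D standard-normal target (up to a constant).
        return -0.5 * x**2

    def metropolis_hastings(n_samples, x0=0.0, step=1.0):
        # Random-walk Metropolis: the Gaussian proposal is symmetric, so the
        # Hastings correction cancels and only the target ratio remains.
        samples = np.empty(n_samples)
        x, logp = x0, log_target(x0)
        for i in range(n_samples):
            x_prop = x + step * rng.normal()
            logp_prop = log_target(x_prop)
            if np.log(rng.uniform()) < logp_prop - logp:   # accept/reject
                x, logp = x_prop, logp_prop
            samples[i] = x
        return samples

    chain = metropolis_hastings(10_000)
    print(chain.mean(), chain.std())  # should approach 0 and 1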

Lecture 3: Tuesday 28 May 2019

  • Aspects of probability theory: Bayesian model comparison notes
    ... a.k.a. why am I not allowed to "change the prior" or to "cut the data"?
    • Nested models and the Savage-Dickey density ratio (stated after this list)
    • Bayesian model selection as a decision analysis
    • Bayesian model averaging
    • (Dangers of) model selection with insufficient summary statistics
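
For reference, the Savage-Dickey density ratio in its standard form; the prior-separability condition below is the usual assumption under which it holds.

    % For nested models M_0 : \theta = \theta_0 inside M_1, and priors obeying
    % p(\phi \mid \theta_0, M_1) = p(\phi \mid M_0), the Bayes factor reduces
    % to the marginal posterior-to-prior ratio at the nested value:
    B_{01} = \frac{p(d \mid M_0)}{p(d \mid M_1)}
           = \frac{p(\theta = \theta_0 \mid d, M_1)}{p(\theta = \theta_0 \mid M_1)}
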
  • Information theory slides
    ... a.k.a. how much is there to be learned in my data anyway?
    • The noisy binary symmetric channel notebook (a small simulation follows this list)
    • Low-density parity check codes
    • Measures of entropy and information
    • Information-theoretic experimental design
    • Supervised machine learning basics notebook
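
A small simulation of the noisy binary symmetric channel; the flip probability is an assumed value and the code is a sketch, not the linked notebook.

    import numpy as np

    rng = np.random.default_rng(0)

    def binary_entropy(f):
        # Binary entropy H_2(f) = -f log2(f) - (1 - f) log2(1 - f), in bits.
        return -f * np.log2(f) - (1 - f) * np.log2(1 - f)

    f = 0.1                                  # assumed bit-flip probability
    bits = rng.integers(0, 2, size=100_000)  # random source bits
    flips = rng.random(bits.size) < f        # channel flips each bit w.p. f
    received = bits ^ flips
    print("empirical flip rate:", np.mean(bits != received))   # ~ 0.1
    print("capacity C = 1 - H_2(f):", 1 - binary_entropy(f))   # ~ 0.531 bits/use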

Bibliography

  • E. T. Jaynes, Probability Theory: The Logic of Science, edited by G. L. Bretthorst (Cambridge University Press, 2003).
  • A. Gelman, J. B. Carlin, H. S. Stern, D. B. Dunson, A. Vehtari, D. B. Rubin, Bayesian Data Analysis, Third Edition (Taylor & Francis, 2013).
  • B. D. Wandelt, Astrostatistical Challenges for the New Astronomy (Springer, 2013) Chap. Gaussian Random Fields in Cosmostatistics, pp. 87–105.
  • R. M. Neal, Handbook of Markov Chain Monte Carlo (Chapman & Hall/CRC, 2011) Chap. MCMC Using Hamiltonian Dynamics, pp. 113–162.
  • D. J. C. MacKay, Information Theory, Inference, and Learning Algorithms (Cambridge University Press, 2003).
  • G. E. Crooks, On Measures of Entropy and Information (Tech Note, 2016).


Cosmology with Bayesian statistics and information theory

(ICG Portsmouth, 2017)

Last update: 10-03-2017

Resources

  • Preliminary reading: Chapter 3 (except Section 3.4) of my PhD thesis.
  • GitHub repository containing the Jupyter notebooks.
  • Programme

    Lecture 1: Monday 6 March 2017

    • Aspects of probability theory slides
      ... a.k.a. why am I not allowed to "change the prior" or to "cut the data"?
      • Ignorance priors and the maximum entropy principle notebook (a textbook example follows this list)
      • Bayesian signal processing and reconstruction: de-noising notebook 1, notebook 2, de-blending notebook
      • Bayesian decision theory notebook
      • Hypothesis testing beyond the Bayes factor
      • Bayesian networks, Bayesian hierarchical models and Empirical Bayes method
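
    A textbook maximum-entropy result for orientation (a standard example, not taken from the course notes):

        % Among pdfs p(x) on x > 0 with fixed mean \int_0^\infty x\, p(x)\, dx = \mu,
        % the differential entropy H[p] = -\int p(x) \ln p(x)\, dx is maximized
        % by the exponential distribution:
        p(x) = \frac{1}{\mu}\, e^{-x/\mu}, \qquad x > 0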

    Lecture 2: Wednesday 8 March 2017

    • Probabilistic computations slides
      ... a.k.a. how much do I know about the likelihood?
      • Which inference method to choose?
      • Monte Carlo integration, importance sampling, rejection sampling notebook
      • Markov Chain Monte Carlo: Metropolis-Hastings algorithm & Gelman-Rubin test notebook
      • Slice sampling notebook, Gibbs sampling notebook
      • Hamiltonian sampling notebook
      • Likelihood-free methods and Approximate Bayesian Computation notebook (a minimal rejection-ABC sketch follows this list)
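
    A minimal likelihood-free rejection-ABC sketch; the model, prior, summary statistic and tolerance below are all illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(7)

        n = 100          # assumed sample size behind the observed summary
        d_obs = 0.3      # assumed observed summary (sample mean)
        epsilon = 0.05   # ABC tolerance

        # Rejection ABC: draw from the prior, simulate data, keep the draw
        # whenever the simulated summary lands within epsilon of the observation.
        accepted = []
        while len(accepted) < 1000:
            mu = rng.uniform(-2.0, 2.0)                  # prior draw
            x_bar = rng.normal(mu, 1.0, size=n).mean()   # forward simulation + summary
            if abs(x_bar - d_obs) < epsilon:
                accepted.append(mu)

        print("approximate posterior mean:", np.mean(accepted))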

    Lecture 3: Friday 10 March 2017

    • Information theory slides
      ... a.k.a. how much is there to be learned in my data anyway?
      • The noisy binary symmetric channel notebook
      • Low-density parity check codes
      • Measures of entropy and information (a small numerical example follows this list)
      • Information-theoretic experimental design
      • Machine learning basics notebook
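
    A small numerical illustration of entropy and relative entropy; the two discrete distributions are arbitrary examples.

        import numpy as np

        # Shannon entropy and Kullback-Leibler divergence, in bits, for two
        # assumed discrete distributions on three outcomes.
        p = np.array([0.5, 0.25, 0.25])
        q = np.array([1/3, 1/3, 1/3])
        H = -np.sum(p * np.log2(p))      # H(p) = 1.5 bits
        D = np.sum(p * np.log2(p / q))   # D_KL(p||q) >= 0, zero iff p == q
        print(f"H(p) = {H:.3f} bits, D_KL(p||q) = {D:.4f} bits")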

    Bibliography

    • E. T. Jaynes, Probability Theory: The Logic of Science, edited by G. L. Bretthorst (Cambridge University Press, 2003).
    • A. Gelman, J. B. Carlin, H. S. Stern, D. B. Dunson, A. Vehtari, D. B. Rubin, Bayesian Data Analysis, Third Edition (Taylor & Francis, 2013).
    • B. D. Wandelt, Astrostatistical Challenges for the New Astronomy (Springer, 2013) Chap. Gaussian Random Fields in Cosmostatistics, pp. 87–105.
    • R. M. Neal, Handbook of Markov Chain Monte Carlo (Chapman & Hall/CRC, 2011) Chap. MCMC Using Hamiltonian Dynamics, pp. 113–162.
    • D. J. C. MacKay, Information Theory, Inference, and Learning Algorithms (Cambridge University Press, 2003).
    • G. E. Crooks, On Measures of Entropy and Information (Tech Note, 2016).