The Learning the Universe Implicit Likelihood Inference (LtU-ILI) pipeline is a framework for performing robust, ML-enabled statistical inference for astronomical applications. The pipeline supports and implements the various methodologies of implicit likelihood inference (also called simulation-based inference or likelihood-free inference), i.e. the practice of learning to represent Bayesian posteriors using neural networks trained on simulations (see this paper for a review).
The major design principles of LtU-ILI are accessibility, modularity, and generalizability. For any training set of data-parameter pairs (including those with image- or graph-like inputs), one can use state-of-the-art methods to build neural networks that construct tight, well-calibrated Bayesian posteriors on unobserved parameters. The pipeline is quick and easy to set up; here's an example of training a Masked Autoregressive Flow (MAF) network to predict a univariate posterior:
```python
... # Imports
X, Y = load_data()                               # Load training data and parameters
loader = ili.data.NumpyLoader(X, Y)              # Create a data loader
trainer = ili.inference.SBIRunner(
    prior=sbi.utils.BoxUniform(low=-1, high=1),  # Define a prior
    inference_class=sbi.inference.SNPE,          # Choose an inference method
    nets=[sbi.utils.posterior_nn(model='maf')]   # Define a neural network architecture
)
posterior, _ = trainer(loader)                   # Run training to map data -> parameters
samples = posterior.sample((1000,), x=X[0])      # Draw 1000 samples from the posterior
```
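Once you have posterior samples, a common next step is to reduce them to a point estimate with asymmetric error bars. Here is a minimal, standard-library-only sketch; the `samples` list below is a hypothetical stand-in for the array returned by `posterior.sample(...)`:

```python
import random

random.seed(0)
# Hypothetical posterior draws standing in for posterior.sample((1000,), x=X[0])
samples = sorted(random.gauss(0.3, 0.1) for _ in range(1000))

# Median and a 68% credible interval from the 16th/84th percentiles
median = samples[len(samples) // 2]
lo = samples[int(0.16 * len(samples))]
hi = samples[int(0.84 * len(samples))]
print(f"theta = {median:.3f} (+{hi - median:.3f} / -{median - lo:.3f})")
```

For real runs, the same percentile logic applies directly to the sample array produced by the trained posterior.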
Beyond this simple example, LtU-ILI offers a wide range of customizable functionality, including:
- Posterior-, Likelihood-, and Ratio-Estimation methods for ILI
- A diverse set of neural density estimators (Mixture Density Networks, ResNet-like ratio classifiers, and various Conditional Normalizing Flows)
- Fully-customizable information embedding networks
- A unified interface for multiple ILI backends (sbi, pydelfi)
- Various marginal and multivariate posterior coverage metrics
- Jupyter and command line interfaces
- A parallelizable configuration framework for efficient hyperparameter tuning and production runs
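To illustrate the idea behind the posterior coverage metrics listed above: a well-calibrated posterior should assign the true parameter a rank that is uniformly distributed among its own samples (simulation-based calibration). Here is a toy, standard-library-only sketch of that check, using an analytic conjugate-Gaussian posterior as a stand-in for a trained LtU-ILI posterior; the function names here are illustrative, not part of the LtU-ILI API:

```python
import random
import statistics

random.seed(0)

def toy_posterior_samples(x_obs, n=100):
    # Conjugate-Gaussian stand-in for posterior.sample(...):
    # prior theta ~ N(0, 1), likelihood x | theta ~ N(theta, 0.5^2)
    # gives posterior theta | x ~ N(x / 1.25, sqrt(0.25 / 1.25)).
    mu = x_obs / 1.25
    sd = (0.25 / 1.25) ** 0.5
    return [random.gauss(mu, sd) for _ in range(n)]

ranks = []
for _ in range(500):
    theta_true = random.gauss(0.0, 1.0)            # draw from the prior
    x_obs = theta_true + random.gauss(0.0, 0.5)    # simulate mock data
    samples = toy_posterior_samples(x_obs)
    # Rank of the true parameter among posterior samples; calibration
    # implies these ranks are uniform on {0, ..., n}.
    ranks.append(sum(s < theta_true for s in samples))

print(statistics.mean(ranks))  # near n/2 = 50 if the posterior is calibrated
```

Overconfident posteriors pile ranks near 0 and n, while underconfident ones concentrate them in the middle; LtU-ILI's built-in metrics automate checks of this kind on real trained posteriors.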
For more details on the motivation, design, and theoretical background of this project, see the software release paper (arxiv:XXXXX).
To install LtU-ILI, follow the instructions in INSTALL.md.
To get started, try out the tutorial for the Jupyter notebook interface in notebooks/tutorial.ipynb or the command line interface in examples/.
The documentation for this project can be found at this link.
We keep an updated repository of relevant interesting papers and resources at this link.
Before contributing, please familiarize yourself with the contribution workflow described in CONTRIBUTING.md.
If you have comments, questions, or feedback, please open an issue. The current leads of the Learning the Universe ILI working group are Benjamin Wandelt (benwandelt@gmail.com) and Matthew Ho (matthew.annam.ho@gmail.com).
This work is supported by the Simons Foundation through the Simons Collaboration on Learning the Universe.