LtU-ILI


The Learning the Universe Implicit Likelihood Inference (LtU-ILI) pipeline is a framework for performing robust, ML-enabled statistical inference for astronomical applications. The pipeline supports and implements the various methodologies of implicit likelihood inference (also called simulation-based inference or likelihood-free inference), i.e. the practice of learning to represent Bayesian posteriors using neural networks trained on simulations (see this paper for a review).

The major design principles of LtU-ILI are accessibility, modularity, and generalizability. For any training set of data-parameter pairs (including those with image- or graph-like inputs), one can use state-of-the-art neural density estimation methods to construct tight, well-calibrated Bayesian posteriors on unobserved parameters. The pipeline is quick and easy to set up; here's an example of training a Masked Autoregressive Flow (MAF) network to predict a univariate posterior:

import ili.data, ili.inference                  # LtU-ILI data loaders and training runners
import sbi.utils, sbi.inference                 # sbi backend: priors, density estimators, inference methods

X, Y = load_data()                              # Load training data and parameters
loader = ili.data.NumpyLoader(X, Y)             # Create a data loader

trainer = ili.inference.SBIRunner(
  prior = sbi.utils.BoxUniform(low=-1, high=1), # Define a prior
  inference_class = sbi.inference.SNPE,         # Choose an inference method
  nets = [sbi.utils.posterior_nn(model='maf')]  # Define a neural network architecture
)

posterior, _ = trainer(loader)                  # Run training to map data -> parameters

samples = posterior.sample((1000,), x=X[0])     # Generate 1000 posterior samples conditioned on X[0]
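
The trained posterior object follows the standard interface of the chosen backend (here, sbi), so you can also evaluate the learned posterior density directly. A minimal sketch, assuming the sbi backend and that X[0] is the conditioning data vector:

log_prob = posterior.log_prob(samples, x=X[0])  # Learned posterior log-density at the drawn samples
print(samples.mean(0), samples.std(0))          # Quick summary of the univariate posterior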

Beyond this simple example, LtU-ILI comes with a wide range of customizable complexity, including:

  • Posterior-, Likelihood-, and Ratio-Estimation methods for ILI
  • Diversity of neural density estimators (Mixture Density Networks, ResNet-like ratio classifiers, various Conditional Normalizing Flows)
  • Fully-customizable information embedding networks (see the sketch after this list)
  • A unified interface for multiple ILI backends (sbi, pydelfi)
  • Various marginal and multivariate posterior coverage metrics
  • Jupyter and command line interfaces
  • A parallelizable configuration framework for efficient hyperparameter tuning and production runs
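
As an illustration of the embedding-network hook referenced above, the sketch below compresses each input vector to a small set of learned summaries before the MAF density estimator. This is a minimal sketch under the same setup as the earlier example; the fully-connected compressor and the choice of 16 summaries are illustrative assumptions, not part of LtU-ILI itself.

import torch.nn as nn

embedding = nn.Sequential(                        # Illustrative compressor: maps each input to 16 learned summaries
    nn.Flatten(),
    nn.Linear(X.shape[-1], 32), nn.ReLU(),
    nn.Linear(32, 16),
)

trainer = ili.inference.SBIRunner(
  prior = sbi.utils.BoxUniform(low=-1, high=1),   # Same prior as above
  inference_class = sbi.inference.SNPE,           # Posterior estimation (swap for likelihood- or ratio-estimation as needed)
  nets = [sbi.utils.posterior_nn(                 # MAF conditioned on the embedded summaries
    model='maf', embedding_net=embedding)]
)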

For more details on the motivation, design, and theoretical background of this project, see the software release paper (arxiv:XXXXX).

Getting Started

To install LtU-ILI, follow the instructions in INSTALL.md.

To get started, try out the tutorial for the Jupyter notebook interface in notebooks/tutorial.ipynb or the command line interface in examples/.

API Documentation

The documentation for this project can be found at this link.

References

We keep an updated repository of relevant interesting papers and resources at this link.

Contributing

Before contributing, please familiarize yourself with the contribution workflow described in CONTRIBUTING.md.

Contact

If you have comments, questions, or feedback, please open an issue on the repository. The current leads of the Learning the Universe ILI working group are Benjamin Wandelt (benwandelt@gmail.com) and Matthew Ho (matthew.annam.ho@gmail.com).

Contributors

  • Matt Ho: 💻 🎨 💡 📖 👀 🚇 🖋 🔬
  • Carolina Cuesta: 💻 🎨 💡 📖 👀 🔬
  • Deaglan Bartlett: 💻 🎨 💡 📖 👀 🚇 🖋 🔬
  • Nicolas Chartier: 💡 📖 🔬 💻 🎨 👀 🖋
  • Simon: 💻 💡
  • Pablo Lemos: 🎨 💻
  • Chirag Modi: 🎨 💻
  • Axel Lapel: 💻 🔬 💡
  • Chris Lovell: 🔬 💡 🔣 🖋
  • Shivam Pandey: 🔬 💡
  • L.A. Perez: 🔬 🖋
  • T. Lucas Makinen: 💻 🔬

Acknowledgements

This work is supported by the Simons Foundation through the Simons Collaboration on Learning the Universe.
