Python implementations of contextual bandits algorithms
[IJAIT 2021] MABWiser: Contextual Multi-Armed Bandits Library
running bandit algorithms on piecewise stationary bandit instances
🔬 Research framework for single- and multi-player 🎰 Multi-Armed Bandit (MAB) algorithms, implementing all the state-of-the-art algorithms for single-player (UCB, KL-UCB, Thompson...) and multi-play…
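The libraries listed above all implement variants of the algorithms named here (UCB, Thompson sampling, and so on). As a minimal, self-contained sketch of one of them, the classic UCB1 rule picks the arm maximizing mean reward plus an exploration bonus; the arm probabilities and round count below are illustrative, not taken from any of these libraries:

```python
import math
import random

def ucb1(pulls, rewards, t):
    """Select an arm by the UCB1 rule: empirical mean + exploration bonus."""
    for arm in range(len(pulls)):
        if pulls[arm] == 0:  # play every arm once before applying the rule
            return arm
    return max(
        range(len(pulls)),
        key=lambda a: rewards[a] / pulls[a] + math.sqrt(2 * math.log(t) / pulls[a]),
    )

# Simulate Bernoulli arms with hidden success probabilities (illustrative values).
probs = [0.2, 0.5, 0.8]
pulls = [0] * len(probs)
rewards = [0.0] * len(probs)
random.seed(0)
for t in range(1, 2001):
    arm = ucb1(pulls, rewards, t)
    reward = 1.0 if random.random() < probs[arm] else 0.0
    pulls[arm] += 1
    rewards[arm] += reward
```

After enough rounds, the pull counts concentrate on the best arm while the bonus term keeps occasionally revisiting the others.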