MoERL was sparked by the brilliant work of the Unsloth project, developed by Daniel Han-Chen & the Unsloth team 🦥✨
Their innovative approach to LLM optimization and commitment to open-source gave us a huge boost of inspiration 💡
While MoERL explores a different path—focusing on Mixture of Experts (MoE) and reinforcement learning—we share the same passion for efficient, accessible, and extensible LLMs 🎯🔧
- Repository: [🦥 Unsloth GitHub](https://github.com/unslothai/unsloth)
- Creator: Daniel Han-Chen & Unsloth Team
- License: Apache License, Version 2.0
We’re deeply thankful for the open-source ecosystem that empowers us to learn, remix, and build better tools—together 💞🚀
MoERL stands on the shoulders of open-source giants 🦾. We extend our heartfelt thanks to the incredible communities and tools that have laid the foundation for our work in the field of large language models and reinforcement learning:
- 🤗 Transformers: For providing a powerful and flexible ecosystem for working with state-of-the-art NLP models.
- BitsAndBytes: For enabling memory-efficient quantization and optimization.
- 🤗 PEFT (Parameter-Efficient Fine-Tuning): For making parameter-efficient techniques easy to integrate.
- 🤗 TRL (Transformer Reinforcement Learning): For providing reinforcement learning tools tailored to language models.
- vLLM: For enabling fast and efficient LLM inference.
Your amazing work helps the entire ecosystem move further and faster, and makes building with LLMs more fun 💪🌍
We’re excited to contribute back and co-create the future of LLMs together!
MoERL is licensed under the Apache License, Version 2.0. However, this project depends on the following third-party libraries that are licensed under the GNU Lesser General Public License (LGPL) Version 3.0:
- Library Name: `unsloth-zoo`
- License: LGPL-3.0-or-later
- Source Repository: https://github.com/unslothai/unsloth-zoo
- Usage in MoERL: Dynamically imported at runtime (no source code copied).
MoERL uses this library only through dynamic runtime imports and does not contain any part of the LGPLv3-licensed code. According to the terms of LGPLv3, this form of use does not constitute a derivative work, and therefore does not require MoERL itself to adopt the LGPLv3 license.
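For illustration, the dynamic-import pattern described above can be sketched as follows. This is a minimal example of optional runtime importing in Python, not the actual MoERL loading code; the helper name `load_optional` is hypothetical.

```python
import importlib


def load_optional(module_name):
    """Try to import an optional dependency at runtime.

    Returns the module object, or None if it is not installed.
    No source code from the dependency is copied; only a standard
    import is attempted when the feature is actually needed.
    """
    try:
        return importlib.import_module(module_name)
    except ImportError:
        return None


# Hypothetical usage: look up unsloth_zoo only when its features are requested.
zoo = load_optional("unsloth_zoo")
if zoo is None:
    print("unsloth_zoo not installed; related features are disabled")
```

Because the dependency is resolved at runtime like any other installed package, users who never invoke the LGPL-backed features never need it installed at all.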
So MoERL happily stays Apache License 2.0 while still interoperating with awesome LGPL tools 🧡
Users of this software are hereby informed of the existence of these LGPLv3 components.
If you want to inspect or use `unsloth-zoo`, you can find its source code at the link above 🔗
For questions, licensing love letters 💌, or clarifications, feel free to contact the MoERL maintainers 😄