Optimizing inference proxy for LLMs
Python · 10,000 stars · Updated Jun 17, 2025
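
The listing doesn't show this project's internals, but as a rough sketch of the pattern: an optimizing inference proxy sits between clients and a model server and can, for example, collapse identical requests into one upstream call. Everything below (the upstream URL, the port, the caching policy) is an illustrative assumption, not this project's code:

```python
# Minimal sketch of a caching inference proxy in front of an
# OpenAI-compatible /v1/chat/completions endpoint. The upstream URL,
# port, and cache policy are illustrative assumptions, not the
# project's actual implementation.
import hashlib
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

UPSTREAM_URL = "http://localhost:8000/v1/chat/completions"  # assumed upstream
_cache: dict[str, bytes] = {}  # naive in-memory response cache

class ProxyHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        key = hashlib.sha256(body).hexdigest()  # identical bodies share one upstream call
        if key not in _cache:
            req = urllib.request.Request(
                UPSTREAM_URL,
                data=body,
                headers={"Content-Type": "application/json"},
            )
            with urllib.request.urlopen(req) as resp:
                _cache[key] = resp.read()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(_cache[key])

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), ProxyHandler).serve_forever()
```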
A comprehensive toolkit for detecting potential hallucinations in LLM responses. Compatible with any LLM API (OpenAI, Anthropic, local models, etc.).
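
As one illustration of how such a detector can work (a common technique, not necessarily this toolkit's actual method), a self-consistency check samples the model several times and flags low agreement as a hallucination signal. The `ask` callable and the `hallucination_risk` function below are hypothetical:

```python
# Illustrative self-consistency check. The `ask` callable abstracts over
# any LLM API (OpenAI, Anthropic, a local server); the function name and
# threshold are hypothetical, not this toolkit's real interface.
from collections import Counter
from typing import Callable

def hallucination_risk(ask: Callable[[str], str], prompt: str,
                       samples: int = 5) -> float:
    """Score in [0, 1]: how often the model disagrees with its own modal answer."""
    answers = [ask(prompt).strip().lower() for _ in range(samples)]
    modal_count = Counter(answers).most_common(1)[0][1]
    return 1.0 - modal_count / samples

# Usage (hypothetical client): flag answers where the model is inconsistent.
# if hallucination_risk(lambda p: client.complete(p), question) > 0.5:
#     print("low self-consistency: treat the answer as suspect")
```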
An easy-to-use API for querying state-of-the-art LLM providers and comparing their responses.
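
A unified client for multiple providers usually reduces to a common call signature plus a dispatch table. The sketch below is an assumption about the shape of such an API, not this repository's interface:

```python
# Hypothetical shape of a unified multi-provider client: one prompt in,
# one answer per provider out. The provider names and callables in the
# usage comment are assumptions for illustration.
from typing import Callable, Dict

def compare(providers: Dict[str, Callable[[str], str]],
            prompt: str) -> Dict[str, str]:
    """Send the same prompt to every registered provider, side by side."""
    return {name: call(prompt) for name, call in providers.items()}

# providers = {"provider_a": ask_a, "provider_b": ask_b}  # hypothetical callables
# for name, answer in compare(providers, "Summarize RFC 2119.").items():
#     print(f"--- {name} ---\n{answer}")
```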
This repository contains a PDF Summary Web Application hosted on Streamlit Cloud.
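
A minimal Streamlit app of this kind typically uploads a PDF, extracts its text, and hands it to a summarizer. The sketch below uses `streamlit` and `pypdf`; the `summarize` stub stands in for whatever LLM call the real application makes and is an assumption, not the app's code:

```python
# Sketch of a PDF-summary Streamlit app using streamlit and pypdf.
# The summarize() stub is a placeholder assumption, not the real app.
import streamlit as st
from pypdf import PdfReader

def summarize(text: str) -> str:
    # Placeholder: the real app would send `text` to an LLM here.
    return text[:500] + ("..." if len(text) > 500 else "")

st.title("PDF Summary")
uploaded = st.file_uploader("Upload a PDF", type="pdf")
if uploaded is not None:
    reader = PdfReader(uploaded)
    text = "\n".join(page.extract_text() or "" for page in reader.pages)
    st.subheader("Summary")
    st.write(summarize(text))
```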