A lightweight Go client library for the Mistral AI API, providing easy access to Mistral's language models and AI capabilities.
- Support for chat completions
- Function calling capabilities
- Vision model support
- Classification/moderation endpoints
- Agent completions
```bash
go get github.com/coalaura/mistral
```
```go
package main

import (
	"fmt"

	"github.com/coalaura/mistral"
)

func main() {
	// Initialize client with your API key
	client := mistral.NewMistralClient("your-api-key")

	// Create a simple chat request
	request := mistral.ChatCompletionRequest{
		Model: mistral.ModelMistralSmall,
		Messages: []mistral.Message{
			{
				Role:    mistral.RoleUser,
				Content: "Hello, how are you?",
			},
		},
	}

	// Send the chat request
	response, err := client.Chat(request)
	if err != nil {
		panic(err)
	}

	fmt.Println(response.Choices[0].Message.Content)
}
```
See the Mistral Docs for a list of available models.
Chat completions: send chat messages and receive AI-generated responses, as in the quick-start example above.
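For a richer conversation, the `Messages` slice can carry a system prompt and prior turns. A minimal sketch, reusing the `client` from the quick-start and assuming the package exports a `RoleSystem` constant alongside `RoleUser` (check the exported identifiers for the exact name):

```go
// Sketch only: mistral.RoleSystem is an assumed constant alongside mistral.RoleUser.
request := mistral.ChatCompletionRequest{
	Model: mistral.ModelMistralSmall,
	Messages: []mistral.Message{
		{
			Role:    mistral.RoleSystem, // assumed constant
			Content: "You are a concise assistant that answers in one sentence.",
		},
		{
			Role:    mistral.RoleUser,
			Content: "What is a goroutine?",
		},
	},
}

response, err := client.Chat(request)
if err != nil {
	panic(err)
}

fmt.Println(response.Choices[0].Message.Content)
```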
Classification: use Mistral's moderation endpoint to classify content.
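A minimal sketch of content classification. The method and type names below (`Moderate`, `ClassificationRequest`, `Results`) are assumptions modeled on the shape of Mistral's REST moderation endpoint, as is the `mistral-moderation-latest` model string; check the package's exported API for the actual names.

```go
package main

import (
	"fmt"

	"github.com/coalaura/mistral"
)

func main() {
	client := mistral.NewMistralClient("your-api-key")

	// Sketch only: request/response shapes are assumed from the REST moderation endpoint.
	request := mistral.ClassificationRequest{ // assumed type
		Model: "mistral-moderation-latest", // moderation model name from the REST API
		Input: []string{"Some user-generated text to check."},
	}

	response, err := client.Moderate(request) // assumed method
	if err != nil {
		panic(err)
	}

	// The REST response returns per-category flags/scores for each input;
	// the Go field names here are assumptions.
	for _, result := range response.Results {
		fmt.Printf("%+v\n", result)
	}
}
```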
Function calling: define custom tools and parameters for the model to call.
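A minimal sketch of function calling. The tool-related type and field names (`Tool`, `Function`, `Tools`, `ToolCalls`, and a JSON-schema `Parameters` map) are assumptions based on the shape of Mistral's REST chat endpoint, not confirmed names from this package; treat them as placeholders and check the exported types.

```go
package main

import (
	"fmt"

	"github.com/coalaura/mistral"
)

func main() {
	client := mistral.NewMistralClient("your-api-key")

	// Sketch only: tool struct names are assumed, modeled on the REST API's
	// tools: [{type: "function", function: {name, description, parameters}}] shape.
	weatherTool := mistral.Tool{ // assumed type
		Type: "function",
		Function: mistral.Function{ // assumed type
			Name:        "get_weather",
			Description: "Get the current weather for a city",
			Parameters: map[string]interface{}{ // assumed field (JSON schema)
				"type": "object",
				"properties": map[string]interface{}{
					"city": map[string]interface{}{"type": "string"},
				},
				"required": []string{"city"},
			},
		},
	}

	request := mistral.ChatCompletionRequest{
		Model: mistral.ModelMistralSmall,
		Messages: []mistral.Message{
			{Role: mistral.RoleUser, Content: "What's the weather in Paris?"},
		},
		Tools: []mistral.Tool{weatherTool}, // assumed field
	}

	response, err := client.Chat(request)
	if err != nil {
		panic(err)
	}

	// If the model decided to call the tool, the call (name + JSON arguments)
	// comes back on the message; field names here are assumptions.
	for _, call := range response.Choices[0].Message.ToolCalls {
		fmt.Println(call.Function.Name, call.Function.Arguments)
	}
}
```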
Agent completions: interact with Mistral's agent-based completions endpoint.
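A minimal sketch of an agent completion. The `Agent` method and the `AgentCompletionRequest`/`AgentID` names are assumptions mirroring the REST agents completions endpoint (which takes an agent ID plus a message history instead of a model name); replace them with the package's actual exported names and use the ID of an agent you have configured.

```go
package main

import (
	"fmt"

	"github.com/coalaura/mistral"
)

func main() {
	client := mistral.NewMistralClient("your-api-key")

	// Sketch only: type, field, and method names here are assumptions.
	request := mistral.AgentCompletionRequest{ // assumed type
		AgentID: "your-agent-id", // ID of an agent configured in Mistral
		Messages: []mistral.Message{
			{Role: mistral.RoleUser, Content: "Summarize our open tickets."},
		},
	}

	response, err := client.Agent(request) // assumed method
	if err != nil {
		panic(err)
	}

	fmt.Println(response.Choices[0].Message.Content)
}
```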
Contributions are welcome; however, this is a side project, and I may not be able to respond to issues or pull requests in a timely manner.
This project is licensed under the Mozilla Public License 2.0. See the LICENSE file for details.