Plugin for LLM providing access to Grok models using the xAI API
Install this plugin in the same environment as LLM:
llm install llm-grok
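To confirm the plugin is installed, you can list LLM's installed plugins; llm-grok should appear in the output:
llm plugins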
First, obtain an API key from xAI.
Configure the key using the llm keys set grok command:
llm keys set grok
# Paste your xAI API key here
You can also set it via environment variable:
export XAI_API_KEY="your-api-key-here"
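To verify that the key was saved, you can list the names of your stored keys:
llm keys list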
You can now access the Grok models. Run llm models to see them in the list.
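For example, on a Unix-like shell you can filter the model list down to just the Grok entries:
llm models | grep -i grok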
To run a prompt through grok-3-latest (the default model):
llm -m grok-3-latest 'What is the meaning of life, the universe, and everything?'
To start an interactive chat session:
llm chat -m grok-3-latest
Example chat session:
Chatting with grok-3-latest
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> Tell me a joke about programming
To use a system prompt to give Grok specific instructions:
cat example.py | llm -m grok-3-latest -s 'explain this code in a humorous way'
To set your default model:
llm models default grok-3-mini-latest
# Now running `llm ...` will use `grok-3-mini-latest` by default
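Running the same command with no argument prints the current default, which is a quick way to confirm the change:
llm models default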
The following Grok models are available:
grok-3-latest (default)
grok-3-mini-fast-latest
grok-3-mini-latest
grok-3-fast-latest
grok-2-latest
grok-2-vision-latest
You can check the available models using:
llm grok models
The grok-3-latest model accepts the following options, using -o name value syntax:
-o temperature 0.7
: The sampling temperature, between 0 and 1. Higher values like 0.8 increase randomness, while lower values like 0.2 make the output more focused and deterministic.
-o max_completion_tokens 100
: Maximum number of tokens to generate in the completion (includes both visible tokens and reasoning tokens).
Example with options:
llm -m grok-3-latest -o temperature 0.2 -o max_completion_tokens 50 'Write a haiku about AI'
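If LLM's prompt logging is enabled (it is by default in recent LLM versions), you can inspect the most recent logged prompt and response, including the options that were sent:
llm logs -n 1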
To set up this plugin locally, first check out the code. Then create a new virtual environment:
git clone https://github.com/hiepler/llm-grok.git
cd llm-grok
python3 -m venv venv
source venv/bin/activate
Now install the plugin and its test dependencies:
pip install -e '.[test]'
To run the tests:
pytest
This plugin uses the xAI API. For more information, see the xAI API documentation.
Contributions are welcome! Please feel free to submit a Pull Request.
Apache License 2.0