- Python 3.11
- Conda (recommended)
- Clone the repository:

```bash
git clone https://github.com/RAKG/RAKG.git
cd RAKG
```
- Create and activate a conda environment:

```bash
conda create -n RAKG python=3.11
conda activate RAKG
```
- Install dependencies:

```bash
pip install -r requirements.txt
```
Edit `src/config.py` to configure your model provider settings:

- For local Ollama: set `base_url` to `http://localhost:11434/v1/`
- For server-based Ollama: set `base_url` to `http://your_server_ip`
Default Ollama model configuration:

- Main model: Qwen2.5-72B (requires strong instruction-following ability)
- Similarity check model: Qwen2-7B (a smaller model, for faster processing)
- Embedding model: BGE-M3
For OpenAI-compatible providers:

- Set your OpenAI API key in `OPENAI_API_KEY`
- Configure model selection:
  - Main model: Qwen2.5-72B-Instruct
  - Similarity check model: Qwen2.5-14B-Instruct
  - Embedding model: BGE-M3
To switch between providers, set `USE_OPENAI = True` for OpenAI or `USE_OPENAI = False` for Ollama.
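The settings above might look roughly like the following in `src/config.py`. This is a hedged sketch: only `USE_OPENAI`, `OPENAI_API_KEY`, and `base_url` are named in this README; the other variable names are illustrative assumptions, not the repository's actual code.

```python
# Illustrative sketch of src/config.py -- only USE_OPENAI, OPENAI_API_KEY,
# and base_url are documented names; the rest are assumed for illustration.
USE_OPENAI = False  # True -> OpenAI provider, False -> Ollama

# Ollama's OpenAI-compatible endpoint (use "http://your_server_ip" for a server)
base_url = "http://localhost:11434/v1/"

# Required only when USE_OPENAI is True
OPENAI_API_KEY = "sk-..."  # placeholder; supply your real key

# Model roles, using the defaults listed above
if USE_OPENAI:
    MAIN_MODEL = "Qwen2.5-72B-Instruct"
    SIMILARITY_MODEL = "Qwen2.5-14B-Instruct"
else:
    MAIN_MODEL = "Qwen2.5-72B"
    SIMILARITY_MODEL = "Qwen2-7B"
EMBEDDING_MODEL = "BGE-M3"
```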
To process text input:

```bash
cd examples
python RAKG_example.py --input "your input text" --output result/kg.json --topic "your_topic" --is-text
```
To process document input:

```bash
python RAKG_example.py --input data/MINE.json --output result/kg.json
```
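The two invocations above suggest a command-line interface along these lines. This is only a sketch of how the flags could be parsed; the actual argument handling in `RAKG_example.py` may differ.

```python
import argparse

def build_parser():
    # Flags mirror the README invocations; help text and defaults are assumptions.
    p = argparse.ArgumentParser(description="RAKG example runner (sketch)")
    p.add_argument("--input", required=True,
                   help="raw text (with --is-text) or a JSON document path")
    p.add_argument("--output", default="result/kg.json",
                   help="where to write the constructed knowledge graph")
    p.add_argument("--topic", default=None,
                   help="topic label used when processing raw text")
    p.add_argument("--is-text", action="store_true",
                   help="treat --input as literal text instead of a file path")
    return p

# Example: parse the text-input invocation from the README
args = build_parser().parse_args(
    ["--input", "your input text", "--topic", "your_topic", "--is-text"]
)
```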
To reproduce the results from the paper:

```bash
cd src/construct
python RAKG.py
```
```bash
cd src/eval/llm_eval
```

For LLM-based evaluation, we recommend using the DeepEval platform. Please refer to the DeepEval documentation for setup and usage instructions.
```bash
cd src/eval/MINE_eval
python evaluate_MINE_RAKG.py
```
```bash
cd src/eval/ideal_kg_eval
python kg_eval.py
```
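As a rough illustration of what triple-level KG evaluation can look like (not the repository's actual metric, which may use softer matching), precision, recall, and F1 over exact-match triples might be computed as:

```python
def triple_prf(predicted, gold):
    """Exact-match precision/recall/F1 over (head, relation, tail) triples.

    Illustrative only -- kg_eval.py's real scoring may differ.
    """
    pred, ref = set(predicted), set(gold)
    tp = len(pred & ref)  # triples present in both graphs
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(ref) if ref else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Toy example with hypothetical triples
gold = [("RAKG", "uses", "Qwen2.5-72B"), ("BGE-M3", "is_a", "embedding model")]
pred = [("RAKG", "uses", "Qwen2.5-72B"), ("RAKG", "is_a", "framework")]
p, r, f = triple_prf(pred, gold)  # p = 0.5, r = 0.5, f = 0.5
```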
We welcome contributions! Please read our contributing guidelines before submitting pull requests.
This repo benefits from several open-source projects. Thanks for these wonderful works.
For any questions or feedback, please:
- Open an issue in the GitHub repository
- Reach out to us at 2212855@mail.nankai.edu.cn