XMUDeepLIT/TTS_COMT

Folders and files

NameName
Last commit message
Last commit date

Latest commit

 

History

1 Commit
 
 
 
 
 
 
 
 
 
 
 
 
 
 

Repository files navigation

README

Code for "Investigating Inference-time Scaling for Chain of Multi-modal Thought: A Preliminary Study" (ACL 2025 Findings)

Quick Start

1. Install the environment:

   ```shell
   conda env create -f environment.yml
   ```

2. Run the example:

   ```shell
   export API_KEY="YOUR_API_KEY"
   export BASE_URL="YOUR_BASE_URL"

   pipeline=cot
   model_type=openai
   model=gpt-4o-mini
   task=graph_maxflow

   python main.py \
       --pipeline $pipeline \
       --model_type $model_type \
       --model $model \
       --task $task
   ```

Additional arguments are documented in `main.py`, `mmo/pipeline/__init__.py`, and `mmo/task/__init__.py`.
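As a rough illustration of the four flags used in the example above, the sketch below shows how such a command-line interface could be parsed with `argparse`. This is a hypothetical reconstruction, not the actual code in `main.py`; the default values simply mirror the example invocation.

```python
# Hypothetical sketch of the CLI shown in the Quick Start example.
# The real argument handling lives in main.py and the mmo/ package.
import argparse


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        description="Run an inference-time scaling pipeline (illustrative sketch)"
    )
    parser.add_argument("--pipeline", default="cot",
                        help="reasoning pipeline, e.g. cot")
    parser.add_argument("--model_type", default="openai",
                        help="model backend type, e.g. openai")
    parser.add_argument("--model", default="gpt-4o-mini",
                        help="model name passed to the backend")
    parser.add_argument("--task", default="graph_maxflow",
                        help="evaluation task, e.g. graph_maxflow")
    return parser


if __name__ == "__main__":
    args = build_parser().parse_args()
    print(args.pipeline, args.model_type, args.model, args.task)
```

Any further flags mentioned in `main.py` would extend the same parser.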

Acknowledgement

Thanks to the following repositories for their great work:

Citation

```bibtex
@article{lin2025investigating,
  title={Investigating inference-time scaling for chain of multi-modal thought: A preliminary study},
  author={Lin, Yujie and Wang, Ante and Chen, Moye and Liu, Jingyao and Liu, Hao and Su, Jinsong and Xiao, Xinyan},
  journal={arXiv preprint arXiv:2502.11514},
  year={2025}
}
```
