Adapting DeBERTaV3 with DCFT (Coling 2025)

This folder contains the implementation of DCFT for DeBERTaV3, built on an updated version of the loralib package that includes the DCFT modules. Our code is based on AdaLoRA: Adaptive Budget Allocation for Parameter-Efficient Fine-Tuning (ICLR 2023).
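As a rough sketch of how adapters from the updated loralib are wired into a model (a minimal example, assuming the DCFT layer keeps the same interface as the standard lora.Linear; the actual DCFT class name is defined in the loralib folder):

# Minimal sketch of the loralib adapter workflow. Assumption: the DCFT
# layer in the updated loralib follows the standard loralib interface,
# so lora.Linear stands in for it here.
import torch
import loralib as lora

# Swap a dense layer for an adapter-augmented one (rank r controls the budget).
model = torch.nn.Sequential(lora.Linear(768, 768, r=8))

# Freeze everything except the adapter parameters before fine-tuning.
lora.mark_only_lora_as_trainable(model)

# After training, save only the small adapter weights.
torch.save(lora.lora_state_dict(model), "dcft_adapter.pt")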

Setup Environment

Create and activate the conda environment:

conda create -n NLU python=3.7
conda activate NLU 

Install PyTorch:

pip install torch==1.9.1+cu111 torchvision==0.10.1+cu111 torchaudio==0.9.1 -f https://download.pytorch.org/whl/torch_stable.html
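A quick sanity check (not part of the original instructions) that the CUDA build is active:

# Verify the CUDA-enabled PyTorch build installed correctly.
import torch
print(torch.__version__)          # expect 1.9.1+cu111
print(torch.cuda.is_available())  # expect True on a CUDA-capable machine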

Install the prerequisites

Install dependencies:

pip install -r requirements.txt

Install transformers (we fork the NLU examples from microsoft/LoRA and build our examples on their transformers version, v4.4.2):

pip install -e . 
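To confirm the editable install picked up the forked version (a sanity check; the fork is expected to report v4.4.2):

# Confirm the editable transformers install is the forked v4.4.2.
import transformers
print(transformers.__version__)  # expect 4.4.2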

Install the updated loralib:

pip install -e ../loralib/
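To check that Python resolves loralib to the local editable copy rather than a PyPI release (a quick check; the expected path is an assumption based on the install location above):

# Verify loralib resolves to the local editable install containing DCFT.
import loralib
print(loralib.__file__)  # should point into ../loralib/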

Citation

@inproceedings{zhang-etal-2025-parameter,
    title = "Parameter-Efficient Fine-Tuning of Large Language Models via Deconvolution in Subspace",
    author = "Zhang, Jia-Chen  and
      Xiong, Yu-Jie  and
      Xia, Chun-Ming  and
      Zhu, Dong-Hai  and
      Qiu, Xi-He",
    booktitle = "Proceedings of the 31st International Conference on Computational Linguistics",
    year = "2025",
    address = "Abu Dhabi, UAE",
    url = "https://aclanthology.org/2025.coling-main.265/",
    pages = "3924--3935",
}
