PEFT-Factory is a fork of LLaMA-Factory ❤️, enhanced with an easy-to-use PEFT interface, support for HuggingFace PEFT methods, and curated datasets for benchmarking PEFT approaches.
📄 System Demonstration Paper | 🎥 Demo Video | 🏛️ EACL 2026
| PEFT Method | Supported | Backend |
|---|---|---|
| LoRA (including variants) | ✅ | 🦙 LLaMA-Factory |
| OFT | ✅ | 🦙 LLaMA-Factory |
| Prefix Tuning | ✅ | 🤗 HuggingFace PEFT |
| Prompt Tuning | ✅ | 🤗 HuggingFace PEFT |
| P-Tuning | ✅ | 🤗 HuggingFace PEFT |
| P-Tuning v2 | ✅ | 🤗 HuggingFace PEFT |
| MPT | ✅ | 🤗 HuggingFace PEFT |
| IA³ | ✅ | 🤗 HuggingFace PEFT |
| LNTuning | ✅ | 🤗 HuggingFace PEFT |
| Bottleneck Adapter | ✅ | 🤗 AdapterHub |
| Parallel Adapter | ✅ | 🤗 AdapterHub |
| SeqBottleneck Adapter | ✅ | 🤗 AdapterHub |
| SVFT | ✅ | ⚙️ Custom |
| BitFit | ✅ | ⚙️ Custom |
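To give a feel for how one of these methods might be selected at training time, here is a minimal sketch of a training config. The key names follow LLaMA-Factory conventions (`finetuning_type`, `model_name_or_path`, etc.) and are assumptions for illustration only; consult the example configs shipped in this repository for the actual schema.

```shell
# Hypothetical training config selecting a PEFT method.
# Key names follow LLaMA-Factory conventions and may differ in
# PEFT-Factory's actual schema -- illustrative only, not verified.
cat > /tmp/bitfit_example.yaml <<'EOF'
model_name_or_path: meta-llama/Llama-3.2-1B-Instruct
stage: sft
do_train: true
finetuning_type: bitfit   # one of the PEFT methods listed above
dataset: wsc
output_dir: saves/bitfit/llama-3.2-1b-instruct
EOF
cat /tmp/bitfit_example.yaml
```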
This section provides instructions on how to install PEFT-Factory, download the necessary data and methods, and run training using either the command line or the web UI.
For a video walkthrough, visit the PEFT-Factory Demonstration Video.
```shell
# Install the package
pip install peftfactory

# Download the repository, which contains data, PEFT methods, and examples
git clone https://github.com/kinit-sk/PEFT-Factory.git && cd PEFT-Factory

# Start the web UI
pf webui
```

Alternatively, you can run training directly from the command line:
```shell
# Install the package
pip install peftfactory

# Download the repository, which contains data, PEFT methods, and examples
git clone https://github.com/kinit-sk/PEFT-Factory.git && cd PEFT-Factory
```

Define the variables that will be substituted into the training config template:
```shell
TIMESTAMP=$(date +%s)
OUTPUT_DIR="saves/bitfit/llama-3.2-1b-instruct/train_wsc_${TIMESTAMP}"
DATASET="wsc"
SEED=123
WANDB_PROJECT="peft-factory-train-bitfit"
WANDB_NAME="bitfit_llama-3.2-1b-instruct_train_wsc"
mkdir -p "${OUTPUT_DIR}"
export OUTPUT_DIR DATASET SEED WANDB_PROJECT WANDB_NAME
```

The `envsubst` utility replaces occurrences of environment variables in the template file with their current values:
```shell
envsubst < examples/peft/bitfit/llama-3.2-1b-instruct/train.yaml > ${OUTPUT_DIR}/train.yaml
```

Then launch training with the generated config:

```shell
peftfactory-cli train ${OUTPUT_DIR}/train.yaml
```

PEFT-Factory can be installed in several ways: directly from PyPI for the latest release, or built from source for the development version.
```shell
pip install peftfactory
```

1. Clone the repository:
```shell
git clone git@github.com:kinit-sk/PEFT-Factory.git
```

2. Build the wheel package:
```shell
make build
```

3. Install with pip:
```shell
pip install dist/[name of the built package].whl
```

DeepSpeed is required for evaluation and computation of the PSCP metric.
```shell
pip install deepspeed
```

Note: You may encounter an error about the `CUDA_HOME` environment variable not being set. The fix depends on your environment:
If you are using conda, install the CUDA compiler into your environment:

```shell
conda install -c nvidia cuda-compiler
```

Otherwise, you will need to install CUDA with the nvcc compiler at the OS level. Instructions vary by operating system; consult your distribution's documentation. For example, on Arch Linux:
```shell
# Arch Linux example; the exact command differs per OS
sudo pacman -S cuda
```

To download the datasets, PEFT method implementations, and example configs for training, clone the repository from GitHub:
```shell
git clone https://github.com/kinit-sk/PEFT-Factory.git && cd PEFT-Factory
```

To start training from a config file:

```shell
pf train [path to config file].yaml
```

To launch the web UI:

```shell
pf webui
```

If you use PEFT-Factory in your research, please cite our EACL 2026 System Demonstration paper:
```bibtex
@inproceedings{belanec-etal-2026-peft-factory,
    title = "{PEFT}-Factory: Unified Parameter-Efficient Fine-Tuning of Autoregressive Large Language Models",
    author = "Belanec, Robert and
      Srba, Ivan and
      Bielikova, Maria",
    editor = "Croce, Danilo and
      Leidner, Jochen and
      Moosavi, Nafise Sadat",
    booktitle = "Proceedings of the 19th Conference of the {E}uropean Chapter of the {A}ssociation for {C}omputational {L}inguistics (Volume 3: System Demonstrations)",
    month = mar,
    year = "2026",
    address = "Rabat, Morocco",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2026.eacl-demo.15/",
    doi = "10.18653/v1/2026.eacl-demo.15",
    pages = "188--202",
    ISBN = "979-8-89176-382-1",
    abstract = "Parameter-Efficient Fine-Tuning (PEFT) methods address the increasing size of Large Language Models (LLMs). Currently, many newly introduced PEFT methods are challenging to replicate, deploy, or compare with one another. To address this, we introduce PEFT-Factory, a unified framework for efficient fine-tuning LLMs using both off-the-shelf and custom PEFT methods. While its modular design supports extensibility, it natively provides a representative set of 19 PEFT methods, 27 classification and text generation datasets addressing 12 tasks, and both standard and PEFT-specific evaluation metrics. As a result, PEFT-Factory provides a ready-to-use, controlled, and stable environment, improving replicability and benchmarking of PEFT methods. PEFT-Factory is a downstream framework that originates from the popular LLaMA-Factory, and is publicly available at https://github.com/kinit-sk/PEFT-Factory."
}
```