
In‑Context Algorithm Emulation in Fixed‑Weight Transformers

This repository contains the code for the paper In-Context Algorithm Emulation in Fixed-Weight Transformers. You can use it to reproduce the results in the paper.

Environment Setup

  1. Clone the repository
    gh repo clone MAGICS-LAB/algo_emu
    cd algo_emu
  2. Create and activate a virtual environment
    conda create -n algo_emu python=3.10
    conda activate algo_emu
  3. Install required packages
    pip install -r requirements.txt

Usage

Attention Emulating Continuous Function (Theorem 2.1)

Run att_sim_f.py.
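As background for this experiment, a softmax attention layer with a sharp inverse temperature acts like a kernel smoother over anchor points, which is the sense in which it can approximate a continuous function. The sketch below is illustrative only (it is not the repo's implementation, and all names are hypothetical); it approximates sin on [0, 1] with a single attention call.

```python
import numpy as np

def attention_approx(x_query, x_keys, f_values, beta=500.0):
    """Approximate f(x_query) by softmax-attending over anchor points x_keys."""
    # similarity scores: negative squared distance scaled by inverse temperature
    scores = -beta * (x_query[:, None] - x_keys[None, :]) ** 2
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over keys
    return weights @ f_values                      # weighted average of f values

f = np.sin
x_keys = np.linspace(0.0, 1.0, 200)        # anchor (key) positions
x_query = np.linspace(0.1, 0.9, 10)        # query positions away from the boundary
approx = attention_approx(x_query, x_keys, f(x_keys))
err = float(np.max(np.abs(approx - f(x_query))))  # small approximation error
```

Larger `beta` sharpens the softmax and shrinks the smoothing bias, at the cost of needing denser anchor points.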

Attention Emulating Attention Head (Theorem 3.1/3.2)

Run attn_sim_attn.py.
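For reference, the "target" object in Theorems 3.1/3.2 is a standard softmax attention head. A minimal single-head version is sketched below; this is not the repo's implementation, and the shapes and names are illustrative only.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention_head(X, W_Q, W_K, W_V):
    """Single softmax attention head; X has shape (seq_len, d_model)."""
    Q, K, V = X @ W_Q, X @ W_K, X @ W_V
    scores = Q @ K.T / np.sqrt(K.shape[1])  # scaled dot-product scores
    return softmax(scores, axis=-1) @ V

rng = np.random.default_rng(0)
d_model, d_head = 8, 4
X = rng.normal(size=(5, d_model))
W_Q, W_K, W_V = (rng.normal(size=(d_model, d_head)) for _ in range(3))
out = attention_head(X, W_Q, W_K, W_V)   # shape (5, 4)
```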

Attention Emulating Statistical Models (Corollary 3.2.1)

Run att_sim_statistical.py using synthetic data, and att_sim_statistical_ames_data.py using Ames Housing Data.
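As a point of comparison, the statistical models in this experiment are fit on (synthetic or Ames Housing) regression data. The sketch below is a hypothetical least-squares baseline on synthetic data, not the repo's code; the exact model families emulated are defined in the paper.

```python
import numpy as np

# Synthetic regression data: y = X w_true + small Gaussian noise.
rng = np.random.default_rng(0)
n, d = 64, 4
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)

# Closed-form least-squares fit, the kind of statistical model being emulated.
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
recovery_err = float(np.max(np.abs(w_hat - w_true)))  # small, near the noise level
```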

Handcrafted Frozen Attention Emulating Attention Head with No Training (Theorem 4.2)

Run att_sim_attn_frozen.py.

python att_sim_attn_frozen.py \
  --tune \
  --n 12 --P 66 \
  --tune_d_list 4,8,12,16 \
  --tune_beta_list 2.0,4.0,8.0,16.0 \
  --tune_samples_list 2,4,8,16 \
  --tune_out_csv tune_d_beta_samples.csv

python att_sim_attn_frozen.py \
  --plotP \
  --n 12 --d 4 --beta 2.0 --samples 4 \
  --P_list 20,30,40,60,80 \
  --plotP_pdf mse_vs_P.pdf \
  --plotP_csv mse_vs_P.csv

python att_sim_attn_frozen.py \
  --plotN \
  --P 100 --d 4 --beta 8.0 --samples 4 \
  --n_list 1,2,4,6,8,12 \
  --plotN_pdf mse_vs_n.pdf \
  --plotN_csv mse_vs_n.csv

Citation

If you find our work useful, please cite:

@inproceedings{
  hu2026incontext,
  title={In-Context Algorithm Emulation in Fixed-Weight Transformers},
  author={Jerry Yao-Chieh Hu and Hude Liu and Jennifer Yuntong Zhang and Han Liu},
  booktitle={The Fourteenth International Conference on Learning Representations},
  year={2026},
  url={https://openreview.net/forum?id=BC7YLA0zJ0}
}
