This is the code for the paper *In-Context Algorithm Emulation in Fixed-Weight Transformers*. Use this repo to reproduce the paper's results.
- Clone the repository:

  ```shell
  gh repo clone MAGICS-LAB/algo_emu
  cd algo_emu
  ```

- Create and activate a virtual environment:

  ```shell
  conda create -n algo_emu python=3.10
  conda activate algo_emu
  ```
- Install required packages:

  ```shell
  pip install -r requirements.txt
  ```
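After installing, a quick sanity check can confirm the environment is active (a minimal sketch; the full package list lives in `requirements.txt` and is not checked here):

```shell
# Confirm the env's interpreter is on PATH (the env above pins Python 3.10)
# and that pip is available to resolve requirements.txt.
python3 --version
python3 -m pip --version
```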
- Run `att_sim_f.py`.
- Run `attn_sim_attn.py`.
- Run `att_sim_statistical.py` on synthetic data, and `att_sim_statistical_ames_data.py` on the Ames Housing data.
- Run `att_sim_attn_frozen.py`. Example invocations:
  ```shell
  python att_sim_attn_frozen.py \
    --tune \
    --n 12 --P 66 \
    --tune_d_list 4,8,12,16 \
    --tune_beta_list 2.0,4.0,8.0,16.0 \
    --tune_samples_list 2,4,8,16 \
    --tune_out_csv tune_d_beta_samples.csv
  ```
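The tuning run above writes its results to `tune_d_beta_samples.csv`. A minimal sketch for picking the lowest-MSE setting from that file — note the column names (`d`, `beta`, `samples`, `mse`) are assumptions, so check them against the header the script actually writes:

```python
# Sketch: select the best (d, beta, samples) setting from the tuning CSV.
# Column names are hypothetical; verify against tune_d_beta_samples.csv.
import csv
import io

# Stand-in for tune_d_beta_samples.csv (made-up example contents).
sample_csv = """d,beta,samples,mse
4,2.0,2,0.031
8,4.0,4,0.012
12,8.0,8,0.019
16,16.0,16,0.025
"""

def best_config(csv_text):
    """Return the row with the smallest MSE."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return min(rows, key=lambda r: float(r["mse"]))

best = best_config(sample_csv)
print(best["d"], best["beta"], best["samples"])  # → 8 4.0 4
```

For the real file, replace `io.StringIO(csv_text)` with `open("tune_d_beta_samples.csv")`.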
  ```shell
  python att_sim_attn_frozen.py \
    --plotP \
    --n 12 --d 4 --beta 2.0 --samples 4 \
    --P_list 20,30,40,60,80 \
    --plotP_pdf mse_vs_P.pdf \
    --plotP_csv mse_vs_P.csv
  ```
  ```shell
  python att_sim_attn_frozen.py \
    --plotN \
    --P 100 --d 4 --beta 8.0 --samples 4 \
    --n_list 1,2,4,6,8,12 \
    --plotN_pdf mse_vs_n.pdf \
    --plotN_csv mse_vs_n.csv
  ```

If you find our work useful, please cite:
```bibtex
@inproceedings{
  hu2026incontext,
  title={In-Context Algorithm Emulation in Fixed-Weight Transformers},
  author={Jerry Yao-Chieh Hu and Hude Liu and Jennifer Yuntong Zhang and Han Liu},
  booktitle={The Fourteenth International Conference on Learning Representations},
  year={2026},
  url={https://openreview.net/forum?id=BC7YLA0zJ0}
}
```