Long story short: I want to use the BitNet model with quant type tl2. The default GGUF model on HF is i2_s, and converting the model to tl2 myself throws the following error (this is after I changed BitnetForCausalLM from a lowercase n to a capital N, of course):
python3 ./utils/generate-dummy-bitnet-model.py --outtype tl2 --outfile ./ggml-model-tl2.gguf path/to/model/bitnet-b1.58-2B-4T
Traceback (most recent call last):
  File "/home/shubh/BitNet/./utils/generate-dummy-bitnet-model.py", line 1048, in <module>
    main()
  File "/home/shubh/BitNet/./utils/generate-dummy-bitnet-model.py", line 979, in main
    model_instance.set_vocab()
  File "/home/shubh/BitNet/./utils/generate-dummy-bitnet-model.py", line 794, in set_vocab
    self._set_vocab_sentencepiece()
  File "/home/shubh/BitNet/./utils/generate-dummy-bitnet-model.py", line 441, in _set_vocab_sentencepiece
    raise FileNotFoundError(f"File not found: {tokenizer_path}")
FileNotFoundError: File not found: /home/shubh/BitNet/HFmodel/bitnet-b1.58-2B-4T/tokenizer.model
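The checkpoint seems to ship only a BPE tokenizer.json, with no SentencePiece tokenizer.model, which is consistent with the error. A quick way to check which tokenizer files are actually present (the path is a placeholder for your local copy):

```python
from pathlib import Path

# Placeholder path: point this at your local copy of the checkpoint.
model_dir = Path('path/to/model/bitnet-b1.58-2B-4T')

for name in ('tokenizer.model', 'tokenizer.json', 'tokenizer_config.json'):
    status = 'present' if (model_dir / name).is_file() else 'missing'
    print(f'{name}: {status}')
```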
Looking at the code, the SentencePiece vocab loader requires tokenizer.model, not tokenizer.json:
def _set_vocab_sentencepiece(self):
    from sentencepiece import SentencePieceProcessor
    tokenizer_path = self.dir_model / 'tokenizer.model'
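One workaround I am considering, sketched below: make set_vocab (the one at line 794 in the traceback) fall back to a BPE vocab when tokenizer.model is missing. This assumes the script, like the upstream convert-hf-to-gguf.py it appears to be derived from, also defines a _set_vocab_gpt2 helper that loads the vocab from tokenizer.json; if it doesn't, that name is hypothetical:

```python
# Hypothetical patch to the model class's set_vocab in
# utils/generate-dummy-bitnet-model.py. Assumes _set_vocab_gpt2 (the
# BPE/tokenizer.json loader from upstream convert-hf-to-gguf.py) is
# available on the class; otherwise that helper is a placeholder.
def set_vocab(self):
    if (self.dir_model / 'tokenizer.model').is_file():
        # SentencePiece checkpoint: keep the original behaviour.
        self._set_vocab_sentencepiece()
    else:
        # BPE checkpoint (tokenizer.json only): use the GPT-2 style loader.
        self._set_vocab_gpt2()
```

I have not verified that the rest of the tl2 conversion succeeds once the vocab loads, so this would only address the FileNotFoundError.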
I want to know how I can make this work. Is the tl2 model directly available somewhere, or is there a better way to generate tl2?