How to use zai-org/WebGLM with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("zai-org/WebGLM", trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained("zai-org/WebGLM", trust_remote_code=True)
```
```json
{
  "<|startofpiece|>": 50257,
  "<|endofpiece|>": 50258,
  "[CLS]": 50259,
  "[MASK]": 50260,
  "[SEP]": 50261,
  "[UNUSED]": 50262,
  "[gMASK]": 50263,
  "[sMASK]": 50264
}
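The added special tokens are assigned IDs starting at 50257, i.e. immediately after a GPT-2-sized base vocabulary (IDs 0–50256). A minimal sketch of how such a mapping behaves, using the dict literal copied from the mapping above (the reverse-lookup helper is illustrative, not part of the model's API):

```python
# Added special tokens, copied from the mapping above.
added_tokens = {
    "<|startofpiece|>": 50257,
    "<|endofpiece|>": 50258,
    "[CLS]": 50259,
    "[MASK]": 50260,
    "[SEP]": 50261,
    "[UNUSED]": 50262,
    "[gMASK]": 50263,
    "[sMASK]": 50264,
}

BASE_VOCAB_SIZE = 50257  # GPT-2 BPE vocabulary size; added IDs start right after it

# The added IDs are contiguous, starting at the end of the base vocabulary.
ids = sorted(added_tokens.values())
assert ids == list(range(BASE_VOCAB_SIZE, BASE_VOCAB_SIZE + len(added_tokens)))

# Reverse lookup (ID -> token string), as a tokenizer does when decoding.
id_to_token = {i: t for t, i in added_tokens.items()}
print(id_to_token[50263])  # -> [gMASK]
```

Because these entries extend the vocabulary, `len(tokenizer)` after loading will exceed the base vocabulary size by the number of added tokens.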