File "/qiuwkai27/cx/baby-llama2-chinese/sft.py", line 274, in <module>
    tokenizer=ChatGLMTokenizer(vocab_file='./chatglm_tokenizer/tokenizer.model')
File "/qiuwkai27/cx/baby-llama2-chinese/chatglm_tokenizer/tokenization_chatglm.py", line 68, in __init__
    super().__init__(padding_side=padding_side, clean_up_tokenization_spaces=clean_up_tokenization_spaces, **kwargs)
File "/root/miniconda3/envs/cxx/lib/python3.10/site-packages/transformers/tokenization_utils.py", line 436, in __init__
    self._add_tokens(
File "/root/miniconda3/envs/cxx/lib/python3.10/site-packages/transformers/tokenization_utils.py", line 544, in _add_tokens
    current_vocab = self.get_vocab().copy()
File "/qiuwkai27/cx/baby-llama2-chinese/chatglm_tokenizer/tokenization_chatglm.py", line 110, in get_vocab
    vocab = {self._convert_id_to_token(i): i for i in range(self.vocab_size)}
File "/qiuwkai27/cx/baby-llama2-chinese/chatglm_tokenizer/tokenization_chatglm.py", line 106, in vocab_size
    return self.tokenizer.n_words
AttributeError: 'ChatGLMTokenizer' object has no attribute 'tokenizer'. Did you mean: 'tokenize'?
The file does define this attribute as far as I can see, so why is this error still raised?
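Judging from the traceback alone, the likely cause is an initialization-order problem rather than a missing definition: newer `transformers` releases call `self._add_tokens(...)` (and hence `get_vocab()` and `vocab_size`) inside the base `__init__`, but `ChatGLMTokenizer` appears to assign `self.tokenizer` only after calling `super().__init__()`, so the attribute does not exist yet at the moment `vocab_size` is read. The sketch below reproduces the pattern with illustrative stand-in classes (`Base`, `Broken`, `Fixed`, and the `n_words=100` value are all assumptions, not the real code):

```python
from types import SimpleNamespace

class Base:
    """Stand-in for transformers' PreTrainedTokenizer: newer versions of
    the base __init__ read vocab_size/get_vocab() before returning."""
    def __init__(self):
        _ = self.vocab_size  # touches a property the subclass must back

class Broken(Base):
    """Mirrors the failing pattern: self.tokenizer assigned AFTER super()."""
    def __init__(self):
        super().__init__()                             # AttributeError raised here
        self.tokenizer = SimpleNamespace(n_words=100)  # this line runs too late

    @property
    def vocab_size(self):
        return self.tokenizer.n_words

class Fixed(Base):
    """Assigning self.tokenizer BEFORE super().__init__() avoids the error."""
    def __init__(self):
        self.tokenizer = SimpleNamespace(n_words=100)
        super().__init__()

    @property
    def vocab_size(self):
        return self.tokenizer.n_words

try:
    Broken()
except AttributeError as exc:
    print(exc)  # 'Broken' object has no attribute 'tokenizer'

print(Fixed().vocab_size)  # 100
```

If this is indeed what is happening, commonly reported workarounds are either moving the `self.tokenizer = ...` assignment in `tokenization_chatglm.py` above the `super().__init__()` call, or pinning an older `transformers` version whose base `__init__` does not call `get_vocab()`.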