
config_class inconsistency for a custom model #30149

Closed
4 tasks
benderama3 opened this issue Apr 9, 2024 · 5 comments · Fixed by #29854
Comments

@benderama3

benderama3 commented Apr 9, 2024

System Info

Default Google colab env
transformers>=4.39.0

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

Tested on Colab.
Works with transformers==4.38.2 but fails with transformers>=4.39.0.

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, pipeline

# Loading the checkpoint with trust_remote_code=True is what triggers the error.
tokenizer = AutoTokenizer.from_pretrained("ccdv/lsg-bart-base-4096-booksum")
model = AutoModelForSeq2SeqLM.from_pretrained("ccdv/lsg-bart-base-4096-booksum", trust_remote_code=True)

text = "Replace by what you want." * 512
pipe = pipeline("text2text-generation", model=model, tokenizer=tokenizer)
generated_text = pipe(
    text,
    truncation=True,
    max_length=128,
    no_repeat_ngram_size=7,
    num_beams=2,
    early_stopping=True,
)

Error message:

ValueError                                Traceback (most recent call last)

[<ipython-input-2-af21733e3710>](https://localhost:8080/#) in <cell line: 4>()
      2 
      3 tokenizer = AutoTokenizer.from_pretrained("ccdv/lsg-bart-base-4096-booksum")
----> 4 model = AutoModelForSeq2SeqLM.from_pretrained("ccdv/lsg-bart-base-4096-booksum", trust_remote_code=True)
      5 
      6 text = "Replace by what you want."*512

1 frames

[/usr/local/lib/python3.10/dist-packages/transformers/models/auto/auto_factory.py](https://localhost:8080/#) in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    555                 model_class.register_for_auto_class(cls.__name__)
    556             else:
--> 557                 cls.register(config.__class__, model_class, exist_ok=True)
    558             return model_class.from_pretrained(
    559                 pretrained_model_name_or_path, *model_args, config=config, **hub_kwargs, **kwargs

[/usr/local/lib/python3.10/dist-packages/transformers/models/auto/auto_factory.py](https://localhost:8080/#) in register(cls, config_class, model_class, exist_ok)
    581         """
    582         if hasattr(model_class, "config_class") and model_class.config_class != config_class:
--> 583             raise ValueError(
    584                 "The model class you are passing has a `config_class` attribute that is not consistent with the "
    585                 f"config class you passed (model has {model_class.config_class} and you passed {config_class}. Fix "

ValueError: The model class you are passing has a `config_class` attribute that is not consistent with the config class you passed (model has <class 'transformers_modules.ccdv.lsg-bart-base-4096-booksum.e1c052c8cdaf3bd6fc245209e8530c392ea07510.modeling_lsg_bart.LSGBartConfig'> and you passed <class 'transformers_modules.ccdv.lsg-bart-base-4096-booksum.e1c052c8cdaf3bd6fc245209e8530c392ea07510.modeling_lsg_bart.LSGBartConfig'>. Fix one of those so they match!

Both classes are the same.
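For what it's worth, this pattern can occur in plain Python whenever the same module source is loaded twice: the two classes print with identical qualified names but are distinct objects, so a `!=` check compares identity and fails. A minimal, self-contained sketch of that behavior (the names here are illustrative, not the actual transformers internals):

```python
import types

# Load the same class definition twice into two separate module objects,
# mimicking dynamic/remote code being imported more than once.
source = "class LSGBartConfig:\n    pass\n"

def load_module(name, source):
    module = types.ModuleType(name)
    exec(source, module.__dict__)
    return module

m1 = load_module("modeling_lsg_bart", source)
m2 = load_module("modeling_lsg_bart", source)

# Same name, but two distinct class objects, so equality fails.
print(m1.LSGBartConfig.__name__ == m2.LSGBartConfig.__name__)  # True
print(m1.LSGBartConfig == m2.LSGBartConfig)                    # False
```

That would explain why the error message shows what looks like the same class on both sides of the comparison.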

Expected behavior

No error.

@amyeroberts
Collaborator

cc @Rocketknight1

@Rocketknight1
Member

Yeah, this is very likely related to the fix here: #29854

@benderama3 can you please check the PR branch there and tell me if it fixes the issue? We'll try to prioritize merging it if so - we've just been swamped with new model releases!

You can install the PR branch with pip install git+https://github.com/huggingface/transformers.git@update_config_class_check

@Rocketknight1
Member

@benderama3 this has been merged! You can now get the fix by installing from main with pip install git+https://github.com/huggingface/transformers.git. Please let me know if this resolves the issue!

@luoxuan-cs

still not fixed

@Rocketknight1
Member

@luoxuan-cs Can you give us any code to reproduce the issue you're seeing?
