The following model_kwargs are not used by the model: use_gpu #27

Open
erayerdin opened this issue Oct 16, 2022 · 10 comments

@erayerdin

A fresh installation shows this error:

The following `model_kwargs` are not used by the model: ['use_gpu'] (note: typos in the generate arguments will also show up in this list)
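
For context, this ValueError is raised by transformers itself: generate() validates its keyword arguments and rejects any it does not recognize, and use_gpu is not a generation argument. A minimal sketch of the failure and the usual workaround, assuming the code forwards use_gpu into generate() (the gpt2 checkpoint is a stand-in, not necessarily the model this project loads):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Stand-in checkpoint; the project may load a different model.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("def hello_world():", return_tensors="pt")

# Reproduces the reported error: `use_gpu` is not a recognized generation
# argument, so generate() raises the `model_kwargs` ValueError above.
# model.generate(**inputs, use_gpu=True)

# Device placement is handled on the model and tensors instead:
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
inputs = {k: v.to(device) for k, v in inputs.items()}
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0]))
```
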
@alexander-minchin

I'm also seeing this error :(

@dperlman-ias

Same :/

@cjz9032

cjz9032 commented Nov 3, 2022

same

@GitHdu

GitHdu commented Nov 16, 2022

same

@alexandermontillarivera

same

@ppsantiago

Same problem

@reshinthadithyan

reshinthadithyan commented Nov 19, 2022

Thanks for your patience; the team was busy with other work. I just sent a PR at #28, which should fix the issue. I'll ping here once the PR is merged.

ncoop57 pushed a commit that referenced this issue Nov 19, 2022
Fix `use_gpu` bug as reported in #27
@GitHdu

GitHdu commented Nov 22, 2022

> Thanks for your patience; the team was busy with other work. I just sent a PR at #28, which should fix the issue. I'll ping here once the PR is merged.

It leads to a new issue: #26.

@reshinthadithyan

reshinthadithyan commented Nov 22, 2022

@GitHdu, can you paste the error stack you get when encountering this issue, or help us reproduce the bug?
I tested it on a small file, so token length might not have been an issue there. Let me try it on a big file with a huge context length.
Thanks.

@GitHdu

GitHdu commented Nov 23, 2022

> @GitHdu, can you paste the error stack you get when encountering this issue, or help us reproduce the bug? I tested it on a small file, so token length might not have been an issue there. Let me try it on a big file with a huge context length. Thanks.

Sorry, I don't know how to paste the error stack; there is only an error tip when I input the code:
Error: Input is too long for this model, shorten your input or use 'parameters': {'truncation': 'only_first'} to run the model only on the first part.

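The message itself points at the workaround: truncate the input to the model's context window before generation. A minimal sketch with the transformers tokenizer, assuming the input is tokenized locally ('only_first' is the strategy the error message recommends; the gpt2 checkpoint is again a stand-in for whatever model this project loads):

```python
from transformers import AutoTokenizer

# Stand-in checkpoint; substitute the model this project actually loads.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

long_code = "def handler(event):\n    return event\n" * 500  # exceeds the context window

# Truncate to the model's maximum input length instead of failing outright.
inputs = tokenizer(
    long_code,
    truncation="only_first",            # strategy named in the error message
    max_length=tokenizer.model_max_length,
    return_tensors="pt",
)
print(inputs["input_ids"].shape)  # capped at tokenizer.model_max_length tokens
```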
