Deleting quantization_config broken #35223
Comments
I think this might be a question for the forums/Discord, but pinging @SunMarc @MekkCyber just in case.

Hey @psinger, what's the reason you want to delete the quantization_config after the model is loaded?

Because I am manually dequantizing and am not relying on the HF functionality for it. And then before pushing to hub, I want to remove the quantization_config from the config.

Thanks for the clarification @psinger, I think you can make it work as follows:
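(The suggested snippet was not preserved in this export. As a rough sketch of the general approach being discussed, one can strip the quantization metadata from the serialized config dict before saving; the `strip_quantization` helper and the sample dict below are illustrative, not the original suggestion. The `_pre_quantization_dtype` key is the attribute transformers sets on quantized configs, and it holds a `torch.dtype`, which is what breaks JSON serialization when only `quantization_config` is removed.)

```python
import json

# Illustrative stand-in for model.config.to_dict() on a quantized model;
# in a real config, `_pre_quantization_dtype` is a torch.dtype object,
# which json.dumps cannot handle.
config = {
    "model_type": "llama",
    "torch_dtype": "bfloat16",
    "quantization_config": {"load_in_4bit": True},
    "_pre_quantization_dtype": object(),  # placeholder for e.g. torch.float16
}

def strip_quantization(cfg):
    """Return a copy with quantization metadata removed (hypothetical helper)."""
    cfg = dict(cfg)
    cfg.pop("quantization_config", None)      # drop the quantization block
    cfg.pop("_pre_quantization_dtype", None)  # drop the non-serializable dtype
    return cfg

clean = strip_quantization(config)
print(json.dumps(clean, sort_keys=True))
```

With both keys removed, the remaining dict is plain JSON-serializable data, so saving or pushing the config no longer raises.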
Does it fix the issue @psinger?

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
System Info
transformers version: 4.47.0
Reproduction
```
TypeError: Object of type dtype is not JSON serializable
```
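(The original reproduction snippet was not captured in this export. A minimal, self-contained sketch of how this class of error arises, using a plain class as a stand-in for `torch.dtype`, which `json.dumps` cannot serialize:)

```python
import json

class FakeDtype:
    """Stand-in for torch.dtype, which json cannot serialize."""

# Config dict with a leftover non-serializable value, mimicking a quantized
# model config after quantization_config has been deleted by hand.
config = {"model_type": "llama", "_pre_quantization_dtype": FakeDtype()}

try:
    json.dumps(config)
    raised = False
except TypeError as err:
    raised = True
    print(err)  # "Object of type FakeDtype is not JSON serializable"
```

The real traceback names `dtype` instead of `FakeDtype` because the offending value is a `torch.dtype` instance.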
Expected behavior
I am unable to delete the `quantization_config` from an existing model. Whenever I do it, it completely breaks the whole config. I also tried setting `is_quantized=False`, but it does not change anything. Is there another way of achieving this?

I am aware that there is a `.dequantize` function, but in this case I'm changing dtypes on my own and want to exclude that `quantization_config` specifically when saving the model.