
feature: HuggingFace Gated Model Support #1015

Closed
EricXQiu opened this issue Nov 19, 2024 · 3 comments
Labels
architecture Architectural upgrades

@EricXQiu

Summary

Support for HuggingFace gated models is needed. A gated model is one that requires the user to accept a license before access is granted; an example is mistralai/Mistral-7B-Instruct-v0.2. When downloading such a model, the user needs to provide an HF token.

Basic example

Take mistralai/Mistral-7B-Instruct-v0.2 as an example. Once the user accepts the license, they can download the model through huggingface_hub or git with a HuggingFace access token.

The model URL is https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2
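
For instance, a rough download sketch using huggingface_hub might look like this (the token value is a placeholder, and an account that has already accepted the license is assumed):

from huggingface_hub import snapshot_download

# Download the gated model; requires a token from an account that has
# accepted the model's license. "hf_..." is a placeholder value.
snapshot_download(
    repo_id="mistralai/Mistral-7B-Instruct-v0.2",
    token="hf_...",
)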

Motivation

There are more gated models now than before; supporting them will make garak useful against a wider range of models.

@EricXQiu EricXQiu added the architecture Architectural upgrades label Nov 19, 2024
@EricXQiu EricXQiu changed the title HuggingFace Gated Model Support feature: HuggingFace Gated Model Support Nov 19, 2024
@jmartin-tech
Collaborator

Gated HuggingFace models can be downloaded if the system has a cached token in place. See huggingface-cli login for details; this caches the token in the user's HuggingFace XDG cache, which the transformers library relies on.
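
As a sketch, the same cached-token flow can also be set up programmatically through huggingface_hub (the token string below is a placeholder):

from huggingface_hub import login

# Caches the token (typically under ~/.cache/huggingface/token) so that
# huggingface_hub and transformers pick it up automatically.
login(token="hf_...")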

There are other supported methods to provide a token via configuration, though I suspect they are less user friendly. One such method is adding a token value via entries in hf_args for the selected generator.

To pass via --config as yaml:

plugins:
  generators:
    huggingface:
      Pipeline:
        hf_args:
          token: <hf_token>

To pass via --generator_option_file as json:

{
  "huggingface": {
    "Pipeline": {
      "hf_args": {
        "token": "<hf_token>"
      }
    }
  }
}
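
With either config in place, a run against the gated model would then look something like the following (the config file name is illustrative):

garak --model_type huggingface --model_name mistralai/Mistral-7B-Instruct-v0.2 --config gated.yaml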

@jmartin-tech
Collaborator

This is also covered in the FAQ.

@EricXQiu
Author

Thanks. I think this issue can be closed.
