
Explanation for Soft Embedding parameter "n_prompts" #2

mallorbc opened this issue Feb 22, 2023 · 2 comments

mallorbc commented Feb 22, 2023

As the README states, the soft embedding code for prompt tuning comes from the repo here

However, there are a few key changes, most notably the new parameter "n_prompts". Could you please explain what this is and how it is used?

I had a few guesses. Is it there to allow batching? And if so, must we always use the same batch size after we train it?

mallorbc changed the title from “Explanation for Soft Embedding parameter” to “Explanation for Soft Embedding parameter "n_prompts"” Feb 22, 2023
exelents (Owner) commented

In this model you can have several prompts to perform different tasks.
When you train or generate text with a prompt, you pass the model a special token sequence that consists of two parts. The important one is the first part: dummy tokens standing in for the prompt, with length equal to the prompt size. These dummy tokens are not passed through the embedding matrix; instead, they are replaced by the embeddings of the chosen prompt.

But as you have seen, in this model you can have more than one prompt (or prefix, which is the more correct term):
https://github.com/exelents/soft-prompt-tuning/blob/main/soft_embedding.py#L49
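
For orientation, here is a minimal sketch of how the n_prompts dimension might enter the module. The class name, defaults, and initialization below are illustrative assumptions, not the repo's exact code:

```python
import torch
import torch.nn as nn

class SoftEmbedding(nn.Module):
    """Sketch: n_prompts independent trainable prefixes, each n_tokens long."""

    def __init__(self, wte: nn.Embedding, n_prompts: int = 4, n_tokens: int = 20):
        super().__init__()
        self.wte = wte            # the model's (frozen) input embedding matrix
        self.n_tokens = n_tokens  # length of each soft prefix
        # One (n_tokens, embedding_dim) trainable block per prompt/prefix;
        # n_prompts adds a leading dimension to the single learned prefix
        # of the original soft-prompt-tuning code.
        self.learned_embedding = nn.Parameter(
            torch.randn(n_prompts, n_tokens, wte.embedding_dim)
        )
```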

So, to train with or use a particular prefix, you pass its index as the first token of the sequence I described.
For N prefixes (prompts), the index lies in [0, N-1].
Here, the prefix index for each item in the batch is mapped to its specific trainable parameters:
https://github.com/exelents/soft-prompt-tuning/blob/main/soft_embedding.py#L65
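
A tiny runnable sketch of that indexing step (the shapes and vocabulary size are assumptions for illustration):

```python
import torch

# Hypothetical shapes: 4 prefixes, each 20 tokens long, embedding dim 768
learned_embedding = torch.randn(4, 20, 768)

# Batch of two sequences: position 0 carries the prefix index (here 0 and 2),
# the next positions are the dummy prompt placeholders, then the real token ids.
tokens = torch.randint(0, 50257, (2, 50))
tokens[0, 0], tokens[1, 0] = 0, 2

prefix_ids = tokens[:, 0]                # shape (batch,), values in [0, N-1]
prompts = learned_embedding[prefix_ids]  # shape (batch, n_tokens, embedding_dim)
print(prompts.shape)                     # torch.Size([2, 20, 768])
```

If the selection works per batch item like this, different items in the same batch can use different prefixes, and the batch size at inference would not be fixed by training.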

exelents (Owner) commented

@mallorbc
