
What is atten_weights_ph? #3

Open
ngocanh2162 opened this issue Apr 22, 2022 · 1 comment

Comments

ngocanh2162 commented Apr 22, 2022

In file modules/attention.py, lines 434-435:

        if atten_weights_ph is not None:    # used for emotional gst tts inference
            atten_weights = atten_weights_ph

When I run inference, it gets stuck at this tensor. I cannot find any reference to it.

caixxiong (Collaborator) commented Mar 17, 2023

Thank you for your question.

During training, the attention weights over the GST tokens are computed from the prosody embedding of the reference utterance (i.e. the input utterance).

During synthesis, the attention weights are supplied through this argument: atten_weights_ph (meaning "attention weights placeholder"). These are computed offline by averaging the attention weights of the top-K utterances of each emotion.
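The offline averaging step described above could be sketched roughly as follows. This is only an illustration of the idea, not code from the repository; the function name `build_atten_weights_ph` and the input format (a dict mapping each emotion to scored per-utterance GST attention weights) are assumptions.

```python
import numpy as np

def build_atten_weights_ph(per_utterance_weights, top_k=5):
    """Hypothetical sketch: average GST-token attention weights over the
    top-K scored utterances of each emotion, producing one placeholder
    vector per emotion to pass as atten_weights_ph at inference time.

    per_utterance_weights: dict mapping emotion -> list of
        (score, weights) pairs, where weights is a 1-D array over GST tokens.
    """
    placeholders = {}
    for emotion, scored in per_utterance_weights.items():
        # keep the K highest-scoring utterances, then average their weights
        top = sorted(scored, key=lambda pair: pair[0], reverse=True)[:top_k]
        placeholders[emotion] = np.mean([w for _, w in top], axis=0)
    return placeholders
```

At inference, one would then look up the placeholder for the desired emotion and feed it in place of the reference-derived attention weights.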
