about Embedding #15

Open
Fyhyuky-FONTA opened this issue Jul 13, 2024 · 0 comments

Comments

@Fyhyuky-FONTA

```python
class EmbeddingLayer:
    ...
        if len(sparse_emb) > 0:
            sparse_exists = True
            sparse_emb = torch.cat(sparse_emb, dim=1)  # (batch_size, num_features, embed_dim)

        # Note: if the emb_dim of sparse features is different, we must squeeze_dim
        if squeeze_dim:
            if dense_exists and not sparse_exists:  # only input dense features
                return dense_values
            elif not dense_exists and sparse_exists:
                return sparse_emb.flatten(start_dim=1)  # squeeze dim to: (batch_size, num_features*embed_dim)
```

If `sparse_emb = torch.cat(sparse_emb, dim=1)  # (batch_size, num_features, embed_dim)` executes successfully, then all the sparse embedding vectors must already share the same dimension. Why, then, does this branch need the special note `# Note: if the emb_dim of sparse features is different, we must squeeze_dim`? How did the author team intend this to work?
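For reference, here is a minimal sketch of what I mean (toy tensors of my own, not code from this repo): `torch.cat(..., dim=1)` already requires every sparse embedding to have the same `embed_dim`, so the "different emb_dim" case would fail before it ever reaches the `squeeze_dim` branch; supporting mixed dims would seemingly require flattening each embedding before concatenating.

```python
import torch

# Same embed_dim for both sparse features: cat along dim=1 works and
# produces (batch_size, num_features, embed_dim), which can then be flattened.
emb_a = torch.randn(4, 1, 16)  # (batch_size, 1, embed_dim)
emb_b = torch.randn(4, 1, 16)
stacked = torch.cat([emb_a, emb_b], dim=1)   # (4, 2, 16) -> OK
flat = stacked.flatten(start_dim=1)          # (4, 32)

# Different embed_dims: the same cat raises a RuntimeError, so the
# squeeze_dim branch is never reached in this case.
emb_c = torch.randn(4, 1, 16)
emb_d = torch.randn(4, 1, 8)
try:
    torch.cat([emb_c, emb_d], dim=1)         # dim 2 mismatch (16 vs 8) -> error
except RuntimeError as e:
    print("cat failed:", e)

# To actually mix embed_dims, each embedding would have to be flattened
# first and concatenated along the last dimension instead:
mixed = torch.cat([emb_c.flatten(start_dim=1), emb_d.flatten(start_dim=1)], dim=1)  # (4, 24)
```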
