```python
class EmbeddingLayer:
    ...
    if len(sparse_emb) > 0:
        sparse_exists = True
        sparse_emb = torch.cat(sparse_emb, dim=1)  # (batch_size, num_features, embed_dim)
        # Note: if the emb_dim of sparse features is different, we must squeeze_dim
    if squeeze_dim:
        if dense_exists and not sparse_exists:  # only input dense features
            return dense_values
        elif not dense_exists and sparse_exists:
            return sparse_emb.flatten(start_dim=1)  # squeeze dim to: (batch_size, num_features*embed_dim)
```
If `sparse_emb = torch.cat(sparse_emb, dim=1)  # (batch_size, num_features, embed_dim)` executes successfully, that already implies all of the embedding vectors have the same dimension. Why, then, does the code specifically note `# Note: if the emb_dim of sparse features is different, we must squeeze_dim`? How did the author team intend this to work?
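For context on the constraint the question points at, here is a minimal sketch (not the library's code) showing that `torch.cat` along `dim=1` requires every other dimension to match, so stacking embeddings of different `embed_dim` into `(batch_size, num_features, embed_dim)` fails, whereas flattening each embedding to `(batch_size, dim_i)` first and concatenating on the last dim works for mixed dims. The tensor names and dims below are illustrative assumptions, not values from the repo:

```python
import torch

batch_size = 4
# Two sparse features whose embeddings have *different* dims (8 vs. 16) -- hypothetical values.
emb_a = torch.randn(batch_size, 1, 8)
emb_b = torch.randn(batch_size, 1, 16)

# torch.cat along dim=1 requires all other dims to match,
# so this raises a RuntimeError because 8 != 16 on the last dim:
try:
    stacked = torch.cat([emb_a, emb_b], dim=1)
except RuntimeError as e:
    print("cat on dim=1 fails:", e)

# Flattening each embedding to (batch_size, dim_i) first, then
# concatenating on the last dim, works regardless of per-feature embed_dim:
flat = torch.cat([e.flatten(start_dim=1) for e in (emb_a, emb_b)], dim=1)
print(flat.shape)  # torch.Size([4, 24])
```

Note that in the quoted excerpt the `cat` on `dim=1` happens *before* any flattening, which is exactly why the comment seems contradictory: if the dims differed, the `cat` would already have raised.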