This repository has been archived by the owner on Jan 21, 2025. It is now read-only.

bias in selfAttention #253

Open
wintersurvival opened this issue Dec 3, 2020 · 0 comments

@wintersurvival

When running the transformer, there is no bias in `SelfAttention`, whereas mesh_tensorflow/bert does use a bias in its self-attention.
What is the meaning of the `relative_attention_type` argument of `transformer_layer.SelfAttention`?
How can I get a bias in `transformer_layer.SelfAttention`?
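For context on what such a bias would do: in T5-style relative attention (which the mesh_tensorflow transformer implements via relative-attention options), a learned scalar per relative-position bucket is added to the attention logits before the softmax. The sketch below is a simplified, self-contained NumPy illustration of that idea, not the mesh_tensorflow implementation; the function names and the linear bucketing scheme are hypothetical simplifications.

```python
import numpy as np

def relative_position_bias(length, num_buckets=8, rng=None):
    # Illustrative only: bucket relative positions and look up a
    # (here randomly initialized) stand-in for a learned bias per bucket.
    rng = np.random.default_rng(0) if rng is None else rng
    bias_table = rng.normal(size=num_buckets)   # stand-in for learned params
    pos = np.arange(length)
    rel = pos[None, :] - pos[:, None]           # relative distance matrix
    # Simplified bucketing: clip absolute distance into num_buckets bins
    # (the T5 scheme is log-spaced and direction-aware).
    buckets = np.clip(np.abs(rel), 0, num_buckets - 1)
    return bias_table[buckets]                  # shape [length, length]

def attention_with_bias(q, k, bias):
    # Bias is added to the query-key logits before the softmax.
    logits = q @ k.T + bias
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return weights / weights.sum(axis=-1, keepdims=True)
```

Usage: with `q` and `k` of shape `[length, d]`, `attention_with_bias(q, k, relative_position_bias(length))` returns row-stochastic attention weights whose logits have been shifted by the position-dependent bias.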
