
Commit
docs/src/index.md: Update also docs for RoPE 4D dims.
mashu committed Nov 26, 2024
1 parent 6ef1f05 commit 697fcdb
Showing 1 changed file with 4 additions and 6 deletions: docs/src/index.md
@@ -25,11 +25,11 @@ Modules = [PositionalEmbeddings]
 RoPE encodes positional information by applying rotations to feature vectors:
 
 ```julia
-# Create RoPE for features of dimension 512 and maximum sequence length of 1024
+# Create RoPE for a head dimension of 512 and a maximum sequence length of 1024
 rope = RoPE(512, 1024)
 
-# Apply to any feature tensor of shape (features, sequence_length, batch)
-features = randn(Float32, 512, 100, 32)
+# Apply to any feature tensor of shape (head_dim, n_heads, sequence_length, batch)
+features = randn(Float32, 512, 2, 100, 32)
 features_with_pos = rope(features)
 ```
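The rotation the docs refer to can be sketched as follows. This is an illustrative Python sketch of the standard RoPE recipe (consecutive feature pairs rotated by a position-dependent angle), not the PositionalEmbeddings.jl implementation; the name `rope_rotate` and the pairwise layout are assumptions for illustration.

```python
import math

def rope_rotate(features, base=10000.0):
    # features[pos] is the feature vector at sequence position `pos`.
    # Each consecutive pair (x[2i], x[2i+1]) is rotated by an angle
    # theta_i = pos * base^(-2i / head_dim), so the rotation encodes position.
    head_dim = len(features[0])
    out = []
    for pos, x in enumerate(features):
        y = list(x)
        for i in range(head_dim // 2):
            theta = pos * base ** (-2 * i / head_dim)
            c, s = math.cos(theta), math.sin(theta)
            y[2 * i]     = x[2 * i] * c - x[2 * i + 1] * s
            y[2 * i + 1] = x[2 * i] * s + x[2 * i + 1] * c
        out.append(y)
    return out
```

Position 0 is left unchanged (all angles are zero), and later positions are rotated progressively more, which is what makes attention scores sensitive to relative offsets.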

@@ -96,9 +96,7 @@ function (rmha::RoPEMultiHeadAttention)(q_in::A3, k_in::A3, v_in::A3, bias=nothing)
     k = mha.k_proj(k_in)
     v = mha.v_proj(v_in)
 
-    # Apply RoPE
-    q = rmha.rope(q)
-    k = rmha.rope(k)
+    # Apply RoPE to Q and K (TODO)
 
     # Compute attention
     x, α = NNlib.dot_product_attention(q, k, v, bias;
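The step this hunk leaves as a TODO — rotating Q and K before the dot-product attention — can be sketched as below. This is a hedged Python illustration of the general idea, not the package's code; `rotate_head` and `attention_scores` are hypothetical names introduced here.

```python
import math

def rotate_head(x, pos, base=10000.0):
    # Rotate one head_dim-sized vector by its position, pairwise as in RoPE.
    d = len(x)
    y = list(x)
    for i in range(d // 2):
        theta = pos * base ** (-2 * i / d)
        c, s = math.cos(theta), math.sin(theta)
        y[2 * i], y[2 * i + 1] = (x[2 * i] * c - x[2 * i + 1] * s,
                                  x[2 * i] * s + x[2 * i + 1] * c)
    return y

def attention_scores(q_seq, k_seq):
    # Dot-product scores after rotating both Q and K: the score for a
    # (query, key) pair then depends only on their relative offset.
    q_rot = [rotate_head(q, p) for p, q in enumerate(q_seq)]
    k_rot = [rotate_head(k, p) for p, k in enumerate(k_seq)]
    return [[sum(a * b for a, b in zip(q, k)) for k in k_rot] for q in q_rot]
```

With identical vectors at every position, the score for positions (0, 1) equals the score for (1, 2): the rotations cancel down to the relative distance, which is the property RoPE is designed to provide.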
