Corrected Attention docstring
Corrected the Attention docstring: the head dimension was missing from the documented shapes of the mask and nonbatched_bias arguments.
louis-rf authored Feb 5, 2024
1 parent 632ef57 commit cb98745
1 changed file: alphafold/model/modules.py (2 additions, 2 deletions)
@@ -558,8 +558,8 @@ def __call__(self, q_data, m_data, mask, nonbatched_bias=None):
       q_data: A tensor of queries, shape [batch_size, N_queries, q_channels].
       m_data: A tensor of memories from which the keys and values are
         projected, shape [batch_size, N_keys, m_channels].
-      mask: A mask for the attention, shape [batch_size, N_queries, N_keys].
-      nonbatched_bias: Shared bias, shape [N_queries, N_keys].
+      mask: A mask for the attention, shape [batch_size, N_heads, N_queries, N_keys].
+      nonbatched_bias: Shared bias, shape [N_heads, N_queries, N_keys].
     Returns:
       A float32 tensor of shape [batch_size, N_queries, output_dim].
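The corrected shapes matter because both tensors are applied to the per-head attention logits, which have shape [batch_size, N_heads, N_queries, N_keys]: the mask carries a full head axis, while nonbatched_bias omits only the batch axis and is broadcast over it. A minimal NumPy sketch of that broadcasting (hypothetical function and variable names; the actual AlphaFold module is written in JAX/Haiku and also applies learned projections):

```python
import numpy as np

def attention_weights(q, k, mask, nonbatched_bias=None):
    """Per-head attention weights with masking and an optional shared bias.

    q:    [batch, heads, N_queries, key_dim]
    k:    [batch, heads, N_keys, key_dim]
    mask: [batch, heads, N_queries, N_keys]  (1 = keep, 0 = mask out)
    nonbatched_bias: [heads, N_queries, N_keys], broadcast over the batch axis.
    """
    # Scaled dot-product logits: [batch, heads, N_queries, N_keys].
    logits = np.einsum('bhqd,bhkd->bhqk', q, k) / np.sqrt(q.shape[-1])
    if nonbatched_bias is not None:
        # Add a leading batch axis of size 1 so the bias broadcasts.
        logits = logits + nonbatched_bias[None]
    # Masked positions get a large negative logit -> ~0 weight after softmax.
    logits = np.where(mask > 0, logits, -1e9)
    # Softmax over the key axis.
    w = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return w / w.sum(axis=-1, keepdims=True)

batch, heads, nq, nk, d = 2, 4, 5, 7, 8
rng = np.random.default_rng(0)
q = rng.standard_normal((batch, heads, nq, d))
k = rng.standard_normal((batch, heads, nk, d))
mask = np.ones((batch, heads, nq, nk))
bias = rng.standard_normal((heads, nq, nk))
weights = attention_weights(q, k, mask, bias)
```

Note that the bias is "nonbatched" precisely because it lacks the batch axis; if it were documented without the head axis as well, a per-head bias such as the pair-representation bias could not be added to the logits without an extra reshape.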
