Commit 7d214a3

Authored by shucongzhang (Shucong Zhang/Embedded AI /SRUK/Engineer/Samsung Electronics) and co-authors
fix unnecessary pos_emb for RoPE (#2963)
Co-authored-by: Shucong Zhang/Embedded AI /SRUK/Engineer/Samsung Electronics <s1.zhang@sruk-ccn4.eu.corp.samsungelectronics.net>
Parent: 308527c

1 file changed

Lines changed: 1 addition & 4 deletions

speechbrain/lobes/models/transformer/Conformer.py
@@ -802,10 +802,7 @@ def forward_streaming(
             The attention values.
         """

-        if (
-            self.attention_type == "RelPosMHAXL"
-            or self.attention_type == "RoPEMHA"
-        ):
+        if self.attention_type == "RelPosMHAXL":
            if pos_embs is None:
                raise ValueError(
                    f"The chosen attention type for the Conformer is {self.attention_type}. For this attention type, the positional embeddings are mandatory"
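The rationale for this fix: unlike RelPosMHAXL, which requires a positional-embedding tensor to be passed in as `pos_embs`, rotary position embedding (RoPE) derives its rotation angles from position indices inside the attention module itself, so demanding an external `pos_embs` for `RoPEMHA` was unnecessary. A minimal sketch of that rotation (a hypothetical standalone helper, not SpeechBrain's actual RoPEMHA implementation):

```python
import torch

def rope_rotate(x: torch.Tensor) -> torch.Tensor:
    """Apply a rotary position embedding to x of shape (batch, seq, dim).

    The angles are computed from position indices internally, which is
    why RoPE-style attention needs no external `pos_embs` tensor.
    """
    _, seq, dim = x.shape
    half = dim // 2
    # Inverse frequencies, following the usual 10000^(-2i/dim) schedule.
    inv_freq = 1.0 / (10000 ** (torch.arange(half, dtype=torch.float32) / half))
    pos = torch.arange(seq, dtype=torch.float32)
    angles = torch.outer(pos, inv_freq)  # (seq, half), one angle per position/freq
    cos, sin = angles.cos(), angles.sin()
    # Rotate feature pairs (x1[i], x2[i]) by the position-dependent angle.
    x1, x2 = x[..., :half], x[..., half:]
    return torch.cat([x1 * cos - x2 * sin, x1 * sin + x2 * cos], dim=-1)
```

Because the rotation at position 0 uses angle 0, the first time step passes through unchanged, and since each feature pair is rotated rigidly, per-token norms are preserved.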

0 commit comments