Commit 768416a

Fix pylink error
1 parent 7375d6e commit 768416a

1 file changed

Lines changed: 1 addition & 1 deletion

File tree

src/maxdiffusion/models/ltx2/attention_ltx2.py

@@ -478,7 +478,7 @@ def __call__(

     # 4. Attention
     # NNXAttentionOp expects flattened input [B, S, InnerDim] for flash kernel
-    attn_output = self.attention_op.apply_attention(query=query, key=key, value=value, attention_mask=attention_mask)
+    attn_output = self.attention_op.apply_attention(query=query, key=key, value=value)

     # 7. Output Projection
     hidden_states = self.to_out(attn_output)
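The change removes the `attention_mask` keyword from the `apply_attention` call so the call site matches the method's signature. A minimal sketch of why the old call fails, using a hypothetical stand-in class (not the real maxdiffusion `NNXAttentionOp`):

```python
class AttentionOp:
    # Hypothetical stand-in: accepts only query/key/value,
    # like the signature the commit aligns the call site with.
    def apply_attention(self, query, key, value):
        return (query, key, value)


op = AttentionOp()

# Old call site: passes attention_mask, a keyword the signature lacks,
# so Python raises TypeError (and static checkers flag the same mismatch).
try:
    op.apply_attention(query=1, key=2, value=3, attention_mask=None)
except TypeError as exc:
    print("rejected:", exc)

# New call site: matches the signature exactly.
print(op.apply_attention(query=1, key=2, value=3))
```

The same mismatch is what a linter or type checker reports statically at the call site, which is why dropping the unused keyword silences the error without changing runtime behavior.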

0 commit comments
