Commit 618550c

Commit message: remove commented out line.
1 parent 39b38cf commit 618550c

1 file changed: 1 addition & 1 deletion

File tree

src/maxdiffusion/models/attention_flax.py

@@ -206,7 +206,7 @@ def cudnn_flash_attention(
       def wrap_flash_attention(query, key, value):
         return jax.vmap(self.dpa_layer)(query, key, value, mask=None)

-      out = wrap_flash_attention(query, key, value)#self.dpa_layer(query, key, value, mask=None)
+      out = wrap_flash_attention(query, key, value)
       return self.reshape_data_from_cudnn_flash(out)

   def apply_attention_dot(self, query: Array, key: Array, value: Array):
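The retained code wraps the per-example attention layer in `jax.vmap` so it maps over the leading batch axis. A minimal sketch of the same pattern, with a hypothetical plain scaled dot-product `dpa_layer` standing in for the real cuDNN-backed layer (shapes assumed here: per-example `[heads, seq, head_dim]`):

```python
import jax
import jax.numpy as jnp

# Hypothetical stand-in for self.dpa_layer: plain scaled dot-product
# attention over ONE batch element, shapes [heads, seq, head_dim].
def dpa_layer(query, key, value, mask=None):
    scale = query.shape[-1] ** -0.5
    scores = jnp.einsum("hqd,hkd->hqk", query, key) * scale
    if mask is not None:
        # Masked positions get a large negative score before softmax.
        scores = jnp.where(mask, scores, -1e9)
    weights = jax.nn.softmax(scores, axis=-1)
    return jnp.einsum("hqk,hkd->hqd", weights, value)

# As in the commit: vmap maps the per-example layer over the batch axis,
# so inputs are [batch, heads, seq, head_dim].
def wrap_flash_attention(query, key, value):
    return jax.vmap(dpa_layer)(query, key, value, mask=None)

# Example: batch of 2, 4 heads, sequence length 8, head dim 16.
q = jax.random.normal(jax.random.PRNGKey(0), (2, 4, 8, 16))
out = wrap_flash_attention(q, q, q)  # shape (2, 4, 8, 16)
```

Passing `mask=None` is harmless under `vmap` because `None` is an empty pytree with no array leaves to map over; a real mask would be batched along axis 0 like the other inputs.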
