Add batch divisibility check for VAE input sharding (#316)
* Adding check for batch size divisibility before sharding video condition tensor
* pyink checks
* Removed unused var
* Moving commit retrieval to before JAX setup init
* fix
* replaced boundary_timestep with ratio in wan2.2 t2v
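A minimal sketch of the divisibility check this PR describes, before the video condition tensor is sharded across devices. The function and argument names here are hypothetical illustrations, not the repo's actual API:

```python
def check_batch_divisible(batch_size: int, num_shards: int, name: str = "video condition tensor") -> None:
  """Hypothetical helper: fail fast if the batch cannot be evenly sharded.

  Sharding a tensor whose batch dimension is not a multiple of the number
  of devices along the sharding axis would silently drop or pad examples,
  so we raise a clear error up front instead.
  """
  if batch_size % num_shards != 0:
    raise ValueError(
        f"Batch size {batch_size} of {name} is not divisible by the "
        f"sharding axis size {num_shards}; cannot shard evenly."
    )


# Example: a batch of 8 shards evenly across 4 devices, a batch of 7 does not.
check_batch_divisible(8, 4)
```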
The PR also wraps a long logging call to satisfy the pyink formatter:

```python
max_logging.log(
    f"Adding sequence sharding to q and kv if not already present because {raw_keys['attention']}=='ring' or {raw_keys['attention_sharding_uniform']} is set."
)
```