fix the crash in Wan-AI/Wan2.2-TI2V-5B-Diffusers if CP is enabled #12562
Conversation
Signed-off-by: Wang, Yi <[email protected]>
torchrun --nproc-per-node 2 test.py crash stack:
@yiyixuxu @sayakpaul please help review
Could you also supplement an output with the fix?
```python
)
if ts_seq_len is not None:
    # Check if running under context parallel and split along seq_len dimension
    if hasattr(self, '_parallel_config') and self._parallel_config is not None:
```
Could you elaborate why this is needed?
When CP is enabled, seq_len is split. The timestep projection has shape [batch_size, seq_len, 6, inner_dim], so it should be split along dim 1 as well, since the hidden state is also split along the seq_len dim; otherwise a shape mismatch will occur.
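The mismatch can be illustrated with a small shape sketch. This is an illustrative model of the comment above, not code from diffusers; the function name and the concrete sizes are made up for the example.

```python
# Illustrative only: models how per-rank shapes diverge under context
# parallelism when the timestep projection is NOT split along seq_len.
def per_rank_shapes(batch_size, seq_len, inner_dim, world_size, split_timestep):
    """Return (hidden_states_shape, timestep_proj_shape) on one CP rank."""
    shard = seq_len // world_size
    hidden = (batch_size, shard, inner_dim)
    # In the 5B (TI2V) case timestep_proj is [batch, seq_len, 6, inner_dim];
    # unless it is also split along dim 1, its seq_len stops matching.
    ts_len = shard if split_timestep else seq_len
    timestep_proj = (batch_size, ts_len, 6, inner_dim)
    return hidden, timestep_proj

# Without the split, the seq dims disagree and broadcasting would fail:
hidden, ts = per_rank_shapes(1, 1024, 3072, 2, split_timestep=False)
print(hidden[1], ts[1])  # 512 1024

# With the split, both sides carry the same shard of seq_len:
hidden, ts = per_rank_shapes(1, 1024, 3072, 2, split_timestep=True)
print(hidden[1], ts[1])  # 512 512
```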
do you think we can just change _cp_plan? https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/transformers/transformer_wan.py#L546
You mean split timestep in forward? Adding

```python
"": {
    "timestep": ContextParallelInput(split_dim=1, split_output=False)
},
```

to `_cp_plan` will make 5B work, but 14B will fail, since the 5B timestep has 2 dims while the 14B timestep has 1 dim.
Hmm, this is an interesting situation. To tackle this, I think we might have to revisit the ContextParallelInput and ContextParallelOutput definitions a bit more.
If we had a way to tell the partitioner that the input might have "dynamic" dimensions depending on the model configs (like in this case), and what it should do if that's the case, it might be more flexible as a solution.
@DN6 curious to know what you think.
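One possible shape for that "dynamic dimension" idea, sketched below. This is a hypothetical design, not the actual diffusers API: `DynamicContextParallelInput` and its `resolve` method are invented names, and the real `ContextParallelInput` does not currently accept a callable.

```python
# Hypothetical sketch: let split_dim be either a fixed int or a callable
# that maps the incoming tensor's ndim to the dim to split (None = skip).
from dataclasses import dataclass
from typing import Callable, Optional, Union

@dataclass
class DynamicContextParallelInput:
    split_dim: Union[int, Callable[[int], Optional[int]]]
    split_output: bool = False

    def resolve(self, ndim: int) -> Optional[int]:
        """Pick the split dim for a tensor with `ndim` dimensions."""
        return self.split_dim(ndim) if callable(self.split_dim) else self.split_dim

# For Wan: the 5B timestep is 2-D (split along dim 1), the 14B timestep
# is 1-D (don't split at all).
spec = DynamicContextParallelInput(split_dim=lambda ndim: 1 if ndim == 2 else None)
print(spec.resolve(2), spec.resolve(1))  # 1 None
```

The partitioner would call `resolve` with the actual tensor's ndim at split time, so one plan entry could serve both model variants.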
That makes a lot of things easier, for sure!
`expand_timesteps` is not passed in the `WanTransformer3DModel` init, so there is currently no way to tell whether `timestep.dim` is 2 or 1.
We can add a config if we want to.
I want to hear @DN6's thoughts on this first, though.
Potentially, we can add cp_plan directly as a config and allow the model owner to override it, I think (in this case, we would send a PR to the Wan repo; I think it'd be OK).
It's also very much in line with how transformers does it, btw.
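A hedged sketch of what that could look like: if the plan were promoted to a config entry, a model repo could ship its own override in the transformer's `config.json`. The key names and nesting below are hypothetical, not an existing diffusers schema.

```json
{
  "_class_name": "WanTransformer3DModel",
  "_cp_plan": {
    "": {
      "timestep": { "split_dim": 1, "split_output": false }
    }
  }
}
```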
Thanks! Could you please explain the changes and also provide an example output?

Fix the crash when testing CP for Wan2.2-TI2V-5B.
test script: