Device error on 0.0.27.dev844 #1064

@Cospui

Description


🐛 Bug

[rank1]: ValueError: Attention bias and Query/Key/Value should be on the same device
[rank1]:   query.device: cuda:1
[rank1]:   attn_bias   : cuda:0

Command

To Reproduce

Steps to reproduce the behavior:

out = xformers.ops.fmha.memory_efficient_attention(q, k, v, attn_bias=attn_bias)

The code works fine on versions 0.0.27.dev840 and 0.0.26.post1. After upgrading to 0.0.27.dev844, the same call fails with the error above.
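The traceback is consistent with an input-validation check on the attention bias's device. A minimal sketch of that check, using plain-Python stand-ins for tensors (`FakeTensor` and `check_same_device` are hypothetical illustrations, not xformers internals):

```python
from dataclasses import dataclass

# Hypothetical stand-in for a tensor; only the .device attribute matters here.
@dataclass
class FakeTensor:
    device: str

def check_same_device(query, key, value, attn_bias=None):
    """Raise ValueError when attn_bias sits on a different device than the
    query, mirroring the error message in the traceback above."""
    if attn_bias is not None and attn_bias.device != query.device:
        raise ValueError(
            "Attention bias and Query/Key/Value should be on the same device\n"
            f"  query.device: {query.device}\n"
            f"  attn_bias   : {attn_bias.device}"
        )

# Reproduce the reported mismatch: q/k/v on cuda:1, bias on cuda:0.
q = k = v = FakeTensor("cuda:1")
try:
    check_same_device(q, k, v, attn_bias=FakeTensor("cuda:0"))
except ValueError:
    print("mismatch detected")

# Workaround sketch: place the bias on the query's device before the call.
# With real torch tensors this would be: attn_bias = attn_bias.to(q.device)
check_same_device(q, k, v, attn_bias=FakeTensor(q.device))  # no error
print("ok")
```

Whether moving `attn_bias` to `q.device` is a valid workaround, or whether the newer build misplaces the bias internally under multi-GPU ranks, is for the maintainers to confirm.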

Expected behavior

Environment

Please copy and paste the output from the
environment collection script from PyTorch
(or fill out the checklist below manually).

You can run the script with:

# For security purposes, please check the contents of collect_env.py before running it.
python -m torch.utils.collect_env
  • PyTorch Version (e.g., 1.0):
  • OS (e.g., Linux):
  • How you installed PyTorch (conda, pip, source):
  • Build command you used (if compiling from source):
  • Python version:
  • CUDA/cuDNN version:
  • GPU models and configuration:
  • Any other relevant information:

Additional context
