
[BUG] Import Error: name '_disable_dynamo_if_unsupported' is not defined #7514

@Thinksky5124

Description


Describe the bug
With torch < 2.4.0 and deepspeed >= 0.17.5, import deepspeed raises an error:

Python 3.10.15 (main, Oct  3 2024, 07:27:34) [GCC 11.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import deepspeed
[2025-08-25 10:30:30,981] [INFO] [real_accelerator.py:260:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[WARNING] ZenFlow disabled: torch internal optimizer symbols could not be imported: cannot import name '_disable_dynamo_if_unsupported' from 'torch.optim.optimizer' (/home/users/wujun.wen/miniconda3/envs/torch230_py310/lib/python3.10/site-packages/torch/optim/optimizer.py)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/users/wujun.wen/miniconda3/envs/torch230_py310/lib/python3.10/site-packages/deepspeed/__init__.py", line 25, in <module>
    from . import ops
  File "/home/users/wujun.wen/miniconda3/envs/torch230_py310/lib/python3.10/site-packages/deepspeed/ops/__init__.py", line 6, in <module>
    from . import adam
  File "/home/users/wujun.wen/miniconda3/envs/torch230_py310/lib/python3.10/site-packages/deepspeed/ops/adam/__init__.py", line 9, in <module>
    from .zenflow_torch_adam import ZenFlowSelectiveAdamW
  File "/home/users/wujun.wen/miniconda3/envs/torch230_py310/lib/python3.10/site-packages/deepspeed/ops/adam/zenflow_torch_adam.py", line 685, in <module>
    @_disable_dynamo_if_unsupported(single_tensor_fn=_single_tensor_adamw)
NameError: name '_disable_dynamo_if_unsupported' is not defined

This is caused by a wrong torch version check in deepspeed/ops/adam/zenflow_torch_adam.py, starting around L13:

from deepspeed.utils.torch import required_torch_version

# Check if we have PyTorch >= 2.0 for ZenFlow features
_ZENFLOW_AVAILABLE = required_torch_version(min_version=2.1)

if _ZENFLOW_AVAILABLE:
    try:
        from torch.optim.optimizer import (
            _default_to_fused_or_foreach,
            _disable_dynamo_if_unsupported,
            _get_capturable_supported_devices,
            _get_value,
            _stack_if_compiling,
            _view_as_real,
            DeviceDict,
            Optimizer,
        )
    except ImportError as e:
        print(f"[WARNING] ZenFlow disabled: torch internal optimizer symbols could not be imported: {e}")
        _ZENFLOW_AVAILABLE = False
else:
    # safe disable dynamo if unsupported
    def _disable_dynamo_if_unsupported(**kwargs):

        def wrapper(fn):
            return fn

        return wrapper

    _ZENFLOW_AVAILABLE = False

The function _disable_dynamo_if_unsupported is only available in torch >= 2.4.0, so the check min_version=2.1 is too permissive. In addition, the try/except branch does not register the fallback _disable_dynamo_if_unsupported when the ImportError occurs, so the decorator used later in the file raises NameError at import time.
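
A possible fix (a minimal sketch, not an official patch) is to gate the real import on torch >= 2.4 and to define the no-op fallback decorator whenever the real symbol is not available, including when the ImportError path is taken, so that importing deepspeed no longer fails on older torch:

from deepspeed.utils.torch import required_torch_version

# _disable_dynamo_if_unsupported only exists in torch >= 2.4,
# so gate the import on that version instead of 2.1.
_ZENFLOW_AVAILABLE = required_torch_version(min_version=2.4)

if _ZENFLOW_AVAILABLE:
    try:
        from torch.optim.optimizer import (
            _default_to_fused_or_foreach,
            _disable_dynamo_if_unsupported,
            _get_capturable_supported_devices,
            _get_value,
            _stack_if_compiling,
            _view_as_real,
            DeviceDict,
            Optimizer,
        )
    except ImportError as e:
        print(f"[WARNING] ZenFlow disabled: torch internal optimizer symbols could not be imported: {e}")
        _ZENFLOW_AVAILABLE = False

if not _ZENFLOW_AVAILABLE:
    # Fallback is defined in every case where the real symbol was not
    # imported (old torch or failed import), so the
    # @_disable_dynamo_if_unsupported decorator below never raises NameError.
    def _disable_dynamo_if_unsupported(**kwargs):

        def wrapper(fn):
            return fn

        return wrapper

With this structure, ZenFlow is simply disabled on torch < 2.4.0 instead of breaking import deepspeed.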


Labels: bug (Something isn't working), training
