CLI --lr_scheduler doesn't work? #18663

@profPlum

Bug description

When using LightningCLI with `--lr_scheduler pytorch_lightning.cli.ReduceLROnPlateau --lr_scheduler.monitor=epoch`, the learning rate does not change during training (even when progress is stalled). I've verified this with the learning-rate monitor across 10 different experiments, and the LR never changes.
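For reference, this is the behavior that never kicks in: outside of Lightning, `ReduceLROnPlateau` lowers the learning rate once the monitored value has stalled for more than `patience` steps. A minimal plain-PyTorch sketch (hyperparameter values are illustrative, not from the report):

```python
import torch

# A dummy parameter/optimizer pair just to drive the scheduler.
param = torch.nn.Parameter(torch.zeros(1))
opt = torch.optim.SGD([param], lr=0.1)
sched = torch.optim.lr_scheduler.ReduceLROnPlateau(
    opt, mode="min", factor=0.1, patience=2
)

# Feed a metric that never improves: after `patience` bad epochs,
# the scheduler multiplies the LR by `factor`.
for _ in range(5):
    sched.step(1.0)

print(opt.param_groups[0]["lr"])  # lr has dropped below the initial 0.1
```

In the bug above, the CLI-configured scheduler apparently never gets stepped at all, so even a permanently stalled metric leaves the LR untouched.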

What version are you seeing the problem on?

v1.9

How to reproduce the bug

Use LightningCLI with `--lr_scheduler pytorch_lightning.cli.ReduceLROnPlateau --lr_scheduler.monitor=epoch` together with:

def configure_optimizers(self):
    return torch.optim.Adam(self.parameters(), lr=self.learning_rate)
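For comparison, wiring the scheduler by hand inside `configure_optimizers` uses Lightning's documented dict return shape, where the scheduler is wrapped with a `"monitor"` key. The sketch below shows only that return shape; the stand-in class and the `"val_loss"` metric name are hypothetical (it avoids subclassing `LightningModule` so it runs with plain torch):

```python
import torch

class TinyModule:
    """Stand-in for a LightningModule; illustrates only the return
    shape of configure_optimizers when attaching ReduceLROnPlateau
    manually (hypothetical example, not the CLI-injected config)."""

    def __init__(self):
        self.net = torch.nn.Linear(4, 1)
        self.learning_rate = 1e-3

    def parameters(self):
        return self.net.parameters()

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=self.learning_rate)
        scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
            optimizer, mode="min", factor=0.1, patience=2
        )
        # Lightning expects the scheduler wrapped in a dict with a
        # "monitor" key naming the logged metric that drives plateau
        # detection.
        return {
            "optimizer": optimizer,
            "lr_scheduler": {"scheduler": scheduler, "monitor": "val_loss"},
        }

cfg = TinyModule().configure_optimizers()
print(sorted(cfg["lr_scheduler"].keys()))  # ['monitor', 'scheduler']
```

The question in this issue is why the equivalent configuration passed via `--lr_scheduler` on the CLI does not end up having the same effect.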

Error messages and logs

(Screenshots of the LR-monitor output were attached in the original issue.)

Environment

Current environment

- PyTorch Lightning Version: 1.9.0
- PyTorch Version: 1.10.0
- Python version: 3.8.16
- OS: Linux
- GPU models and configuration: V100
- How you installed Lightning (`conda`, `pip`, source): pip

More info

I've seen it happen in version 1.9.0 but it's quite possible it applies to master as well.

cc @Borda @carmocca @mauvilsa


Labels

bug, docs, help wanted, lightningcli, ver: 1.9.x
