[Docs] After resuming training, changing param_scheduler has no effect #1565

Open

zhangchao-s opened this issue Jul 25, 2024 · 1 comment

@zhangchao-s
📚 The doc issue

In my work, I had already trained for 48 epochs, and I wanted to resume from epoch_48.pth and continue training up to 56 epochs. But after I changed param_scheduler in the config and resumed, the learning rate in the log did not change. What is the problem, and how can I solve it?

Before changing the config:
```python
param_scheduler = [
    dict(
        type='LinearLR',
        start_factor=1.0 / 3,
        by_epoch=False,
        begin=0,
        end=500),
    dict(
        type='MultiStepLR',
        begin=0,
        end=48,
        by_epoch=True,
        milestones=[32, 44],
        gamma=0.1),
]
```

After changing the config (what I want):
```python
param_scheduler = [
    dict(
        type='LinearLR',
        start_factor=1.0 / 3,
        by_epoch=False,
        begin=0,
        end=500),
    dict(
        type='MultiStepLR',
        begin=0,
        end=56,
        by_epoch=True,
        milestones=[32, 44, 56],
        gamma=0.1),
]
```

[Screenshot: training log after resuming, showing the learning rate unchanged]

At epoch 48 the learning rate was 1e-5; after changing the configuration and continuing training, the remaining epochs still ran at 1e-5, with no change.
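A likely explanation (an assumption about the mechanism, not confirmed in this thread): when training is resumed from a checkpoint, each scheduler's saved state_dict is restored, and for PyTorch's MultiStepLR the state_dict includes the milestones and last_epoch, so the values from the old run overwrite whatever the new config built. A minimal plain-PyTorch sketch of that behavior:

```python
# Minimal sketch of the suspected mechanism (plain PyTorch, not MMEngine):
# a scheduler's state_dict carries milestones/last_epoch, so loading a
# checkpointed state overwrites whatever the new config specified.
import torch

param = torch.zeros(1, requires_grad=True)
opt = torch.optim.SGD([param], lr=0.02)

# Scheduler from the old run, as it would be saved in epoch_48.pth.
old = torch.optim.lr_scheduler.MultiStepLR(opt, milestones=[32, 44], gamma=0.1)
saved = old.state_dict()  # includes 'milestones' and 'last_epoch'

# Scheduler rebuilt from the NEW config, then resumed from the checkpoint.
new = torch.optim.lr_scheduler.MultiStepLR(opt, milestones=[32, 44, 56], gamma=0.1)
new.load_state_dict(saved)

print(new.milestones)  # Counter({32: 1, 44: 1}) -- the added 56 is gone
```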

Suggest a potential alternative/fix

No response

@zhangchao-s (Author)

How do I make sure that, when resuming training, the learning rate scheduler is reset so that the new configuration takes effect?
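One possible workaround, as a sketch under the assumption that your MMEngine version's Runner.resume exposes a resume_param_scheduler switch (the config file and checkpoint paths below are placeholders): resume the weights and optimizer state manually, but skip the saved scheduler state so the schedulers built from the new config are kept.

```python
# A sketch, assuming MMEngine's Runner.resume(..., resume_param_scheduler=...)
# switch is available; paths and cfg are placeholders for your own setup.
from mmengine.config import Config
from mmengine.runner import Runner

cfg = Config.fromfile('my_config.py')  # already contains the NEW param_scheduler
runner = Runner.from_cfg(cfg)

# Restore weights and optimizer state, but do NOT restore the saved
# scheduler state, so the schedulers built from the new config take effect.
runner.resume(
    'work_dirs/my_exp/epoch_48.pth',
    resume_optimizer=True,
    resume_param_scheduler=False,
)
runner.train()
```

Note that with the saved state skipped, each scheduler's internal step counter restarts from zero at the resume point, so milestones may need to be expressed relative to epoch 48 rather than to epoch 0.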

zhangchao-s changed the title from "[Docs] after load_checkpoint, change param_scheduler is no effective" to "[Docs] after resume training, change param_scheduler is no effective" on Jul 25, 2024