
ding.torch_utils.lr_scheduler

get_lr_ratio(epoch, warmup_epochs, learning_rate, lr_decay_epochs, min_lr)

Overview

Get learning rate ratio for each epoch.

Arguments:

- epoch (:obj:`int`): Current epoch.
- warmup_epochs (:obj:`int`): Number of warmup epochs.
- learning_rate (:obj:`float`): Base learning rate, reached at the end of warmup.
- lr_decay_epochs (:obj:`int`): Epoch at which cosine decay ends; beyond it the ratio stays at min_lr / learning_rate.
- min_lr (:obj:`float`): Minimum learning rate.
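The returned ratio is a multiplier on the base learning rate, with three phases: linear warmup, cosine decay, and a constant floor. The sketch below inlines a standalone copy of get_lr_ratio (mirroring the listing under "Full Source Code", so it runs without installing DI-engine) and evaluates each phase; the hyperparameter values are illustrative only.

```python
import math


def get_lr_ratio(epoch, warmup_epochs, learning_rate, lr_decay_epochs, min_lr):
    """Standalone copy of the documented function, for illustration."""
    # 1) linear warmup: ratio ramps from 0 up to 1
    if epoch < warmup_epochs:
        return epoch / warmup_epochs
    # 2) past the decay horizon: clamp at the minimum ratio
    if epoch > lr_decay_epochs:
        return min_lr / learning_rate
    # 3) in between: cosine decay from learning_rate down to min_lr
    decay_ratio = (epoch - warmup_epochs) / (lr_decay_epochs - warmup_epochs)
    coefficient = 0.5 * (1.0 + math.cos(math.pi * decay_ratio))
    return (min_lr + coefficient * (learning_rate - min_lr)) / learning_rate


lr, min_lr = 1e-3, 6e-5
print(get_lr_ratio(2, 5, lr, 100, min_lr))    # warmup phase: 2 / 5 = 0.4
print(get_lr_ratio(100, 5, lr, 100, min_lr))  # end of decay: min_lr / lr = 0.06
print(get_lr_ratio(500, 5, lr, 100, min_lr))  # after decay: held at min_lr / lr
```

Note that the ratio is continuous at both boundaries: at epoch == warmup_epochs the cosine branch returns 1.0, and at epoch == lr_decay_epochs it returns min_lr / learning_rate.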

cos_lr_scheduler(optimizer, learning_rate, warmup_epochs=5, lr_decay_epochs=100, min_lr=6e-05)

Overview

Cosine learning rate scheduler.

Arguments:

- optimizer (:obj:`torch.optim.Optimizer`): Optimizer whose learning rate is scheduled.
- learning_rate (:obj:`float`): Base learning rate.
- warmup_epochs (:obj:`float`): Number of warmup epochs.
- lr_decay_epochs (:obj:`float`): Epoch at which cosine decay ends.
- min_lr (:obj:`float`): Minimum learning rate.
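A typical way to wire the scheduler into a training loop is sketched below. The linear model, Adam optimizer, and hyperparameters are placeholders; get_lr_ratio is inlined from the listing under "Full Source Code" so the snippet runs without installing DI-engine, and the LambdaLR construction shown is exactly what cos_lr_scheduler(optimizer, learning_rate) returns.

```python
import math
from functools import partial

import torch
from torch.optim.lr_scheduler import LambdaLR


def get_lr_ratio(epoch, warmup_epochs, learning_rate, lr_decay_epochs, min_lr):
    """Same ratio function as in the listing; inlined for self-containment."""
    if epoch < warmup_epochs:
        return epoch / warmup_epochs
    if epoch > lr_decay_epochs:
        return min_lr / learning_rate
    decay_ratio = (epoch - warmup_epochs) / (lr_decay_epochs - warmup_epochs)
    coefficient = 0.5 * (1.0 + math.cos(math.pi * decay_ratio))
    return (min_lr + coefficient * (learning_rate - min_lr)) / learning_rate


# Placeholder model; any torch.optim.Optimizer can be scheduled the same way.
model = torch.nn.Linear(4, 2)
learning_rate = 1e-3
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)

# Equivalent to: scheduler = cos_lr_scheduler(optimizer, learning_rate)
scheduler = LambdaLR(
    optimizer,
    partial(
        get_lr_ratio,
        warmup_epochs=5,
        lr_decay_epochs=100,
        min_lr=6e-5,
        learning_rate=learning_rate,
    )
)

# At construction the lr is scaled by get_lr_ratio(0) == 0, i.e. warmup
# starts from zero; call scheduler.step() once per epoch to advance it.
for epoch in range(10):
    # ... one epoch of training here ...
    scheduler.step()

current_lr = optimizer.param_groups[0]['lr']
```

Because the scheduler is epoch-based, scheduler.step() is called once per epoch (after the optimizer steps for that epoch), not once per batch.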

Full Source Code

../ding/torch_utils/lr_scheduler.py

```python
from functools import partial
import math

import torch.optim
from torch.optim.lr_scheduler import LambdaLR


def get_lr_ratio(epoch: int, warmup_epochs: int, learning_rate: float, lr_decay_epochs: int, min_lr: float) -> float:
    """
    Overview:
        Get learning rate ratio for each epoch.
    Arguments:
        - epoch (:obj:`int`): Current epoch.
        - warmup_epochs (:obj:`int`): Warmup epochs.
        - learning_rate (:obj:`float`): Learning rate.
        - lr_decay_epochs (:obj:`int`): Learning rate decay epochs.
        - min_lr (:obj:`float`): Minimum learning rate.
    """

    # 1) linear warmup for warmup_epochs.
    if epoch < warmup_epochs:
        return epoch / warmup_epochs
    # 2) if epoch > lr_decay_epochs, return min learning rate
    if epoch > lr_decay_epochs:
        return min_lr / learning_rate
    # 3) in between, use cosine decay down to min learning rate
    decay_ratio = (epoch - warmup_epochs) / (lr_decay_epochs - warmup_epochs)
    assert 0 <= decay_ratio <= 1
    coefficient = 0.5 * (1.0 + math.cos(math.pi * decay_ratio))
    return (min_lr + coefficient * (learning_rate - min_lr)) / learning_rate


def cos_lr_scheduler(
        optimizer: torch.optim.Optimizer,
        learning_rate: float,
        warmup_epochs: float = 5,
        lr_decay_epochs: float = 100,
        min_lr: float = 6e-5
) -> torch.optim.lr_scheduler.LambdaLR:
    """
    Overview:
        Cosine learning rate scheduler.
    Arguments:
        - optimizer (:obj:`torch.optim.Optimizer`): Optimizer.
        - learning_rate (:obj:`float`): Learning rate.
        - warmup_epochs (:obj:`float`): Warmup epochs.
        - lr_decay_epochs (:obj:`float`): Learning rate decay epochs.
        - min_lr (:obj:`float`): Minimum learning rate.
    """

    return LambdaLR(
        optimizer,
        partial(
            get_lr_ratio,
            warmup_epochs=warmup_epochs,
            lr_decay_epochs=lr_decay_epochs,
            min_lr=min_lr,
            learning_rate=learning_rate
        )
    )
```