Scheduler
scheduler
logger = ColorLog(console, __name__).logger
module-attribute
FinetuneScheduler(model_state_dict: dict, config: DictConfig, steps_per_epoch: int | None = None)
Scheduler for unfreezing parameters of a model.
| PARAMETER | DESCRIPTION |
|---|---|
| `model_state_dict` | The state dictionary of the model. TYPE: `dict` |
| `config` | The configuration for the scheduler. TYPE: `DictConfig` |
| `steps_per_epoch` | The number of steps per epoch. TYPE: `int \| None` DEFAULT: `None` |
model_state_dict = model_state_dict
instance-attribute
config = config
instance-attribute
steps_per_epoch = steps_per_epoch
instance-attribute
is_verbose = self.config.get('verbose', False)
instance-attribute
schedule = self._get_schedule()
instance-attribute
next_phase: dict[str, Any] | None = self.schedule.pop(0)
instance-attribute
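The `schedule` and `next_phase` attributes behave like a queue: `_get_schedule()` produces the ordered list of phases, the front element is popped into `next_phase`, and `next_phase` becomes `None` once every phase has been consumed. A stand-alone sketch of that pattern (the phase keys are hypothetical):

```python
from typing import Any

# Hypothetical phase dictionaries; only the pop-from-the-front pattern is
# taken from the documented attributes.
schedule: list[dict[str, Any]] = [
    {"at_step": 250, "modules": ["layer4"]},
    {"at_step": 750, "modules": ["layer3"]},
]
next_phase: dict[str, Any] | None = schedule.pop(0)

for global_step in range(1000):
    if next_phase is not None and global_step >= next_phase["at_step"]:
        print(f"unfreezing {next_phase['modules']} at step {global_step}")
        next_phase = schedule.pop(0) if schedule else None
```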
step(global_step: int) -> None
Step the unfreezing scheduler.
| PARAMETER | DESCRIPTION |
|---|---|
| `global_step` | The global step of the model. TYPE: `int` |
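A hedged training-loop sketch showing where `step` is called; the model, optimizer, and data are placeholders, and the exact unfreezing behaviour inside `step` is not specified here.

```python
import torch
from omegaconf import OmegaConf

from scheduler import FinetuneScheduler  # import path assumed

model = torch.nn.Linear(16, 4)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
fts = FinetuneScheduler(
    model_state_dict=model.state_dict(),
    config=OmegaConf.create({"verbose": False}),  # phase layout omitted, see the sketch above
    steps_per_epoch=100,
)

global_step = 0
for epoch in range(5):
    for _ in range(100):                  # stands in for iterating a dataloader
        x = torch.randn(8, 16)
        loss = model(x).pow(2).mean()     # placeholder loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()

        fts.step(global_step)             # advance the unfreezing schedule
        global_step += 1
```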
WarmupScheduler(optimizer: torch.optim.Optimizer, warmup: int)
Bases: _LRScheduler
Linear warmup scheduler.
warmup = warmup
instance-attribute
get_lr() -> list[float]
Get the learning rate at the current step.
get_lr_factor(epoch: int) -> float
Get the LR factor at the current step.
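The interface above suggests the usual linear ramp: `get_lr_factor` grows from 0 to 1 over `warmup` steps and `get_lr` scales each base learning rate by that factor. A sketch of that shape (not necessarily the exact implementation):

```python
import torch
from torch.optim.lr_scheduler import _LRScheduler


class LinearWarmup(_LRScheduler):
    """Sketch of a linear warmup scheduler matching the documented interface."""

    def __init__(self, optimizer: torch.optim.Optimizer, warmup: int):
        self.warmup = warmup
        super().__init__(optimizer)

    def get_lr(self) -> list[float]:
        factor = self.get_lr_factor(self.last_epoch)
        return [base_lr * factor for base_lr in self.base_lrs]

    def get_lr_factor(self, epoch: int) -> float:
        # Ramp linearly to 1.0 over `warmup` steps, then stay constant.
        return min(1.0, epoch / self.warmup) if self.warmup > 0 else 1.0


model = torch.nn.Linear(4, 2)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
sched = LinearWarmup(opt, warmup=100)
for _ in range(200):
    opt.step()
    sched.step()  # LR reaches 1e-3 after 100 scheduler steps, then stays there
```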
CosineWarmupScheduler(optimizer: torch.optim.Optimizer, warmup: int, max_iters: int)
Bases: _LRScheduler
Learning rate scheduler with a linear warm-up followed by cosine-shaped decay.
| PARAMETER | DESCRIPTION |
|---|---|
| `optimizer` | Optimizer object. TYPE: `torch.optim.Optimizer` |
| `warmup` | The number of warm-up iterations. TYPE: `int` |
| `max_iters` | The total number of iterations. TYPE: `int` |
get_lr() -> list[float]
Get the learning rate at the current step.
get_lr_factor(epoch: int) -> float
Get the LR factor at the current step.
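The shape described above is commonly implemented as a linear ramp multiplied by a half-cosine that decays to zero at `max_iters`. A hedged sketch with the documented signature (the exact formula in the source may differ):

```python
import math

import torch
from torch.optim.lr_scheduler import _LRScheduler


class CosineWarmup(_LRScheduler):
    """Sketch of linear warm-up followed by cosine-shaped decay."""

    def __init__(self, optimizer: torch.optim.Optimizer, warmup: int, max_iters: int):
        self.warmup = warmup
        self.max_iters = max_iters
        super().__init__(optimizer)

    def get_lr(self) -> list[float]:
        factor = self.get_lr_factor(self.last_epoch)
        return [base_lr * factor for base_lr in self.base_lrs]

    def get_lr_factor(self, epoch: int) -> float:
        # Half-cosine decay from 1.0 at step 0 to 0.0 at `max_iters`...
        factor = 0.5 * (1 + math.cos(math.pi * epoch / self.max_iters))
        # ...scaled by a linear ramp during the warm-up phase.
        if self.warmup > 0 and epoch < self.warmup:
            factor *= epoch / self.warmup
        return factor


opt = torch.optim.SGD([torch.nn.Parameter(torch.zeros(1))], lr=1.0)
sched = CosineWarmup(opt, warmup=50, max_iters=500)
factors = [sched.get_lr_factor(i) for i in range(500)]  # ramps up over 50 steps, then decays to 0
```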