mindpose.scheduler

class mindpose.scheduler.WarmupCosineDecayLR(lr, total_epochs, steps_per_epoch, warmup=0, min_lr=0.0)[source]

Bases: LearningRateSchedule

Cosine-decay learning rate schedule with warmup.

Parameters:
  • lr (float) – Initial learning rate.

  • total_epochs (int) – The number of total epochs of learning rate.

  • steps_per_epoch (int) – The number of steps per epoch.

  • warmup (Union[int, float]) – If an integer, the number of warm-up steps; if a float, the fraction of total steps used for warm-up. Default = 0

  • min_lr (float) – Lower bound of the learning rate. Default = 0.0

Inputs:
  global_step: The current global step.

Outputs:
  lr: The learning rate at that step.
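The schedule above can be sketched in plain Python. This is a minimal illustration of the math, not mindpose's actual implementation; in particular, linear warmup from 0 to lr is an assumption:

```python
import math

def warmup_cosine_decay_lr(step, lr, total_epochs, steps_per_epoch,
                           warmup=0, min_lr=0.0):
    """Learning rate at `step` for cosine decay with linear warmup (sketch)."""
    total_steps = total_epochs * steps_per_epoch
    # A float `warmup` is read as a fraction of total steps, an int as a step count.
    warmup_steps = int(warmup * total_steps) if isinstance(warmup, float) else warmup
    if step < warmup_steps:
        # Linear warmup from 0 up to the initial lr (assumed behavior).
        return lr * step / warmup_steps
    # Cosine decay from lr down to min_lr over the remaining steps.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (lr - min_lr) * (1.0 + math.cos(math.pi * progress))
```

For example, with lr=0.1, 10 epochs of 100 steps, and warmup=0.1, the rate rises linearly over the first 100 steps, reaches 0.1 at step 100, and decays to min_lr by step 1000.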
class mindpose.scheduler.WarmupMultiStepDecayLR(lr, total_epochs, steps_per_epoch, milestones, decay_rate=0.1, warmup=0)[source]

Bases: LearningRateSchedule

Multi-step decay with warmup.

Parameters:
  • lr (float) – Initial learning rate.

  • total_epochs (int) – The number of total epochs of learning rate.

  • steps_per_epoch (int) – The number of steps per epoch.

  • milestones (List[int]) – The epochs at which the learning rate is decayed by decay_rate.

  • decay_rate (float) – Decay rate. Default = 0.1

  • warmup (Union[int, float]) – If an integer, the number of warm-up steps; if a float, the fraction of total steps used for warm-up. Default = 0

Inputs:
  global_step: The current global step.

Outputs:
  lr: The learning rate at that step.
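The multi-step schedule can be sketched the same way. Again, this is an illustrative sketch of the math rather than mindpose's implementation, and linear warmup is assumed:

```python
def warmup_multistep_decay_lr(step, lr, total_epochs, steps_per_epoch,
                              milestones, decay_rate=0.1, warmup=0):
    """Learning rate at `step` for multi-step decay with linear warmup (sketch)."""
    total_steps = total_epochs * steps_per_epoch
    # A float `warmup` is read as a fraction of total steps, an int as a step count.
    warmup_steps = int(warmup * total_steps) if isinstance(warmup, float) else warmup
    if step < warmup_steps:
        # Linear warmup from 0 up to the initial lr (assumed behavior).
        return lr * step / warmup_steps
    epoch = step // steps_per_epoch
    # Multiply lr by decay_rate once for each milestone epoch already reached.
    num_decays = sum(1 for m in milestones if epoch >= m)
    return lr * decay_rate ** num_decays
```

For example, with lr=0.1, milestones=[5, 8], and decay_rate=0.1, the rate is 0.1 through epoch 4, 0.01 from epoch 5, and 0.001 from epoch 8 onward.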
mindpose.scheduler.create_lr_scheduler(name, lr, total_epochs, steps_per_epoch, warmup=0, **kwargs)[source]

Create learning rate scheduler.

Parameters:
  • name (str) – Name of the scheduler. Default: warmup_cosine_decay

  • lr (float) – Initial learning rate.

  • total_epochs (int) – The number of total epochs of learning rate.

  • steps_per_epoch (int) – The number of steps per epoch.

  • warmup (Union[int, float]) – If an integer, the number of warm-up steps; if a float, the fraction of total steps used for warm-up. Default = 0

  • **kwargs (Any) – Arguments fed into the corresponding scheduler.

Return type:

LearningRateSchedule

Returns:

Learning rate scheduler
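All three entry points accept the same dual-typed `warmup` parameter. Its resolution into an absolute step count can be sketched as follows; the function name and the range check are hypothetical, not part of mindpose's API:

```python
def resolve_warmup_steps(warmup, total_epochs, steps_per_epoch):
    """Turn `warmup` into an absolute number of warm-up steps (sketch).

    An int counts steps directly; a float is a fraction of total steps.
    """
    total_steps = total_epochs * steps_per_epoch
    if isinstance(warmup, float):
        # Assumed sanity check: a fraction must lie in [0, 1].
        if not 0.0 <= warmup <= 1.0:
            raise ValueError("fractional warmup must be in [0, 1]")
        return int(warmup * total_steps)
    return warmup
```

So `warmup=500` means 500 warm-up steps regardless of schedule length, while `warmup=0.1` scales with the run: 100 steps for 10 epochs of 100 steps each.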