mindformers.core

MindFormers Core.

mindformers.core

| API | Description |
| --- | --- |
| `build_context` | Build the running context. |
| `init_context` | Context initialization for MindSpore. |

`init_context` arguments:

- `use_parallel` (Optional[bool]): Whether to use distributed training. Default: `False`.
- `context_config` (Optional[Union[dict, ContextConfig]]): Context config for the running environment. Default: `None`.
- `parallel_config` (Optional[Union[dict, ParallelContextConfig]]): Parallel config for the running environment. Default: `None`.
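A minimal sketch of context setup, assuming `init_context` is importable from `mindformers.core`. The dict form of `context_config` is used since the signature above accepts `Union[dict, ContextConfig]`; the keys shown mirror `mindspore.set_context` options and are illustrative:

```python
from mindformers.core import init_context

# Single-device setup. context_config accepts a plain dict or a
# ContextConfig object; the keys below are illustrative.
init_context(
    use_parallel=False,
    context_config={"mode": "GRAPH_MODE", "device_target": "Ascend", "device_id": 0},
)
```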
mindformers.core.callback

| API | Description |
| --- | --- |
| `CheckpointMointor` | Checkpoint monitor that also saves the loss scale. |
| `MFLossMonitor` | Loss monitor for classification. |
| `ObsMonitor` | OBS monitor for local and AICC environments. |
| `SummaryMonitor` | Summary monitor for AICC and local environments. |
| `ProfileMonitor` | Profiling analysis during training. |
| `EvalCallback` | Evaluation callback used during training. |
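These monitors follow MindSpore's `Callback` protocol, so they can be mixed with any other callbacks passed to `Model.train`. A sketch, assuming the class names reconstructed above; the constructor arguments shown are illustrative and should be checked against the installed version:

```python
from mindformers.core.callback import MFLossMonitor, ProfileMonitor

# Both classes implement mindspore.train.Callback, so they can be
# combined with custom callbacks in the same list.
callbacks = [
    MFLossMonitor(per_print_times=1),          # log loss every step
    ProfileMonitor(start_step=1, stop_step=10) # profile a step window
]
# model.train(epoch, train_dataset, callbacks=callbacks)
```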
mindformers.core.loss

| API | Description |
| --- | --- |
| `CrossEntropyLoss` | Calculate the cross-entropy loss between logits and labels. |
| `L1Loss` | L1 loss with parallel support. |
| `MSELoss` | Calculate the MSE loss with given logits and labels. |
| `SoftTargetCrossEntropy` | Calculate the soft-target cross-entropy loss with given logits and labels. |
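A short sketch of `CrossEntropyLoss`, assuming the three-argument call `(logits, labels, input_mask)` where the mask marks which positions contribute to the loss; shapes and values are illustrative:

```python
import numpy as np
import mindspore as ms
from mindformers.core.loss import CrossEntropyLoss

loss_fn = CrossEntropyLoss()
logits = ms.Tensor(np.random.randn(2, 10), ms.float32)  # (batch, num_classes)
labels = ms.Tensor(np.array([1, 3]), ms.int32)          # target class ids
input_mask = ms.Tensor(np.ones(2), ms.float32)          # 1.0 = position counts
loss = loss_fn(logits, labels, input_mask)
```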
mindformers.core.lr

| API | Description |
| --- | --- |
| `ConstantWarmUpLR` | Constant warm-up learning rate. |
| `CosineWithRestartsAndWarmUpLR` | Cosine with restarts and warm-up learning rate. |
| `CosineWithWarmUpLR` | Cosine with warm-up learning rate. |
| `LinearWithWarmUpLR` | Linear with warm-up learning rate. |
| `PolynomialWithWarmUpLR` | Polynomial with warm-up learning rate. |

`CosineAnnealingLR` sets the learning rate of each parameter group using a cosine annealing schedule, where \(\eta_{max}\) is set to the initial lr and \(T_{cur}\) is the number of epochs since the last restart in SGDR:

\[\eta_t = \eta_{min} + \frac{1}{2}(\eta_{max} - \eta_{min})\left(1 + \cos\left(\frac{T_{cur}}{T_{max}}\pi\right)\right)\]

`CosineAnnealingWarmRestarts` sets the learning rate of each parameter group using a cosine annealing schedule, where \(\eta_{max}\) is set to the initial lr, \(T_{cur}\) is the number of epochs since the last restart, and \(T_{i}\) is the number of epochs between two warm restarts in SGDR:

\[\eta_t = \eta_{min} + \frac{1}{2}(\eta_{max} - \eta_{min})\left(1 + \cos\left(\frac{T_{cur}}{T_{i}}\pi\right)\right)\]
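These schedules subclass MindSpore's `LearningRateSchedule`, so they map a global step to a scalar learning rate and can be handed directly to an optimizer. A sketch, assuming constructor arguments `learning_rate`, `warmup_steps`, and `total_steps`:

```python
import mindspore as ms
from mindformers.core.lr import CosineWithWarmUpLR

# 10 warm-up steps ramping to 5e-3, then cosine decay over 100 total steps.
lr_schedule = CosineWithWarmUpLR(learning_rate=5e-3, warmup_steps=10, total_steps=100)

# A LearningRateSchedule is called with the global step.
print(lr_schedule(ms.Tensor(1, ms.int32)))   # during warm-up
print(lr_schedule(ms.Tensor(50, ms.int32)))  # during cosine decay
```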
mindformers.core.metric

| API | Description |
| --- | --- |
| `EntityScore` | Compute the precision, recall, and F1 score of each entity. |
| `EmF1Metric` | Compute the Em and F1 scores of generated answers. |
| `PerplexityMetric` | Compute the loss and perplexity (PPL). |
| `SQuADMetric` | Compute the SQuAD evaluation metrics (Em and F1). |
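All of these follow MindSpore's `Metric` protocol (`clear`/`update`/`eval`). A schematic accumulation loop; the `update` arguments are left abstract because each metric expects different inputs:

```python
from mindformers.core.metric import EntityScore

metric = EntityScore()
metric.clear()                               # reset internal state
# for batch in eval_dataset:
#     metric.update(predictions, labels)     # inputs depend on the metric
# results = metric.eval()                    # returns the accumulated scores
```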
mindformers.core.optim

| API | Description |
| --- | --- |
| `FP32StateAdamWeightDecay` | Almost the same as MindSpore's AdamWeightDecay implementation; the primary difference is that the optimizer states are kept in float32 rather than float16. |
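A sketch of plugging the optimizer into a network, assuming the `FP32StateAdamWeightDecay` name reconstructed above; it takes the usual MindSpore optimizer arguments:

```python
import mindspore.nn as nn
from mindformers.core.optim import FP32StateAdamWeightDecay

net = nn.Dense(16, 2)  # toy network standing in for a real model

# Same interface as mindspore.nn.AdamWeightDecay; the moment states
# are held in float32 even when parameters are float16.
optimizer = FP32StateAdamWeightDecay(
    params=net.trainable_params(),
    learning_rate=1e-4,
    weight_decay=0.01,
)
```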