Training Loop Config¶
- class mlip.training.training_loop_config.TrainingLoopConfig(*, num_epochs: Annotated[int, Gt(gt=0)], num_gradient_accumulation_steps: Annotated[int, Gt(gt=0)] = 1, random_seed: int = 42, ema_decay: Annotated[float, Gt(gt=0.0), Le(le=1.0)] = 0.99, use_ema_params_for_eval: bool = True, eval_num_graphs: Annotated[int, Gt(gt=0)] | None = None, run_eval_at_start: bool = True)¶
Pydantic config holding all settings related to the TrainingLoop class.
- num_epochs¶
Number of epochs to run.
- Type:
int
- num_gradient_accumulation_steps¶
Number of gradient steps to accumulate before taking an optimizer step. Default is 1.
- Type:
int
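The accumulation setting above determines how many micro-batch gradients are combined into a single optimizer update. The sketch below (not from the mlip source; the function name and plain-list gradients are illustrative) shows the arithmetic: gradients from consecutive steps are averaged before one update is applied.

```python
# Illustrative sketch: averaging micro-batch gradients before a single
# optimizer step, as controlled by num_gradient_accumulation_steps.
def accumulated_gradients(step_grads, num_accumulation_steps):
    """Average per-step gradients (lists of floats) into one update."""
    total = [0.0] * len(step_grads[0])
    for grad in step_grads:
        total = [t + g for t, g in zip(total, grad)]
    return [t / num_accumulation_steps for t in total]

# With num_gradient_accumulation_steps = 2, two micro-batch gradients
# produce one averaged update.
update = accumulated_gradients([[1.0, 3.0], [3.0, 1.0]], 2)
```

The effective batch size is therefore the per-step batch size multiplied by `num_gradient_accumulation_steps`.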
- random_seed¶
A random seed, by default set to 42.
- Type:
int
- ema_decay¶
The EMA decay rate, by default set to 0.99.
- Type:
float
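The decay rate above controls how strongly the exponential moving average of the parameters favours its history. A minimal sketch of one EMA step (the function name is illustrative, not part of the mlip API):

```python
# Illustrative sketch of one exponential-moving-average update step.
# A decay close to 1.0 keeps most of the running average and mixes in
# only a small fraction of the newest parameter value.
def ema_update(ema_param, new_param, ema_decay=0.99):
    return ema_decay * ema_param + (1.0 - ema_decay) * new_param

# 0.99 * 1.0 + 0.01 * 2.0
ema = ema_update(1.0, 2.0, ema_decay=0.99)
```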
- use_ema_params_for_eval¶
Whether to use the EMA parameters for evaluation, set to True by default.
- Type:
bool
- eval_num_graphs¶
Number of validation set graphs to evaluate on. By default, this is set to None, which means evaluation runs on all available graphs.
- Type:
int | None
- run_eval_at_start¶
Whether to run an evaluation on the validation set before the first epoch starts. By default, it is set to True.
- Type:
bool
- __init__(**data: Any) → None¶
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError (pydantic_core.ValidationError) if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
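The constraints in the signature above (num_epochs > 0, 0 < ema_decay ≤ 1, and so on) are enforced at construction time. The sketch below mirrors that behaviour using only the standard library; the real class uses Pydantic's constrained types, and the class name `TrainingLoopConfigSketch` is an illustrative stand-in, not part of mlip.

```python
# Standard-library sketch of the validation rules described above.
# The actual mlip class is a Pydantic model and raises
# pydantic_core.ValidationError instead of ValueError.
from dataclasses import dataclass


@dataclass
class TrainingLoopConfigSketch:
    num_epochs: int
    num_gradient_accumulation_steps: int = 1
    random_seed: int = 42
    ema_decay: float = 0.99
    use_ema_params_for_eval: bool = True
    eval_num_graphs: "int | None" = None
    run_eval_at_start: bool = True

    def __post_init__(self):
        if self.num_epochs <= 0:
            raise ValueError("num_epochs must be > 0")
        if self.num_gradient_accumulation_steps <= 0:
            raise ValueError("num_gradient_accumulation_steps must be > 0")
        if not (0.0 < self.ema_decay <= 1.0):
            raise ValueError("ema_decay must be in (0, 1]")
        if self.eval_num_graphs is not None and self.eval_num_graphs <= 0:
            raise ValueError("eval_num_graphs must be > 0 when set")


# Only num_epochs is required; every other field falls back to the
# defaults documented above.
config = TrainingLoopConfigSketch(num_epochs=100)
```

Constructing the real TrainingLoopConfig follows the same pattern: pass `num_epochs` (and any overrides) as keyword arguments, and invalid values fail fast at construction rather than mid-training.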