config

class common.infer.lightning.config.LightningInferenceSubtaskConfig(output_dir='${hydra:runtime.output_dir}', data_dir='${oc.env:AI_REPO_PATH}/data/', device='cpu', seed=0, ckpt_path='last')[source]

Bases: InferenceSubtaskConfig

Configuration values for a Lightning inference subtask.

Parameters:

ckpt_path (str | None, default: 'last') – The path to a Lightning checkpoint to load the model from.
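A minimal sketch of constructing this subtask config directly in Python, assuming the module path shown above is importable as-is. The interpolated defaults ('${hydra:runtime.output_dir}' and '${oc.env:AI_REPO_PATH}/data/') only resolve when the config is composed through Hydra, so concrete values are passed here; the output directory and checkpoint path are hypothetical.

    from common.infer.lightning.config import LightningInferenceSubtaskConfig

    # Concrete values replace the Hydra/OmegaConf interpolations, which only
    # resolve when the config is composed by Hydra at runtime.
    config = LightningInferenceSubtaskConfig(
        output_dir="outputs/infer/",        # hypothetical output directory
        data_dir="data/",
        device="cpu",
        seed=0,
        ckpt_path="checkpoints/last.ckpt",  # hypothetical path; defaults to 'last'
    )
    print(config.ckpt_path)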

class common.infer.lightning.config.LightningInferenceTaskConfig(litmodule=<class 'types.PartialBuilds_BaseLitModule'>, config=<class 'types.Builds_LightningInferenceSubtaskConfig'>, defaults=<factory>)[source]

Bases: Config
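Judging from its fields, this task config composes the Lightning module ('litmodule') with the subtask config above ('config'). A minimal sketch of registering it with Hydra's ConfigStore so it can be composed and overridden from the command line; the store name is an assumption, not taken from the project, which may wire this config differently.

    from hydra.core.config_store import ConfigStore

    from common.infer.lightning.config import LightningInferenceTaskConfig

    # Register the structured config; the name below is hypothetical.
    cs = ConfigStore.instance()
    cs.store(name="lightning_inference_task", node=LightningInferenceTaskConfig)

Once registered, Hydra can compose this config and override nested fields such as config.device or config.ckpt_path at launch time.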