Index

instanovo

__version__ = '1.2.2' module-attribute

cgroup_max_mem_limit_path = '/sys/fs/cgroup/memory.max' module-attribute

cgroup_high_mem_limit_path = '/sys/fs/cgroup/memory.high' module-attribute

high_limit = open(cgroup_high_mem_limit_path).read() module-attribute

max_limit = open(cgroup_max_mem_limit_path).read() module-attribute

hard_limit = resource.RLIM_INFINITY if max_limit == 'max\n' else int(max_limit) module-attribute

soft_limit = hard_limit if high_limit == 'max\n' else int(high_limit) module-attribute

terminal_width = shutil.get_terminal_size(fallback=(175, 24)).columns module-attribute

console = Console(width=terminal_width) module-attribute
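The width detection can be exercised on its own; the value returned depends on the environment:

```python
import shutil

# Falls back to 175 columns when the terminal size cannot be detected,
# e.g. when output is piped or no tty is attached.
width = shutil.get_terminal_size(fallback=(175, 24)).columns
```

Pinning the `Console` width this way keeps table and panel rendering stable even in non-interactive contexts such as CI logs.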

get_rank() -> int | None

Get the current process rank in distributed training.

Returns:
    int | None: The process rank if in distributed training, None otherwise.

set_rank(rank: int | None) -> None

Set the current process rank for distributed training.

Parameters:
    rank (int | None): The process rank to set, or None for non-distributed training.

Raises:
    RuntimeError: If the rank has already been set.