pytagi.nn.batch_norm#

Classes#

BatchNorm2d

Applies 2D Batch Normalization.

Module Contents#

class pytagi.nn.batch_norm.BatchNorm2d(num_features: int, eps: float = 1e-05, momentum: float = 0.9, bias: bool = True, gain_weight: float = 1.0, gain_bias: float = 1.0)[source]#

Bases: pytagi.nn.base_layer.BaseLayer

Applies 2D Batch Normalization.

Batch Normalization normalizes the inputs of a layer by re-centering and re-scaling them per channel, using statistics computed over each mini-batch.
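The re-centering and re-scaling step can be sketched in NumPy. This is an illustration of the standard 2D batch-norm computation, not pytagi's backend code, which runs in C++:

```python
import numpy as np

def batch_norm_2d(x, gamma, beta, eps=1e-5):
    """Reference sketch of 2D batch normalization: inputs of shape
    (N, C, H, W) are re-centered and re-scaled per channel, then an
    affine transform (gamma, beta) is applied."""
    # Per-channel mean and variance over batch and spatial dimensions
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    # Normalize, then scale (gamma) and shift (beta)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma.reshape(1, -1, 1, 1) * x_hat + beta.reshape(1, -1, 1, 1)

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(8, 3, 4, 4))  # num_features = 3
y = batch_norm_2d(x, gamma=np.ones(3), beta=np.zeros(3))
print(y.mean(axis=(0, 2, 3)))  # per-channel means are ~0 after normalization
```

With `gamma=1` and `beta=0` the output has (approximately) zero mean and unit variance per channel, regardless of the input's original scale.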

Parameters:
  • num_features (int) – The number of features in the input tensor.

  • eps (float) – A small value added to the variance to avoid division by zero. Defaults to 1e-5.

  • momentum (float) – The momentum for the running mean and variance. Defaults to 0.9.

  • bias (bool) – Whether to include a learnable bias term. Defaults to True.

  • gain_weight (float) – Initial value for the gain (scale) parameter. Defaults to 1.0.

  • gain_bias (float) – Initial value for the bias (shift) parameter. Defaults to 1.0.

Initializes the BatchNorm2d layer.
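To make the `momentum` parameter concrete, here is a generic sketch of how a momentum of 0.9 blends running statistics across batches. The exact update rule lives in pytagi's C++ backend, so treat this as an assumption about the conventional scheme:

```python
import numpy as np

def update_running_stats(running_mean, running_var,
                         batch_mean, batch_var, momentum=0.9):
    # With momentum = 0.9, 90% of the previous running value is kept
    # and 10% of the current batch statistic is blended in.
    new_mean = momentum * running_mean + (1.0 - momentum) * batch_mean
    new_var = momentum * running_var + (1.0 - momentum) * batch_var
    return new_mean, new_var

mean, var = np.zeros(3), np.ones(3)                    # initial running stats
batch_mean, batch_var = np.full(3, 2.0), np.full(3, 4.0)
mean, var = update_running_stats(mean, var, batch_mean, batch_var)
print(mean)  # [0.2 0.2 0.2] — 10% of the batch mean blended in
```

A higher momentum gives smoother, slower-moving running estimates; a lower one tracks recent batches more aggressively.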

get_layer_info() str[source]#

Retrieves detailed information about the BatchNorm2d layer.

Returns:

A string containing the layer's information, typically delegated to the C++ backend implementation.

Return type:

str

get_layer_name() str[source]#

Retrieves the name of the BatchNorm2d layer.

Returns:

The name of the layer, typically delegated to the C++ backend implementation.

Return type:

str

init_weight_bias()[source]#

Initializes the learnable weight (scale/gain) and bias (shift/offset) parameters of the batch normalization layer. This operation is delegated to the C++ backend.
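Taking the parameter descriptions above at face value, a hypothetical sketch of what this initialization produces is shown below. The actual scheme is implemented in the C++ backend and may differ, so the exact rule here is an assumption:

```python
import numpy as np

def init_weight_bias_sketch(num_features, gain_weight=1.0,
                            gain_bias=1.0, bias=True):
    # Hypothetical illustration based solely on the parameter docs:
    # the scale (gain) vector starts at gain_weight and the shift
    # vector at gain_bias, or is omitted entirely when bias=False.
    weight = np.full(num_features, gain_weight)
    b = np.full(num_features, gain_bias) if bias else None
    return weight, b

w, b = init_weight_bias_sketch(4)
print(w)  # [1. 1. 1. 1.]
```

Disabling `bias` removes the shift parameter, leaving only the per-channel scale to be learned.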