pytagi.nn.layer_norm#
Classes#
LayerNorm – Implements Layer Normalization by normalizing the inputs across the features dimension.
Module Contents#
- class pytagi.nn.layer_norm.LayerNorm(normalized_shape: List[int], eps: float = 0.0001, bias: bool = True)[source]#
Bases: pytagi.nn.base_layer.BaseLayer
Implements Layer Normalization by normalizing the inputs across the features dimension. It inherits from BaseLayer.
Initializes the LayerNorm layer.
- Parameters:
normalized_shape – The shape of the input to normalize over (e.g., the size of the feature dimension). Expected to be a list of integers.
eps – A small value added to the denominator for numerical stability to prevent division by zero. Defaults to 1e-4.
bias – If True, the layer will use an additive bias (beta) during normalization. Defaults to True.
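To make the computation concrete, here is a minimal pure-Python sketch of what layer normalization does over a feature vector: subtract the mean, divide by the standard deviation stabilized by eps, then apply a scale (gamma) and the optional additive bias (beta). This illustrates the math only; it is not the pytagi C++ backend, and the `gamma`/`beta` parameters here are illustrative defaults.

```python
from typing import List

def layer_norm(x: List[float], eps: float = 1e-4,
               gamma: float = 1.0, beta: float = 0.0) -> List[float]:
    """Normalize a feature vector to zero mean and unit variance,
    then apply a scale (gamma) and an additive bias (beta).

    Illustrative sketch only -- pytagi's LayerNorm performs this
    normalization (propagating Gaussian moments) in its C++ backend.
    """
    n = len(x)
    mean = sum(x) / n
    # Biased (population) variance over the feature dimension.
    var = sum((v - mean) ** 2 for v in x) / n
    # eps keeps the denominator away from zero for constant inputs.
    return [gamma * (v - mean) / (var + eps) ** 0.5 + beta for v in x]
```

With the default gamma=1 and beta=0, the output has (approximately) zero mean and unit variance; setting `bias=True` in the layer corresponds to learning a nonzero beta.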
- get_layer_info() → str[source]#
Retrieves a descriptive string containing information about the layer’s configuration (e.g., its shape and parameters) from the C++ backend.