deepbox/nn

Normalization & Dropout

Regularization and normalization layers that stabilize training and prevent overfitting.

BatchNorm1d

extends Module

Batch normalization for 2D inputs (batch, features). Normalizes each feature across the batch to zero mean and unit variance. Learns affine parameters γ and β. Uses running statistics in eval mode.
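
The running statistics mentioned above are typically exponential moving averages of each batch's mean and variance, updated during training and used in place of batch statistics at eval time. A minimal plain-TypeScript sketch of that idea (the momentum value and names here are assumptions for illustration, not deepbox internals):

// Sketch: tracking running statistics for one feature column during training.
// momentum = 0.1 is an assumed value; the library's default may differ.
const momentum = 0.1;
let runningMean = 0;
let runningVar = 1;

function updateRunningStats(batchColumn: number[]): void {
  const n = batchColumn.length;
  const batchMean = batchColumn.reduce((s, v) => s + v, 0) / n;
  const batchVar = batchColumn.reduce((s, v) => s + (v - batchMean) ** 2, 0) / n;
  // Exponential moving average toward the current batch statistics.
  runningMean = (1 - momentum) * runningMean + momentum * batchMean;
  runningVar = (1 - momentum) * runningVar + momentum * batchVar;
}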

LayerNorm

extends Module

Layer normalization. Normalizes across the feature dimension (not the batch). More stable than BatchNorm for small batches and sequence models.
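
For intuition, here is the normalization step for a single sample's feature vector in plain TypeScript; this is only a sketch, with γ = 1 and β = 0 assumed for brevity (the real layer applies learned values):

// Sketch: layer normalization of one sample across its features.
function layerNorm(features: number[], eps = 1e-5): number[] {
  const n = features.length;
  const mean = features.reduce((s, v) => s + v, 0) / n;
  const variance = features.reduce((s, v) => s + (v - mean) ** 2, 0) / n;
  // gamma = 1, beta = 0 assumed here; the actual layer learns these.
  return features.map((v) => (v - mean) / Math.sqrt(variance + eps));
}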

Dropout

extends Module

Randomly zeros elements with probability p during training. Scales remaining values by 1/(1−p). Disabled in eval mode. Prevents co-adaptation of neurons.
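
The behavior described above is "inverted" dropout: zero with probability p, scale survivors by 1/(1−p) so the expected activation is unchanged, and pass values through untouched in eval mode. A minimal sketch in plain TypeScript, not the deepbox implementation:

// Sketch: inverted dropout applied to a vector.
function dropout(values: number[], p: number, training: boolean): number[] {
  if (!training || p === 0) return values; // identity in eval mode
  const scale = 1 / (1 - p); // keep expected activation magnitude unchanged
  return values.map((v) => (Math.random() < p ? 0 : v * scale));
}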

Batch Norm

ŷ = γ · (x − μ_B) / √(σ²_B + ε) + β

Where:

  • μ_B, σ²_B = Batch mean and variance
  • γ, β = Learnable scale and shift
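
The formula applied column-by-column over a (batch, features) matrix looks like the following; this is a plain-TypeScript sketch under assumed γ, β, and ε inputs, not the deepbox implementation:

// Sketch: batch normalization of a (batch, features) matrix.
function batchNorm(
  x: number[][],
  gamma: number[],
  beta: number[],
  eps = 1e-5
): number[][] {
  const batch = x.length;
  const features = x[0].length;
  const out = x.map((row) => row.slice());
  for (let j = 0; j < features; j++) {
    // Batch mean and variance for feature j.
    let mean = 0;
    for (let i = 0; i < batch; i++) mean += x[i][j];
    mean /= batch;
    let variance = 0;
    for (let i = 0; i < batch; i++) variance += (x[i][j] - mean) ** 2;
    variance /= batch;
    // Normalize, then apply the learned scale and shift.
    const invStd = 1 / Math.sqrt(variance + eps);
    for (let i = 0; i < batch; i++) {
      out[i][j] = gamma[j] * (x[i][j] - mean) * invStd + beta[j];
    }
  }
  return out;
}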
normalization.ts
import { BatchNorm1d, LayerNorm, Dropout, Sequential, Linear, ReLU } from "deepbox/nn";

const model = new Sequential(
  new Linear(10, 64),
  new BatchNorm1d(64),
  new ReLU(),
  new Dropout(0.3),
  new Linear(64, 32),
  new LayerNorm(32),
  new ReLU(),
  new Linear(32, 1)
);

model.train(); // Dropout active, BatchNorm uses batch stats
model.eval();  // Dropout disabled, BatchNorm uses running stats