Example 30: Normalization & Dropout Layers
Tags: Normalization, Regularization, Dropout, Neural Networks
Normalization and dropout are essential techniques for training deep networks. This example demonstrates three layers:
- BatchNorm1d normalizes activations across the batch dimension, stabilizing training by reducing internal covariate shift; in eval mode it switches to running statistics.
- LayerNorm normalizes across the features of each sample; it is independent of batch size and standard in Transformers.
- Dropout randomly zeros elements with probability p during training, preventing co-adaptation of neurons; it is disabled during eval.
For BatchNorm1d and Dropout, the example shows behavior in both train and eval modes and explains when to use each; LayerNorm behaves identically in both modes.
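To see the two normalization axes concretely before reading the deepbox code, here is a minimal plain-TypeScript sketch (no deepbox; the helpers are hypothetical) that standardizes a small matrix per column, as BatchNorm1d does across a batch, and per row, as LayerNorm does across features:

```typescript
// Plain-TypeScript sketch of the two normalization axes (hypothetical
// helpers, not the deepbox implementation). Rows = samples, cols = features.
const X = [
  [1, 2, 3],
  [4, 5, 6],
];

const mean = (v: number[]) => v.reduce((a, b) => a + b, 0) / v.length;
const std = (v: number[]) => {
  const m = mean(v);
  return Math.sqrt(v.reduce((a, b) => a + (b - m) ** 2, 0) / v.length);
};
const eps = 1e-5; // guards against division by zero

// BatchNorm-style: each feature column is standardized across the batch.
const cols = X[0].map((_, j) => X.map((row) => row[j]));
const batchNormed = X.map((row) =>
  row.map((v, j) => (v - mean(cols[j])) / (std(cols[j]) + eps))
);

// LayerNorm-style: each sample row is standardized across its features.
const layerNormed = X.map((row) =>
  row.map((v) => (v - mean(row)) / (std(row) + eps))
);

console.log(batchNormed); // columns now have mean 0, std 1
console.log(layerNormed); // rows now have mean 0, std 1
```

Real BatchNorm and LayerNorm layers also apply a learnable scale and shift (gamma, beta) after this standardization; the sketch omits them.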
Deepbox Modules Used
- deepbox/ndarray
- deepbox/nn
What You Will Learn
- BatchNorm1d normalizes across batch — uses running stats in eval mode
- LayerNorm normalizes across features — batch-size independent, used in Transformers
- Dropout randomly zeros each element with probability p during training for regularization
- Always switch to .eval() mode before inference, since it affects BatchNorm and Dropout (see the sketch after this list)
- Placement conventions differ: BatchNorm typically sits before the activation, LayerNorm after the sublayer; both are valid patterns
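Since BatchNorm and Dropout are the mode-dependent layers here, the usual pattern is to flip every such layer before each phase, as sketched below using only the deepbox classes and methods that appear in this example:

```typescript
import { randn } from "deepbox/ndarray";
import { BatchNorm1d, Dropout } from "deepbox/nn";

// A tiny hypothetical stack built from this example's mode-dependent layers.
const bn = new BatchNorm1d(4);
const drop = new Dropout(0.5);

// Training phase: batch statistics are updated and dropout is active.
bn.train();
drop.train();
const trainOut = drop.forward(bn.forward(randn([3, 4])));

// Inference phase: running statistics are used and dropout passes through.
bn.eval();
drop.eval();
const inferOut = drop.forward(bn.forward(randn([3, 4])));
console.log(trainOut.shape, inferOut.shape); // [3, 4] [3, 4]
```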
Source Code
30-normalization-dropout/index.ts
```typescript
import { randn } from "deepbox/ndarray";
import { BatchNorm1d, LayerNorm, Dropout } from "deepbox/nn";

console.log("=== Normalization & Dropout ===\n");

// BatchNorm1d: normalize across the batch dimension
const bn = new BatchNorm1d(4);
const x = randn([3, 4]); // [batch=3, features=4]
console.log("Input:", x.shape);

bn.train();
const bnTrain = bn.forward(x);
console.log("BatchNorm (train):", bnTrain.shape);

bn.eval();
const bnEval = bn.forward(x);
console.log("BatchNorm (eval):", bnEval.shape);

// LayerNorm: normalize across features
const ln = new LayerNorm(4);
const lnOut = ln.forward(x);
console.log("\nLayerNorm:", lnOut.shape);

// Dropout: randomly zero elements during training
const drop = new Dropout(0.5); // 50% dropout
drop.train();
const dropTrain = drop.forward(x);
console.log("\nDropout (train) - some zeros:");
console.log(dropTrain.toString());

drop.eval();
const dropEval = drop.forward(x);
console.log("\nDropout (eval) - passthrough:");
console.log(dropEval.toString());
```
Console Output
```
$ npx tsx 30-normalization-dropout/index.ts

=== Normalization & Dropout ===

Input: [3, 4]
BatchNorm (train): [3, 4]
BatchNorm (eval): [3, 4]

LayerNorm: [3, 4]

Dropout (train) - some zeros:
[[0.00, 1.43, 0.00, -0.86],
 [-0.52, 0.00, 1.94, 0.00],
 [0.00, 0.68, 0.00, 2.14]]

Dropout (eval) - passthrough:
[[-0.73, 0.71, -1.22, -0.43],
 [-0.26, 0.34, 0.97, -1.05],
 [0.56, 0.34, -0.88, 1.07]]
```
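Note one detail in the dropout output: every surviving element in train mode is exactly twice its eval-mode value (0.34 → 0.68, -0.43 → -0.86, 1.07 → 2.14; 0.71 → 1.43 differs only by print rounding). That is the signature of inverted dropout: survivors are scaled by 1/(1-p) at train time so activations keep the same expected value and eval mode can be a plain passthrough. A minimal sketch of that behavior on a flat array (hypothetical helper, not the deepbox source):

```typescript
// Inverted dropout sketch (hypothetical helper, not the deepbox source).
// Each element is zeroed with probability p; survivors are scaled by
// 1 / (1 - p) so the expected activation is unchanged at train time.
function invertedDropout(x: number[], p: number, training: boolean): number[] {
  if (!training || p === 0) return x.slice(); // eval mode: passthrough
  const keep = 1 - p;
  return x.map((v) => (Math.random() < keep ? v / keep : 0));
}

const row = [-0.73, 0.71, -1.22, -0.43]; // first row of the example input
console.log(invertedDropout(row, 0.5, true));  // e.g. [0, 1.42, 0, -0.86]
console.log(invertedDropout(row, 0.5, false)); // unchanged
```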