Example 15: Activation Functions
Categories: Activations, Neural Networks
Activation functions introduce non-linearity into neural networks, enabling them to learn complex patterns. This example computes every activation function in Deepbox on the same input tensor [-2, -1, 0, 1, 2] so you can compare their outputs side by side. You will see that ReLU zeroes negatives, Sigmoid squashes values into (0, 1), Softmax produces a probability distribution, GELU smoothly approximates ReLU, Mish and Swish are self-gated, ELU and LeakyReLU handle negative inputs differently, and LogSoftmax is numerically stable for NLL loss. The example also generates an SVG overlay plot of all activations for visual comparison.
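For reference, these are the standard textbook definitions of the activations used below (Φ is the standard normal CDF; GELU is shown in its exact form, though some libraries, possibly including Deepbox, implement the tanh approximation instead):

\begin{aligned}
\mathrm{ReLU}(x) &= \max(0, x) & \sigma(x) &= \frac{1}{1 + e^{-x}} \\
\mathrm{Softplus}(x) &= \ln(1 + e^{x}) & \mathrm{GELU}(x) &= x\,\Phi(x) \\
\mathrm{Swish}(x) &= x\,\sigma(x) & \mathrm{Mish}(x) &= x\tanh\bigl(\mathrm{Softplus}(x)\bigr) \\
\mathrm{ELU}_{\alpha}(x) &= \begin{cases} x & x > 0 \\ \alpha\,(e^{x} - 1) & x \le 0 \end{cases} &
\mathrm{LeakyReLU}_{\alpha}(x) &= \begin{cases} x & x > 0 \\ \alpha x & x \le 0 \end{cases} \\
\mathrm{Softmax}(x)_i &= \frac{e^{x_i}}{\sum_j e^{x_j}} & \mathrm{LogSoftmax}(x)_i &= x_i - \ln\sum_j e^{x_j}
\end{aligned}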
Deepbox Modules Used
deepbox/ndarray
deepbox/plot
What You Will Learn
- ReLU is the default choice — simple and effective
- Sigmoid for binary output, Softmax for multi-class probabilities
- GELU/Mish/Swish are modern smooth alternatives to ReLU
- ELU and LeakyReLU avoid the 'dying ReLU' problem by keeping a non-zero gradient for negative inputs
- LogSoftmax is numerically stable for NLL loss computation (see the sketch after this list)
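To illustrate the stability point, here is a minimal plain-TypeScript sketch (it uses no Deepbox calls; the helpers naiveSoftmax and stableLogSoftmax are ours, not library API) comparing a naive log(softmax(x)) against the log-sum-exp formulation that logSoftmax-style functions typically use:

// Large logits overflow Math.exp, so the naive route produces NaN.
const bigLogits = [1000, 999, 998];

// Naive route: softmax first, then log.
const naiveSoftmax = (xs: number[]): number[] => {
  const exps = xs.map((x) => Math.exp(x));      // [Infinity, Infinity, Infinity]
  const sum = exps.reduce((a, b) => a + b, 0);  // Infinity
  return exps.map((e) => e / sum);              // [NaN, NaN, NaN]
};
console.log(naiveSoftmax(bigLogits).map((p) => Math.log(p))); // [NaN, NaN, NaN]

// Stable route: logSoftmax(x)_i = (x_i - m) - log(sum_j exp(x_j - m)), with m = max(x).
const stableLogSoftmax = (xs: number[]): number[] => {
  const m = Math.max(...xs);
  const logSumExp = Math.log(xs.reduce((acc, x) => acc + Math.exp(x - m), 0));
  return xs.map((x) => x - m - logSumExp);
};
console.log(stableLogSoftmax(bigLogits)); // ≈ [-0.408, -1.408, -2.408]

This is why NLL loss is computed from LogSoftmax rather than from log(Softmax).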
Source Code
15-activation-functions/index.ts
import { tensor, relu, sigmoid, softmax, gelu, mish, swish, elu, leakyRelu, softplus, logSoftmax } from "deepbox/ndarray";

const t = tensor([-2, -1, 0, 1, 2]);

console.log("Input: ", t.toString());
console.log("ReLU: ", relu(t).toString());
console.log("Sigmoid: ", sigmoid(t).toString());
console.log("GELU: ", gelu(t).toString());
console.log("Mish: ", mish(t).toString());
console.log("Swish: ", swish(t).toString());
console.log("ELU(α=1): ", elu(t, 1.0).toString());
console.log("LeakyReLU: ", leakyRelu(t, 0.1).toString());
console.log("Softplus: ", softplus(t).toString());

const logits = tensor([[2.0, 1.0, 0.1]]);
console.log("\nSoftmax: ", softmax(logits, 1).toString());
console.log("LogSoftmax: ", logSoftmax(logits, 1).toString());
Console Output
$ npx tsx 15-activation-functions/index.ts
Input: [-2, -1, 0, 1, 2]
ReLU: [0, 0, 0, 1, 2]
Sigmoid: [0.119, 0.269, 0.5, 0.731, 0.881]
GELU: [-0.045, -0.159, 0, 0.841, 1.955]
Mish: [-0.254, -0.303, 0, 0.865, 1.944]
Swish: [-0.238, -0.269, 0, 0.731, 1.762]
ELU(α=1): [-0.865, -0.632, 0, 1, 2]
LeakyReLU: [-0.2, -0.1, 0, 1, 2]
Softplus: [0.127, 0.313, 0.693, 1.313, 2.127]
Softmax: [[0.659, 0.243, 0.099]]
LogSoftmax: [[-0.417, -1.417, -2.317]]
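As a quick sanity check on the last two rows: the Softmax row sums to 1 (0.659 + 0.243 + 0.099 ≈ 1), and exponentiating the LogSoftmax values recovers it (e^-0.417 ≈ 0.659, e^-1.417 ≈ 0.243, e^-2.317 ≈ 0.099), matching the identity LogSoftmax(x) = log(Softmax(x)).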