Every neural network, no matter how large, is built from a single repeating unit: the neuron. Understand one neuron and you understand the whole stack.
Biological vs. artificial neuron
The artificial neuron is loosely inspired by biology. Dendrites → inputs, synaptic strength → weights, cell body → weighted sum + activation, axon → output.
[Figure: biological neuron vs. artificial neuron]
Each input x corresponds to a dendrite, each weight w corresponds to synaptic strength, and the summation node is the cell body deciding whether to "fire."
Interactive: adjust weights & inputs
Drag the sliders to change input values and weights. Watch the weighted sum z and final output a update in real time.
[Interactive widget] Sliders set the inputs (x₁ = 1.5, x₂ = −2.0, x₃ = 0.8), the weights and bias (w₁ = 0.40, w₂ = −0.50, w₃ = 0.30, b = 0.10), and the activation function. With ReLU selected, the live computation shows weighted sum z = 1.94 and output a = f(z) = 1.94; the neuron fires.
The forward pass — step by step
Data flows left to right through the neuron. Click through each stage using the example values (x₁=1.5, x₂=−2.0, x₃=0.8 with ReLU).
Key equations
Weighted sum
z = w₁x₁ + w₂x₂ + … + wₙxₙ + b = Σᵢ(wᵢ · xᵢ) + b
Each input multiplied by its weight, summed, then shifted by the bias.
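The weighted sum for the example values above can be sketched in a few lines of Python (a minimal sketch; the variable names are mine, not from the widget):

```python
# Example inputs, weights, and bias from the interactive demo above.
x = [1.5, -2.0, 0.8]
w = [0.40, -0.50, 0.30]
b = 0.10

# z = Σᵢ (wᵢ · xᵢ) + b
z = sum(wi * xi for wi, xi in zip(w, x)) + b
print(round(z, 2))  # 1.94
```

Each term contributes 0.60, 1.00, and 0.24 respectively; adding the bias 0.10 gives 1.94, matching the live computation.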
ReLU activation
a = max(0, z)
Returns z if positive, 0 otherwise. Most widely used in deep networks.
Sigmoid activation
a = 1 / (1 + e⁻ᶻ)
Squashes output to (0, 1). Used in binary classification output layers.
Tanh activation
a = (eᶻ − e⁻ᶻ) / (eᶻ + e⁻ᶻ)
Squashes output to (−1, 1). Zero-centred — often preferred over sigmoid in hidden layers.
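All three activations can be implemented directly from the formulas above. A minimal sketch (in practice you would use a library such as NumPy or PyTorch, which provide these built in):

```python
import math

def relu(z):
    # a = max(0, z)
    return max(0.0, z)

def sigmoid(z):
    # a = 1 / (1 + e^-z), output in (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):
    # a = (e^z - e^-z) / (e^z + e^-z), output in (-1, 1)
    return (math.exp(z) - math.exp(-z)) / (math.exp(z) + math.exp(-z))

z = 1.94  # weighted sum from the example above
print(relu(z))     # 1.94
print(sigmoid(z))  # ≈ 0.874
print(tanh(z))     # ≈ 0.960
```

Note how ReLU passes the positive z through unchanged, while sigmoid and tanh squash it toward their upper bounds.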
Intuition: Think of weights as "importance dials" and the bias as a "default activation level" that lets the neuron fire even when all inputs are zero.
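That bias intuition can be checked directly: with all inputs at zero, the weighted sum collapses to b alone, so a positive bias is enough to make a ReLU neuron fire. A small sketch reusing the example weights (the helper name is my own):

```python
def neuron(x, w, b):
    """One artificial neuron: weighted sum followed by ReLU."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return max(0.0, z)

w = [0.40, -0.50, 0.30]

# All inputs zero: the output is determined entirely by the bias.
print(neuron([0.0, 0.0, 0.0], w, b=0.10))   # 0.1  (fires on bias alone)
print(neuron([0.0, 0.0, 0.0], w, b=-0.10))  # 0.0  (stays silent)
```

In other words, the bias shifts the neuron's firing threshold without reference to any input.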