Technical Terms
Layer Normalization
Definition
A normalization step applied inside each transformer block: it rescales a token's activations across the feature dimension to zero mean and unit variance, then applies a learned scale and shift, stabilising activations so deep models train and run reliably.
In Plain English
A balancing step that keeps the math inside deep models from drifting too wildly.
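The normalization step described above can be sketched in a few lines of NumPy; `gamma`, `beta`, and `eps` are the usual names for the learned scale, learned shift, and numerical-stability constant, but the function name and shapes here are illustrative, not a specific library's API.

```python
import numpy as np

def layer_norm(x, gamma, beta, eps=1e-5):
    # Normalize each row (one token's feature vector) to zero mean and
    # unit variance, then apply the learned scale (gamma) and shift (beta).
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# One token with four features; identity scale/shift leaves the
# normalized values unchanged.
x = np.array([[1.0, 2.0, 3.0, 4.0]])
out = layer_norm(x, gamma=np.ones(4), beta=np.zeros(4))
```

After normalization each row has mean approximately 0 and variance approximately 1, which is the "balancing" effect: no matter how large or small the incoming activations were, the next layer always sees values on a consistent scale.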