Normalization (machine learning)

In machine learning, normalization is a statistical technique with various applications. There are two main forms of normalization, namely data normalization and activation normalization. Data normalization (or feature scaling) includes methods that rescale input data so that the features have the same range, mean, variance, or other statistical properties. For instance, a popular choice of feature scaling method is min-max normalization, where each feature is transformed to have the same range (typically [0, 1] or [−1, 1]). This solves the problem of different features having vastly different scales, for example if one feature is measured in kilometers and another in nanometers.
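The min-max rescaling described above can be sketched in a few lines of NumPy. This is an illustrative implementation, not taken from any particular library; the function name and the guard for constant features are choices made here for the example.

```python
import numpy as np

def min_max_normalize(X, feature_range=(0.0, 1.0)):
    """Rescale each column (feature) of X to the given range."""
    lo, hi = feature_range
    X = np.asarray(X, dtype=float)
    X_min = X.min(axis=0)
    X_max = X.max(axis=0)
    # Guard against division by zero for constant features.
    span = np.where(X_max > X_min, X_max - X_min, 1.0)
    return lo + (X - X_min) / span * (hi - lo)

# One feature on a kilometer scale, one on a nanometer scale:
X = np.array([[1.0, 5e-9],
              [3.0, 1e-9],
              [5.0, 9e-9]])
X_scaled = min_max_normalize(X)   # both columns now span [0, 1]
```

After scaling, both features occupy the same [0, 1] range, so neither dominates distance computations or gradient updates purely because of its units.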

Activation normalization, on the other hand, is specific to deep learning, and includes methods that rescale the activation of hidden neurons inside neural networks.

Normalization is often used to speed up and stabilize training, reduce sensitivity to the scale of input features, and improve generalization to unseen data.

Normalization techniques are often theoretically justified as reducing internal covariate shift, smoothing optimization landscapes, and increasing regularization, though they are mainly justified by empirical success.

Batch normalization (BatchNorm) operates on the activations of a layer for each mini-batch.

Consider a simple feedforward network, defined by chaining together modules:

x^(l+1) = F^(l)(x^(l))

where each network module F^(l) can be a linear transform, a nonlinear activation function, a convolution, etc. Here x^(0) is the input vector, x^(1) is the output vector from the first module, etc.
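A minimal sketch of how BatchNorm transforms one such activation vector x^(l) for a mini-batch: each feature is standardized using the mini-batch mean and variance, then rescaled by learned parameters (commonly written gamma and beta). This is an illustrative NumPy version of the training-time computation, not a production implementation (it omits the running statistics used at inference).

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch normalization over a mini-batch x of shape (batch, features).

    Standardizes each feature with the mini-batch mean and variance,
    then applies the learned scale (gamma) and shift (beta).
    """
    mu = x.mean(axis=0)              # per-feature mean over the batch
    var = x.var(axis=0)              # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=3.0, size=(64, 4))  # a mini-batch of activations
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
```

With gamma = 1 and beta = 0 the output has approximately zero mean and unit variance per feature; training the scale and shift lets the network recover the original activation distribution if that turns out to be useful.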
