1. Why CNN?
- Translation invariance: small shifts of an object in the input still produce similar feature responses.
- Preserves spatial information: convolutions operate on local neighborhoods, so the 2D structure of the input is retained through the layers.
- Shared weights: the same filter is reused at every spatial position, which reduces memory requirements and computation time (see the sketch after this list).
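A minimal sketch (assuming PyTorch; the layer sizes are illustrative) of why shared weights matter: a small conv layer and a fully connected layer covering the same 32x32 RGB input differ in parameter count by several orders of magnitude.

```python
import torch.nn as nn

# Conv layer: one 3x3x3 filter per output channel, reused at every spatial position.
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3)

# Fully connected layer producing the same 16x30x30 output: one weight per input-output pair.
fc = nn.Linear(3 * 32 * 32, 16 * 30 * 30)

conv_params = sum(p.numel() for p in conv.parameters())
fc_params = sum(p.numel() for p in fc.parameters())

print(f"conv parameters: {conv_params:,}")  # 448 (16*3*3*3 weights + 16 biases)
print(f"fc parameters:   {fc_params:,}")    # 44,251,200
```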
2. Batch Normalization
- Batch normalization is a technique used in deep learning to improve the performance and stability of neural networks.
- It normalizes the activations of the neurons in a layer over each mini-batch of training data. This reduces internal covariate shift, i.e. the change in the distribution of a layer's inputs caused by the weights updating during training. Normalization scales and shifts each feature's activations to zero mean and unit standard deviation, followed by a learnable scale (gamma) and shift (beta).
- Batch normalization can improve the convergence of the training process. It can also regularize the model, which can improve its generalization performance on unseen data.
- Batch normalization is typically applied before the activation function of a layer (a minimal forward pass is sketched below).
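A minimal NumPy sketch of the batch-norm forward pass described above, assuming a mini-batch `x` of shape `(batch_size, num_features)`; `gamma` and `beta` are the learnable scale and shift parameters.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    mu = x.mean(axis=0)                     # per-feature mean over the mini-batch
    var = x.var(axis=0)                     # per-feature variance over the mini-batch
    x_hat = (x - mu) / np.sqrt(var + eps)   # zero mean, unit std per feature
    return gamma * x_hat + beta             # learnable scale and shift

x = np.random.randn(64, 100) * 5 + 3        # activations with arbitrary mean/std
out = batch_norm_forward(x, gamma=np.ones(100), beta=np.zeros(100))
print(out.mean(axis=0)[:3], out.std(axis=0)[:3])   # ~0 and ~1 per feature
```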
3. Commonly used Activation Functions (implemented in the sketch after this list)
- Sigmoid
- Hyperbolic Tangent Function (tanh(x))
- Rectified linear activation (ReLU)
- Leaky rectified linear activation (Leaky ReLU)
- Swish
- Mish
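A minimal NumPy sketch of the activations listed above; the 0.01 slope in `leaky_relu` and the fixed `beta=1.0` in `swish` are common defaults, not requirements.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    return np.where(x > 0, x, alpha * x)    # small slope for negative inputs

def swish(x, beta=1.0):
    return x * sigmoid(beta * x)            # a.k.a. SiLU when beta = 1

def softplus(x):
    return np.log1p(np.exp(x))

def mish(x):
    return x * np.tanh(softplus(x))

x = np.linspace(-3, 3, 7)
for f in (sigmoid, tanh, relu, leaky_relu, swish, mish):
    print(f.__name__, np.round(f(x), 3))
```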
4. Bias and Variance
- In machine learning, bias and variance are two sources of error that affect the accuracy of a model. Bias is the systematic difference between a model's predictions and the true values of the underlying data. It arises when a model is overly simplified or makes incorrect assumptions about the data. High bias leads to underfitting, where the model cannot capture the patterns in the data.
- Variance, on the other hand, is the variability of a model's predictions for a given input across different training sets. It arises when a model is complex enough to fit the noise in the training data rather than the underlying signal. High variance leads to overfitting, where the model is too specific to the training data and fails to generalize to new data.
- In general, a model with low bias and low variance is desirable because it captures the underlying patterns in the data and makes reliable predictions on new data. Finding the right balance between bias and variance is an important part of model training and evaluation (see the sketch below).
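A minimal sketch of the trade-off, fitting polynomials of increasing degree to noisy samples of a sine curve: degree 1 underfits (high bias), degree 15 overfits (high variance), and a middling degree tends to generalize best. The data and degrees are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 30))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, 30)   # signal + noise

x_test = np.linspace(0, 1, 100)
y_true = np.sin(2 * np.pi * x_test)                  # noiseless ground truth

for degree in (1, 4, 15):
    coeffs = np.polyfit(x, y, degree)                # fit on noisy training data
    y_pred = np.polyval(coeffs, x_test)
    mse = np.mean((y_pred - y_true) ** 2)            # error against the true signal
    print(f"degree {degree:2d}: test MSE = {mse:.3f}")
```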