1. Why CNN?
- Translation invariance: a pattern learned in one part of an image can be detected anywhere in the image.
- Preserves spatial information: feature maps keep the 2-D layout of the input.
- Shared weights: the same kernel is reused at every spatial position, which reduces memory requirements and computation time (illustrated in the first sketch below).

2. Batch Normalization
Batch normalization is a technique used in deep learning to improve the performance and stability of neural networks. It normalizes the activations of the neurons in a layer over each mini-batch of training data. This reduces internal covariate shift, i.e. the change in the distribution of a layer's inputs caused by the weights changing during training. Normalization scales the activations of a layer to have zero mean and unit standard deviation. Batch normalization can improve the convergence of training, and it also regularizes the model, which can improve generalization on unseen data. Batch normalization is typically applied before the activation function of a layer (see the second sketch below).
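A minimal sketch of the shared-weights point, assuming PyTorch (the channel counts and the 32x32 input size are arbitrary choices for illustration): a convolution's parameter count depends only on the kernel, not on the input resolution, while a fully connected layer over the same tensor would need millions of weights.

```python
import torch
import torch.nn as nn

# Convolution: one 3x3 kernel per (in, out) channel pair, shared across
# every spatial position of the input.
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)

# A fully connected layer mapping the same 3x32x32 input to a 16x32x32
# output, for comparison (no weight sharing).
fc = nn.Linear(3 * 32 * 32, 16 * 32 * 32)

n_conv = sum(p.numel() for p in conv.parameters())  # 16*3*3*3 + 16 = 448
n_fc = sum(p.numel() for p in fc.parameters())      # ~50 million

x = torch.randn(1, 3, 32, 32)
y = conv(x)  # output keeps the spatial layout: shape (1, 16, 32, 32)
print(n_conv, n_fc, y.shape)
```

The preserved (1, 16, 32, 32) output shape is the "preserves spatial information" point: the feature map is still a grid, so spatial relationships in the input survive the layer.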
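A second sketch showing what batch normalization computes for one mini-batch, checked against PyTorch's `nn.BatchNorm1d` in training mode (the batch size of 64 and the 10 features are illustrative):

```python
import torch
import torch.nn as nn

x = torch.randn(64, 10)  # mini-batch of 64 samples, 10 features

# Manual batch norm: per-feature zero mean and unit standard deviation,
# followed by a learnable affine transform gamma * x_hat + beta.
eps = 1e-5
mean = x.mean(dim=0)
var = x.var(dim=0, unbiased=False)  # batch norm uses the biased variance
x_hat = (x - mean) / torch.sqrt(var + eps)
gamma, beta = torch.ones(10), torch.zeros(10)  # learnable parameters in practice
manual = gamma * x_hat + beta

# PyTorch reference: in training mode it normalizes with the current
# mini-batch statistics, exactly as above.
bn = nn.BatchNorm1d(10)
bn.train()
reference = bn(x)

print(torch.allclose(manual, reference, atol=1e-6))  # True
```

The learnable gamma and beta let the network undo the normalization where that helps, so the layer constrains the activation distribution without limiting what the network can represent.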