1. Why CNN?
- Translation invariance
- Preserves spatial information
- Shared weights: reduce memory requirement and computation time

2. Batch Normalization
Batch normalization is a technique used in deep learning to improve the performance and stability of neural networks. It normalizes the activations of the neurons in a layer for each mini-batch of training data. This reduces internal covariate shift, which is the change in the distribution of a layer's inputs caused by the changing weights during training. Normalization is done by scaling the activations of a layer to have zero mean and unit standard deviation; learnable scale and shift parameters (gamma and beta) then restore the layer's representational capacity. Batch normalization can improve the convergence of the training process. It also acts as a regularizer, which can improve the model's generalization performance on unseen data. Batch normalization is typically applied before the activation function of a layer.
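A minimal sketch of the batch-normalization computation (training-time batch statistics only, no running averages), assuming NumPy; gamma and beta stand in for the learnable scale and shift parameters:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize a mini-batch to zero mean and unit variance per feature,
    then apply the learnable scale (gamma) and shift (beta)."""
    mu = x.mean(axis=0)                    # per-feature mean over the batch
    var = x.var(axis=0)                    # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalized activations
    return gamma * x_hat + beta            # scaled and shifted output

# Example: a mini-batch of 4 samples with 3 features each
x = np.random.randn(4, 3) * 10 + 5
y = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
print(y.mean(axis=0), y.std(axis=0))       # ~0 mean, ~1 std per feature
```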
(Fig. a: final blended image, Fig. b: background image, Fig. c: foreground image)

Alpha blending
Alpha blending is the process of overlaying a foreground image with transparency over a background image. The transparent image is generally a PNG image. It consists of four channels (RGBA); the fourth channel is the alpha channel, which holds the transparency magnitude. Image (b) is the background image and image (c) is the foreground / overlay image. Image (a) is the final blended image obtained by blending the overlay image with the background using the alpha mask. Below is the image (Fig. d) of the alpha channel of the overlay image.

(Fig. d: alpha channel of the overlay image)

At every pixel we blend the foreground color (F) and the background color (B) using the alpha mask. The alpha value at each pixel lies in the range (0, 255): a pixel intensity of 0 means black (fully transparent, only the background shows) and a pixel intensity of 255 means white (fully opaque, only the foreground shows). Normalizing alpha to [0, 1], the blended color at each pixel is: blended = alpha * F + (1 - alpha) * B.
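A minimal sketch of this per-pixel blend, assuming OpenCV and NumPy; the file names are placeholders and the foreground is assumed to be a 4-channel PNG:

```python
import cv2
import numpy as np

# Placeholder file names: a 4-channel (BGRA) overlay and a 3-channel background.
foreground = cv2.imread("foreground.png", cv2.IMREAD_UNCHANGED)
background = cv2.imread("background.jpg")

# Split the overlay into its color channels and its alpha mask.
overlay_bgr = foreground[:, :, :3].astype(float)
alpha = foreground[:, :, 3].astype(float) / 255.0          # normalize alpha to [0, 1]
alpha = np.dstack([alpha, alpha, alpha])                    # replicate to 3 channels

# Resize the background to match the overlay.
background = cv2.resize(
    background, (foreground.shape[1], foreground.shape[0])
).astype(float)

# Blend: alpha * F + (1 - alpha) * B, applied at every pixel.
blended = alpha * overlay_bgr + (1.0 - alpha) * background
cv2.imwrite("blended.png", blended.astype(np.uint8))
```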