π General Code Snippets in ML

π₯ Sigmoid Function

A function that computes gradients to optimize loss functions using backpropagation
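A minimal NumPy sketch of the sigmoid (the vectorized form below is the conventional one, assumed here):

```python
import numpy as np

def sigmoid(z):
    """
    Argument:
    z -- a scalar or numpy array of any size

    Returns:
    s -- sigmoid(z), computed element-wise
    """
    s = 1 / (1 + np.exp(-z))
    return s
```

Note that the derivative is simply `s * (1 - s)`, which is what makes the sigmoid cheap to use in backpropagation.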

A related helper that flattens an image array into a column vector:

    def arr2vec(image):
        """
        Argument:
        image -- a numpy array of shape (length, height, depth)

        Returns:
        v -- a vector of shape (length*height*depth, 1)
        """
        v = image.reshape(image.shape[0] * image.shape[1] * image.shape[2], 1)

        return v

π₯ Normalizing Rows

Dividing each row vector of x by its norm.
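A sketch of one way to do this with NumPy (`normalize_rows` is an assumed name; `keepdims=True` lets broadcasting divide each row by its own norm):

```python
import numpy as np

def normalize_rows(x):
    """
    Argument:
    x -- a numpy array of shape (n, m)

    Returns:
    x -- the row-normalized (by L2 norm) array
    """
    # Norm of each row; keepdims=True gives shape (n, 1) so the
    # division below broadcasts across each row.
    x_norm = np.linalg.norm(x, axis=1, keepdims=True)
    x = x / x_norm
    return x
```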

π¨ Softmax Function

A normalizing function used when the algorithm needs to classify two or more classes
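A sketch of a row-wise softmax in NumPy; subtracting the row maximum before exponentiating is a standard numerical-stability trick and does not change the result:

```python
import numpy as np

def softmax(x):
    """
    Argument:
    x -- a numpy array of shape (n, m)

    Returns:
    s -- row-wise softmax of x, same shape; each row sums to 1
    """
    # Shift by the row max for numerical stability, then normalize.
    x_exp = np.exp(x - np.max(x, axis=1, keepdims=True))
    x_sum = np.sum(x_exp, axis=1, keepdims=True)
    s = x_exp / x_sum
    return s
```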

π€ΈββοΈ L1 Loss Function

The loss is used to evaluate the performance of the model. The bigger the loss is, the more different that predictions ( yΜ ) are from the true values ( y ). In deep learning, we use optimization algorithms like Gradient Descent to train the model and to minimize the cos
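A sketch of the L1 loss in vectorized NumPy form:

```python
import numpy as np

def L1(yhat, y):
    """
    Arguments:
    yhat -- predicted labels, numpy array of size m
    y -- true labels, numpy array of size m

    Returns:
    loss -- the L1 loss: sum over |y - yhat|
    """
    loss = np.sum(np.abs(y - yhat))
    return loss
```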

π€ΈββοΈ L2 Loss Function

The loss is used to evaluate the performance of the model. The bigger the loss is, the more different that predictions ( yΜ ) are from the true values ( y ). In deep learning, we use optimization algorithms like Gradient Descent to train the model and to minimize the cost.
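A sketch of the L2 loss in vectorized NumPy form:

```python
import numpy as np

def L2(yhat, y):
    """
    Arguments:
    yhat -- predicted labels, numpy array of size m
    y -- true labels, numpy array of size m

    Returns:
    loss -- the L2 loss: sum over (y - yhat)^2
    """
    loss = np.sum((y - yhat) ** 2)
    return loss
```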

πββοΈ Propagation Function

Doing the "forward" and "backward" propagation steps for learning the parameters.

The goal is to learn Ο and b by minimizing the cost function J. For a parameter Ο
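A sketch of a propagate step for logistic regression (the shapes and the cross-entropy cost are the usual conventions, assumed here):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def propagate(w, b, X, Y):
    """
    Arguments:
    w -- weights, numpy array of shape (n_x, 1)
    b -- bias, a scalar
    X -- data of shape (n_x, m)
    Y -- true labels of shape (1, m)

    Returns:
    grads -- dict holding dw and db, the gradients of the cost
    cost -- the cross-entropy cost
    """
    m = X.shape[1]

    # Forward propagation: activations and cost
    A = sigmoid(np.dot(w.T, X) + b)
    cost = -np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)) / m

    # Backward propagation: gradients of the cost w.r.t. w and b
    dw = np.dot(X, (A - Y).T) / m
    db = np.sum(A - Y) / m

    return {"dw": dw, "db": db}, cost
```

One gradient-descent step then updates `w = w - alpha * dw` and `b = b - alpha * db`.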

πΈ Basic Code Snippets for Simple NN

Functions of a 2-layer NN

An input layer, one hidden layer, and an output layer

π Parameter Initialization

Initializing the Ws and bs. The weight matrices (Ws) must be initialized randomly to break symmetry; the bias vectors (bs) can be initialized to zero.

Each layer accepts its input, processes it with its activation function, and passes the result to the next layer.
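A sketch of this initialization for the 2-layer network above (the layer-size arguments and the 0.01 scaling factor are conventional assumptions):

```python
import numpy as np

def initialize_parameters(n_x, n_h, n_y):
    """
    Arguments:
    n_x -- size of the input layer
    n_h -- size of the hidden layer
    n_y -- size of the output layer

    Returns:
    params -- dict with W1, b1, W2, b2
    """
    # Small random weights break symmetry between hidden units;
    # zero initialization is fine for the biases.
    W1 = np.random.randn(n_h, n_x) * 0.01
    b1 = np.zeros((n_h, 1))
    W2 = np.random.randn(n_y, n_h) * 0.01
    b2 = np.zeros((n_y, 1))
    return {"W1": W1, "b1": b1, "W2": W2, "b2": b2}
```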