Introduction
Beginning to solve problems of computer vision with TensorFlow and Keras
The MNIST database (Modified National Institute of Standards and Technology database)
Fashion-MNIST consists of a training set of 60,000 examples and a test set of 10,000 examples.
Types:
- MNIST: handwritten digits
- Fashion-MNIST: fashion articles (clothing images)

Properties (verified in the snippet below):
- Grayscale
- 28x28 px
- 10 different categories
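As a minimal sketch, the dataset shapes and labels can be checked directly with the Fashion-MNIST loader bundled with Keras:

```python
import tensorflow as tf
import numpy as np

# Fashion-MNIST ships with Keras; load_data returns the train and test splits.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()

print(x_train.shape)       # (60000, 28, 28) -> 60,000 grayscale 28x28 images
print(x_test.shape)        # (10000, 28, 28) -> 10,000 test images
print(np.unique(y_train))  # [0 1 2 3 4 5 6 7 8 9] -> 10 categories
```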
The main purpose of an activation function is to convert the input signal of a node in a neural network into an output signal. That output signal is then used as an input to the next layer in the stack.
Values in MNIST are between 0 and 255, but neural networks work better with normalized data, so we can divide every value by 255 to bring the values into the range 0 to 1, as shown below.
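A short sketch of that normalization step (the variable names follow the loading snippet above):

```python
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()

# Pixel values arrive as integers in [0, 255]; dividing by 255.0
# rescales them to floats in [0, 1] before training.
x_train, x_test = x_train / 255.0, x_test / 255.0
```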
There are multiple criteria for stopping the training process; we can specify a number of epochs, a threshold, or both:
- Epochs: the number of full passes over the training data
- Threshold: a target accuracy or loss checked after each epoch
- Threshold combined with a maximum number of epochs
We can check the accuracy at the end of each epoch using Callbacks, as in the sketch below.
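A minimal sketch of such a callback; the class name and the 95% threshold are example choices, not fixed by the notes:

```python
import tensorflow as tf

class AccuracyThresholdCallback(tf.keras.callbacks.Callback):
    """Stops training once the reported accuracy crosses a chosen threshold."""

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        if logs.get('accuracy', 0.0) > 0.95:  # example threshold value
            print('\nReached 95% accuracy, stopping training.')
            self.model.stop_training = True

# Combined with a maximum number of epochs:
# model.fit(x_train, y_train, epochs=10, callbacks=[AccuracyThresholdCallback()])
```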
| Term | Description |
| --- | --- |
| Sequential | Defines a SEQUENCE of layers in the neural network |
| Flatten | Takes the 28x28 square and turns it into a 1-dimensional array (used for the input layer) |
| Dense | Adds a layer of neurons |
| Activation Function | A formula that introduces non-linear properties to our network |
| Relu | An activation function with the rule: if X > 0 return X, else return 0 |
| Softmax | An activation function that turns a set of values into probabilities, effectively picking the biggest one |
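As a minimal sketch tying these terms together (the layer sizes, optimizer and loss are example choices, not prescribed by these notes):

```python
import tensorflow as tf

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),    # 28x28 image -> 784-value vector
    tf.keras.layers.Dense(128, activation='relu'),    # hidden layer of neurons
    tf.keras.layers.Dense(10, activation='softmax')   # one probability per category
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# model.fit(x_train, y_train, epochs=5)
```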