General Concepts

General Concepts of Sequence Models

πŸ‘©β€πŸ« Notation

In the context of text processing (e.g., Natural Language Processing, NLP):

| Symbol | Description |
| :--- | :--- |
| $$X^{\langle t \rangle}$$ | The tth word in the input sequence |
| $$Y^{\langle t \rangle}$$ | The tth word in the output sequence |
| $$X^{(i)\langle t \rangle}$$ | The tth word in the ith input sequence |
| $$Y^{(i)\langle t \rangle}$$ | The tth word in the ith output sequence |
| $$T^{(i)}_x$$ | The length of the ith input sequence |
| $$T^{(i)}_y$$ | The length of the ith output sequence |
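As a quick illustration (just a sketch; the toy `inputs` list below and the 0-based Python indexing are assumptions for demonstration), the indices map onto pre-tokenized sequences like this:

```python
# Toy corpus (hypothetical): each inner list is one input sequence X^(i)
inputs = [
    ["The", "Girl", "Likes", "Apple", "And", "Berry"],  # X^(1)
    ["The", "Boy", "Likes", "The", "Book"],             # X^(2)
]

i, t = 0, 2               # Python is 0-based; the notation above is 1-based
x_i_t = inputs[i][t]      # X^(i)<t> : the t-th word of the i-th input sequence
T_x_i = len(inputs[i])    # T_x^(i)  : the length of the i-th input sequence

print(x_i_t, T_x_i)       # Likes 6
```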

πŸš€ One Hot Encoding

A way to represent words so that we can deal with them easily

πŸ”Ž Example

Let's say that we have a dictionary consisting of 10 words (🀭), and the words of the dictionary are:

  • Car, Pen, Girl, Berry, Apple, Likes, The, And, Boy, Book.

Our $$X^{(i)}$$ is: The Girl Likes Apple And Berry

So we can represent this sequence as follows πŸ‘€

Car   -0)  ⌈ 0 βŒ‰  ⌈ 0 βŒ‰  ⌈ 0 βŒ‰  ⌈ 0 βŒ‰  ⌈ 0 βŒ‰  ⌈ 0 βŒ‰
Pen   -1)  | 0 |  | 0 |  | 0 |  | 0 |  | 0 |  | 0 |
Girl  -2)  | 0 |  | 1 |  | 0 |  | 0 |  | 0 |  | 0 |
Berry -3)  | 0 |  | 0 |  | 0 |  | 0 |  | 0 |  | 1 |
Apple -4)  | 0 |  | 0 |  | 0 |  | 1 |  | 0 |  | 0 |
Likes -5)  | 0 |  | 0 |  | 1 |  | 0 |  | 0 |  | 0 |
The   -6)  | 1 |  | 0 |  | 0 |  | 0 |  | 0 |  | 0 |
And   -7)  | 0 |  | 0 |  | 0 |  | 0 |  | 1 |  | 0 |
Boy   -8)  | 0 |  | 0 |  | 0 |  | 0 |  | 0 |  | 0 |
Book  -9)  ⌊ 0 βŒ‹  ⌊ 0 βŒ‹  ⌊ 0 βŒ‹  ⌊ 0 βŒ‹  ⌊ 0 βŒ‹  ⌊ 0 βŒ‹

By representing sequences in this way, we can feed our data to neural networks ✨
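A minimal Python sketch of this encoding (the `vocab` list and `one_hot_encode` helper below are hypothetical names, written to match the example above):

```python
import numpy as np

# Hypothetical 10-word dictionary from the example above
vocab = ["Car", "Pen", "Girl", "Berry", "Apple",
         "Likes", "The", "And", "Boy", "Book"]
word_to_index = {word: i for i, word in enumerate(vocab)}

def one_hot_encode(sentence, word_to_index):
    """Return a (vocab_size, T_x) matrix with one one-hot column per word."""
    tokens = sentence.split()
    encoding = np.zeros((len(word_to_index), len(tokens)))
    for t, word in enumerate(tokens):
        encoding[word_to_index[word], t] = 1.0
    return encoding

X_i = one_hot_encode("The Girl Likes Apple And Berry", word_to_index)
print(X_i.shape)  # (10, 6): a 10-dimensional vector for each of the 6 words
```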

πŸ™„ Disadvantage

  • If our dictionary consists of 10,000 words so each vector will be 10,000 dimensional πŸ€•

  • This representation can not capture semantic features πŸ’”
