Recurrent Neural Networks: Text Generation
Master Language Modeling – learn what RNNs are and which problems they solve by building your own Movie Reviews Classifier, and even train a model that can write code!
One of the most active areas of research in the field of Artificial Intelligence is Language Modeling. From AI assistants like Google Assistant, Siri or Alexa, to machine translation and text generation, the field is vast and always changing.
In terms of architecture, Recurrent Neural Networks (RNNs) have proven to be one of the most successful. This course will help you understand why RNNs are effective while building two of the most popular types of models – Generative Models and Discriminative Models.
What you’ll learn
- Learn about Embeddings – convert text to something that a computer can “understand” and process – learn about bag of words, one-hot encoding, and tuning your vocabulary size – all with comprehensive examples
- Learn what types of issues and limitations RNNs solve compared to traditional ANNs
- RNN Architecture and flavours – different architectures solve different issues
- Use Sampling to generate text – learn how it works
- Learn BPTT – Backpropagation Through Time – learn how gradients and loss are computed in an RNN
- Train your RNN using BPTT – write code that generates code
- Understand the problems with Vanilla RNNs – the Vanishing Gradient and Exploding Gradient problems
- Learn what a Long Short-Term Memory (LSTM) cell is, its architecture, and how it solves the Vanishing Gradient Problem
- Learn about the Gated Recurrent Unit (GRU) – and its pros and cons compared to the LSTM Cell
- Apply what you learned and build an RNN that will classify Movie Reviews for you!
- Build a Text Generator RNN – it will write code for you once you train it on enough data!
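As a small preview of the encoding topics above, here is a minimal sketch of one-hot and bag-of-words representations. The vocabulary and sentences are made up for illustration and are not course material.

```python
# Preview of two text encodings covered in the course: one-hot vectors
# and bag-of-words counts over a small, illustrative vocabulary.

def build_vocab(sentences):
    """Map each unique word to an integer index."""
    vocab = {}
    for sentence in sentences:
        for word in sentence.lower().split():
            if word not in vocab:
                vocab[word] = len(vocab)
    return vocab

def one_hot(word, vocab):
    """Return a one-hot vector for a single word."""
    vec = [0] * len(vocab)
    vec[vocab[word]] = 1
    return vec

def bag_of_words(sentence, vocab):
    """Count word occurrences; word order is discarded."""
    counts = [0] * len(vocab)
    for word in sentence.lower().split():
        counts[vocab[word]] += 1
    return counts

sentences = ["the movie was great", "the plot was thin"]
vocab = build_vocab(sentences)
print(one_hot("movie", vocab))               # a single 1 at the word's index
print(bag_of_words("the movie was great", vocab))
```

Note how the bag-of-words vector keeps counts but loses word order – one of the limitations that motivates sequence models like RNNs.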
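The sampling bullet above can be sketched in a few lines: turn the network's raw scores into probabilities with a temperature-scaled softmax, then draw the next token at random. The vocabulary and logits here are hypothetical, not the output of a real trained model.

```python
import math
import random

def softmax(logits, temperature=1.0):
    """Convert raw scores into a probability distribution.
    Lower temperature -> sharper, more deterministic sampling."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample(probs, rng):
    """Draw one index according to the distribution."""
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1

# Hypothetical next-character scores; a trained RNN would produce these.
vocab = ["a", "b", "c"]
logits = [2.0, 1.0, 0.1]
rng = random.Random(0)
probs = softmax(logits, temperature=0.8)
print(vocab[sample(probs, rng)])
```

Repeating this loop – feed the sampled token back in, score, sample again – is how a generative RNN writes text one token at a time.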
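The vanishing and exploding gradient problems mentioned above come from repeated multiplication during BPTT. A toy simulation (the weights and step counts are illustrative numbers only) shows the effect:

```python
# Illustrative only: BPTT multiplies the gradient by the recurrent weight
# once per time step, so small weights shrink it toward zero (vanishing)
# and large weights blow it up (exploding).

def backprop_through_time(weight, steps, grad=1.0):
    """Simulate how a gradient scales across `steps` time steps."""
    for _ in range(steps):
        grad *= weight
    return grad

print(backprop_through_time(0.5, 20))   # shrinks toward 0 (vanishing)
print(backprop_through_time(1.5, 20))   # grows very large (exploding)
```

LSTM and GRU cells, covered later in the course, add gated paths that let gradients flow across many time steps without this repeated scaling.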
RNNs are still evolving. This course will give you a good introduction to the field, but remember – the sky’s the limit!
Requirements
- Basic Python knowledge
- Completion of the Artificial Neural Networks course or similar