Various Sequence to Sequence Architectures
Speech Recognition - Audio Data
Quiz
Programming Assignments
Neural Machine Translation with Attention Models
Here's what you should remember:
- Machine translation models can be used to map from one sequence to another. They are useful not just for translating human languages (like French->English) but also for tasks like date format translation.
- An attention mechanism allows a network to focus on the most relevant parts of the input when producing a specific part of the output.
- A network using an attention mechanism can translate from inputs of length $T_x$ to outputs of length $T_y$, where $T_x$ and $T_y$ can be different.
- You can visualize the attention weights $\alpha^{\langle t,t' \rangle}$ to see what the network is paying attention to while generating each output (a minimal sketch of one attention step follows this list).
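Below is a minimal NumPy sketch of one attention step for a single output time $t$: the encoder states are scored against the previous decoder state, the scores are normalized with a softmax to give the weights $\alpha^{\langle t,t' \rangle}$, and the context vector is their weighted sum of the encoder states. The dot-product scoring, the name `one_step_attention`, and the dimensions are illustrative assumptions; the programming assignment instead learns the scores with a small dense network, but the softmax-and-weighted-sum structure is the same.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def one_step_attention(a, s_prev):
    """
    a      : encoder hidden states for every input step t', shape (Tx, n_a)
    s_prev : previous decoder state, shape (n_a,)
    Returns (alphas, context): the weights alpha<t,t'> over the Tx input
    positions and the context vector used by the decoder at output step t.
    """
    energies = a @ s_prev        # dot-product score for each input position, shape (Tx,)
    alphas = softmax(energies)   # weights are non-negative and sum to 1 over t'
    context = alphas @ a         # weighted sum of encoder states, shape (n_a,)
    return alphas, context

# Tiny example: Tx = 4 input positions, hidden size n_a = 3.
rng = np.random.default_rng(0)
a = rng.standard_normal((4, 3))
s_prev = rng.standard_normal(3)
alphas, context = one_step_attention(a, s_prev)
print(alphas, alphas.sum())      # four attention weights that sum to ~1.0
```

Stacking the `alphas` from every output step $t$ gives the $T_y \times T_x$ matrix that is typically plotted to visualize what the network attends to.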
Trigger Word Detection
Here's what you should remember:
- Data synthesis is an effective way to create a large training set for speech problems such as trigger word detection (see the synthesis sketch below).
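Here is a minimal sketch of such data synthesis, using NumPy arrays as stand-ins for real audio clips: random "activate" and negative clips are overlaid on a background recording, and the label vector is set to 1 for a short window of output steps right after each trigger word ends. The constants (`BG_LEN`, `Ty`, the 50-step label window), the function names, and the additive mixing are illustrative assumptions rather than the assignment's exact API, and a real implementation would also prevent inserted clips from overlapping.

```python
import numpy as np

rng = np.random.default_rng(0)

BG_LEN = 10_000   # background length in samples (stand-in for a ~10 s clip)
Ty = 1_375        # number of output time steps the model predicts

def insert_clip(background, clip, start):
    """Overlay a clip onto the background at a given start sample (additive mix)."""
    out = background.copy()
    out[start:start + len(clip)] += clip
    return out

def synthesize_example(background, positives, negatives):
    """Overlay random positive ("activate") and negative clips on the background;
    labels are 1 for 50 output steps after each positive clip ends, 0 elsewhere."""
    x = background.copy()
    y = np.zeros(Ty, dtype=np.float32)
    # Negative (non-trigger) words change the audio but leave the labels at 0.
    for clip in negatives:
        start = rng.integers(0, BG_LEN - len(clip))
        x = insert_clip(x, clip, start)
    # Positive clips also mark the labels just after the word ends.
    for clip in positives:
        start = rng.integers(0, BG_LEN - len(clip))
        x = insert_clip(x, clip, start)
        end_step = int((start + len(clip)) * Ty / BG_LEN)  # map sample index to output step
        y[end_step + 1 : end_step + 51] = 1.0
    return x, y

# Tiny usage example with random noise standing in for recorded audio clips.
background = 0.01 * rng.standard_normal(BG_LEN)
activates = [0.1 * rng.standard_normal(800) for _ in range(2)]
negatives = [0.1 * rng.standard_normal(600) for _ in range(2)]
x, y = synthesize_example(background, activates, negatives)
print(x.shape, y.sum())   # synthesized audio and the number of positive label steps
```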