Deep Learning with Theano

You're reading from Deep Learning with Theano: Perform large-scale numerical and scientific computations efficiently

Product type: Paperback
Published: July 2017
Publisher: Packt
ISBN-13: 9781786465825
Length: 300 pages
Edition: 1st
Author (1): Bourez

Table of Contents (22 chapters)

Deep Learning with Theano
Credits
About the Author
Acknowledgments
About the Reviewers
www.PacktPub.com
Customer Feedback
Preface
1. Theano Basics
2. Classifying Handwritten Digits with a Feedforward Network
3. Encoding Word into Vector
4. Generating Text with a Recurrent Neural Net
5. Analyzing Sentiment with a Bidirectional LSTM
6. Locating with Spatial Transformer Networks
7. Classifying Images with Residual Networks
8. Translating and Explaining with Encoding-decoding Networks
9. Selecting Relevant Inputs or Memories with the Mechanism of Attention
10. Predicting Times Sequences with Advanced RNN
11. Learning from the Environment with Reinforcement
12. Learning Features with Unsupervised Generative Networks
13. Extending Deep Learning with Theano
Index

Differentiable mechanism of attention


When translating a sentence, describing the content of an image, annotating a sentence, or transcribing audio, it feels natural to focus on one part of the input sentence or image at a time, to grasp the meaning of that block and transform it, before moving on to the next part, following a certain order that builds a global understanding.

For example, in German, under certain conditions, the verb comes at the end of the sentence; so, when translating into English, once the subject has been read and translated, a good machine translation neural network could shift its focus to the end of the sentence to find the verb and translate it into English. This process of matching input positions to the current output prediction is made possible by the mechanism of attention, sketched below.
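
As a minimal sketch of this idea, the following Theano snippet computes soft attention weights over a set of encoder states and a context vector for the next output prediction. The variable names are hypothetical and, for brevity, alignment scores are computed with a simple dot product; the chapter's own model may score alignments differently:

import numpy as np
import theano
import theano.tensor as T

annotations = T.matrix('annotations')   # encoder states, shape (seq_len, dim)
state = T.vector('state')               # current decoder state, shape (dim,)

scores = T.dot(annotations, state)                      # one alignment score per input position
weights = T.nnet.softmax(scores.dimshuffle('x', 0))[0]  # non-negative weights that sum to 1
context = T.dot(weights, annotations)                   # weighted average of the encoder states

attend = theano.function([annotations, state], [weights, context])

h = np.random.randn(5, 4).astype(theano.config.floatX)
s = np.random.randn(4).astype(theano.config.floatX)
w, c = attend(h, s)
print(w, w.sum())   # attention weights over the 5 input positions, summing to 1
print(c)            # context vector used for the next output prediction

Because every operation here (dot products and a softmax) is differentiable, the attention weights can be learned by backpropagation along with the rest of the network.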

First, let's come back to the classification networks designed with a softmax layer (see Chapter 2, Classifying Handwritten Digits with a Feedforward Network), which outputs a non-negative weight...
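
For reference, the softmax (standard definition, not specific to this book's notation) maps a vector of scores z to non-negative weights that sum to one:

$$\mathrm{softmax}(z)_i = \frac{e^{z_i}}{\sum_j e^{z_j}}, \qquad \mathrm{softmax}(z)_i \geq 0, \qquad \sum_i \mathrm{softmax}(z)_i = 1$$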
