
Neural Question Answering and Recursive Neural Networks
Explore deep learning and neural question answering through the QANTA assignment and the Stanford QA corpus, and learn how to build recursive neural networks over dependency trees for text analysis in natural language processing.
Deep Learning: Neural Question Answering (including the QANTA Assignment)
Quiz Bowl Questions [Iyyer et al. 2014]
Stanford QA corpus [Rajpurkar et al. 2016]
Links about Theano and QANTA
- Theano tutorial: http://deeplearning.net/software/theano/tutorial/
- "A Neural Network for Factoid Question Answering over Paragraphs." Mohit Iyyer, Jordan Boyd-Graber, Leonardo Claudino, Richard Socher, and Hal Daumé III. EMNLP 2014. https://cs.umd.edu/~miyyer/qblearn/
- Question data: https://cs.umd.edu/~miyyer/data/question_data.tar.gz
- QANTA code: https://cs.umd.edu/~miyyer/qblearn/qanta.tar.gz
Factoid QA as a classification task: assume a predefined set of answers, say 10. Given a question, pick the best answer out of the 10.
QANTA Assignment
1. Get a dependency parse of the question.
2. Build a recursive neural network over the dependency tree.
3. Train this recursive neural network on the training data.
4. Use the trained recursive neural network as a feature extractor.
5. Train a logistic regression classifier over the features from the recursive neural network as the predictor.
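The last two steps above can be sketched as follows: treat the trained network's root hidden vectors as fixed features and fit a logistic regression classifier on top of them. This is a minimal illustration, not the assignment's actual code; the feature matrix here is random stand-in data, and all names and dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_questions, hidden_dim, n_answers = 200, 50, 10

# Stand-ins for features extracted by the recursive network (one root
# hidden vector per question) and the gold answer labels.
H = rng.normal(size=(n_questions, hidden_dim))
y = rng.integers(0, n_answers, size=n_questions)

# Multinomial logistic regression trained by plain gradient descent.
W = np.zeros((hidden_dim, n_answers))
b = np.zeros(n_answers)
for _ in range(300):
    logits = H @ W + b
    logits -= logits.max(axis=1, keepdims=True)   # for numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    probs[np.arange(n_questions), y] -= 1.0       # gradient of cross-entropy w.r.t. logits
    W -= 0.1 * (H.T @ probs) / n_questions
    b -= 0.1 * probs.mean(axis=0)

# Predict by picking the highest-scoring answer out of the 10.
predictions = np.argmax(H @ W + b, axis=1)
train_acc = (predictions == y).mean()
```

In the real assignment, `H` would come from the recursive network rather than a random generator, but the classification step is the same.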
Dependency-Tree Recursive Neural Networks
Unlike constituency parses, words reside at internal nodes, not only at the leaves. We therefore need to combine the word vector of a node with the hidden vectors of its children. [slides from Chris Hidey]
Dependency-Tree Recursive Neural Networks
h_helots = f(W_v x_helots + b)
h_called = f(W_v x_called + b + W_DOBJ h_helots)
Dependency-Tree Recursive Neural Networks
h_depended = f(W_v x_depended + b + W_NSUBJ h_economy + W_PREP h_on)
In general, for a node n with children K(n), where R(n, k) is the dependency relation between n and child k:
h_n = f(W_v x_n + b + Σ_{k ∈ K(n)} W_{R(n,k)} h_k)
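The node composition rule above can be sketched as a bottom-up recursion over a toy dependency tree ("economy depended on helots"). The dimensions, the randomly initialized matrices, and the hand-built tree are illustrative assumptions, not the assignment's trained parameters.

```python
import numpy as np

d = 8                                    # hidden/word vector dimension (assumed)
rng = np.random.default_rng(1)
f = np.tanh                              # element-wise nonlinearity

W_v = rng.normal(scale=0.1, size=(d, d)) # transforms the node's own word vector
b = np.zeros(d)
# One composition matrix per dependency relation, W_{R(n,k)}.
W_rel = {"NSUBJ": rng.normal(scale=0.1, size=(d, d)),
         "PREP":  rng.normal(scale=0.1, size=(d, d)),
         "POBJ":  rng.normal(scale=0.1, size=(d, d))}

word_vec = {w: rng.normal(size=d)
            for w in ["depended", "economy", "on", "helots"]}

# tree: node -> list of (relation, child) pairs
tree = {"depended": [("NSUBJ", "economy"), ("PREP", "on")],
        "on":       [("POBJ", "helots")],
        "economy":  [], "helots": []}

def hidden(node):
    """h_n = f(W_v x_n + b + sum over children k of W_{R(n,k)} h_k)."""
    total = W_v @ word_vec[node] + b
    for rel, child in tree[node]:
        total += W_rel[rel] @ hidden(child)
    return f(total)

h_root = hidden("depended")              # hidden vector at the root "depended"
```

Note that, unlike a constituency-tree network, every node contributes its own word vector `x_n` in addition to its children's hidden vectors.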
Activation Functions
Sigmoid: σ(x) = 1 / (1 + e^(−x))
Tanh (hyperbolic tangent): tanh(x) = (1 − e^(−2x)) / (1 + e^(−2x))
Rectified linear unit (ReLU): max(0, x)
Normalized tanh: f(x) = tanh(x) / ‖tanh(x)‖
Cost Function
C(S, θ) = Σ_{s ∈ S} Σ_{z ∈ Z_t} max(0, 1 − x_c · h_s + x_z · h_s)
where x_c is the embedding of the correct answer and x_z the embedding of an incorrect answer.
min_θ (1/N) Σ_{t ∈ T} C(t, θ),  θ = {W_R for each relation R, W_v, W_e, b}
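The inner term of this contrastive max-margin cost, for a single sentence, can be sketched as below: the correct answer's embedding should score higher than each wrong answer's by a margin of 1. The dimension and the random vectors are illustrative assumptions.

```python
import numpy as np

d = 16
rng = np.random.default_rng(2)
h_s = rng.normal(size=d)          # hidden vector at one sentence's root
x_c = rng.normal(size=d)          # embedding of the correct answer
wrong = rng.normal(size=(5, d))   # embeddings of incorrect answers z in Z_t

def sentence_cost(h, correct, incorrect):
    """Sum of hinge losses max(0, 1 - x_c.h + x_z.h) over wrong answers."""
    return sum(max(0.0, 1.0 - correct @ h + z @ h) for z in incorrect)

c = sentence_cost(h_s, x_c, wrong)
```

When the correct answer already beats every wrong answer by more than the margin, each hinge term is zero, so training pushes only on violated comparisons.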