
Advancements in Natural Language Processing and Deep Learning
Explore the latest trends in natural language learning, statistical machine learning, machine translation, deep learning breakthroughs, word embeddings, deep text analysis, and more in the field of Natural Language Processing (NLP). Discover how language models are trained on large document collections, the impact of unlabeled data, and the development of a unified deep learning architecture for NLP tasks like Named Entity Recognition, Part of Speech tagging, and Sentiment Analysis. Dive into deep text analysis techniques such as Word Sense Disambiguation, Information Extraction, and Question Answering, and stay updated on the latest advancements in tackling Alzheimer's disease through innovative question-answering approaches.
Presentation Transcript
L'età della parola (The Age of the Word)
Giuseppe Attardi, Dipartimento di Informatica, Università di Pisa
ESA SoBigData, Pisa, 24 February 2015
Natural Language Learning
Children learn to speak naturally, by talking with others. Teach computers to learn language in a similarly natural way.
Statistical Machine Learning
Training on large document collections requires the ability to process Big Data. If we used the same algorithms as 10 years ago, they would still be running. "The Unreasonable Effectiveness of Big Data."
Example: Machine Translation
Arabic-to-English translation, with five-gram language models of varying size.
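The n-gram language models mentioned above can be sketched as simple count-based models: the probability of a word given its preceding context is estimated from relative frequencies in a corpus. Below is a minimal maximum-likelihood sketch (the toy corpus and a bigram order are illustrative; the slide's systems used five-grams over vastly larger data).

```python
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-token windows of the sequence."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def train_ngram_lm(corpus, n):
    """MLE n-gram model: P(w | context) = count(context + w) / count(context)."""
    grams, contexts = Counter(), Counter()
    for sentence in corpus:
        tokens = sentence.split()
        for g in ngrams(tokens, n):
            grams[g] += 1
            contexts[g[:-1]] += 1

    def prob(context, word):
        c = tuple(context)
        return grams[c + (word,)] / contexts[c] if contexts[c] else 0.0

    return prob

corpus = ["the cat sat on the mat", "the cat ate the fish"]
lm = train_ngram_lm(corpus, 2)
p = lm(["the"], "cat")  # "the" is followed by "cat" in 2 of its 4 occurrences -> 0.5
```

Real MT-scale models add smoothing and backoff for unseen n-grams; this sketch returns 0 for them, which is exactly the sparsity problem that larger corpora mitigate.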
Deep Learning Breakthrough: 2006
Output layer: prediction of the target. Hidden layers: learn more abstract representations. Input layer: raw input.
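The layered structure just described can be sketched as a small feed-forward network: raw input flows through successive hidden layers (each a nonlinear transform producing a more abstract representation) into an output layer that predicts the target. The layer sizes and random weights below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def forward(x, layers):
    """Input layer -> hidden layers -> output layer (softmax over target classes)."""
    h = x
    for W, b in layers[:-1]:
        h = relu(h @ W + b)          # hidden layers: increasingly abstract features
    W, b = layers[-1]
    logits = h @ W + b               # output layer: prediction of the target
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Illustrative sizes: 4 raw inputs -> two hidden layers of 8 units -> 3 classes
sizes = [4, 8, 8, 3]
layers = [(rng.normal(0, 0.1, (a, b)), np.zeros(b)) for a, b in zip(sizes, sizes[1:])]
probs = forward(rng.normal(size=4), layers)  # a probability distribution over 3 classes
```

In a trained network the weights would be fitted by backpropagation; here they are random, so only the shape of the computation is meaningful.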
Lots of Unlabeled Data
Language model corpus: 2 billion words; dictionary: the 130,000 most frequent words. Training took 4 weeks; a parallel + CUDA algorithm reduced it to 2 hours.
Word Embeddings
Words that are neighbors in the embedding space are semantically related.
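The neighborhood property can be illustrated with cosine similarity over a toy embedding table (the vectors below are hand-made for illustration; real embeddings are learned from large corpora, as in the language-model training above).

```python
import numpy as np

# Toy hand-crafted embeddings; real ones come from training, not from a table like this.
emb = {
    "king":  np.array([0.90, 0.80, 0.10]),
    "queen": np.array([0.85, 0.75, 0.20]),
    "apple": np.array([0.10, 0.20, 0.90]),
    "pear":  np.array([0.15, 0.10, 0.85]),
}

def cosine(u, v):
    """Cosine similarity: 1.0 for parallel vectors, 0.0 for orthogonal ones."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def nearest(word):
    """The other word whose embedding is closest under cosine similarity."""
    return max((w for w in emb if w != word), key=lambda w: cosine(emb[word], emb[w]))
```

With these vectors, `nearest("king")` is `"queen"` and `nearest("apple")` is `"pear"`: semantically related words sit near each other, unrelated ones do not.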
A Unified Deep Learning Architecture for NLP
NER (Named Entity Recognition), POS tagging, chunking, parsing, SRL (Semantic Role Labeling), sentiment analysis.
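The unifying idea is that all of these tasks can share one learned word representation, with only a thin task-specific layer on top. A minimal sketch, assuming random untrained parameters and invented label-set sizes (the vocabulary, dimensions, and tag counts below are not from the source):

```python
import numpy as np

rng = np.random.default_rng(1)
vocab = {"the": 0, "cat": 1, "sat": 2}
dim = 5

# One embedding table shared by every task: the core of the unified architecture.
embeddings = rng.normal(0, 0.1, (len(vocab), dim))

# Task-specific linear output layers on top of the shared representation.
heads = {
    "pos": rng.normal(0, 0.1, (dim, 4)),   # e.g. 4 coarse POS tags (illustrative)
    "ner": rng.normal(0, 0.1, (dim, 3)),   # e.g. 3 entity classes (illustrative)
}

def predict(word, task):
    """Shared embedding lookup, then the task's own classifier."""
    scores = embeddings[vocab[word]] @ heads[task]
    return int(scores.argmax())
```

Training the shared table on all tasks jointly (plus unlabeled data) is what lets improvements in one task transfer to the others.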
Deep Text Analysis
Parsing, Word Sense Disambiguation, Anaphora Resolution, Information Extraction, Sentiment Analysis, Textual Entailment, Question Answering, Biomedical Text Analysis.
QA on Alzheimer's Disease
Example dependency parse (ROOT, SUBJ, OBJ, APPO) of the sentence "the γ-secretase inhibitor Semagacestat failed to slow cognitive decline", with entity annotations: disorder, protein, drug, substance (SNOMED: C0236848). QA on Alzheimer's competition.
Big Data, Big Brain
Google DistBelief: a cluster capable of simulating 100 billion connections. Used to learn unsupervised image classification and to produce a tiny ASR (automatic speech recognition) model. The same basic capability applies to processing images, audio, and language. See also the European FET Brain project.