Representation Learning for Natural Language Processing.

Bibliographic Details
Main Author: Liu, Zhiyuan
Contributors: Lin, Yankai; Sun, Maosong
Place / Publishing House: Singapore : Springer Singapore Pte. Limited, 2020.
©2020.
Year of Publication: 2020
Edition: 1st ed.
Language: English
Online Access: https://ebookcentral.proquest.com/lib/oeawat/detail.action?docID=30592734
Physical Description: 1 online resource (319 pages)
id 50030592734
ctrlnum (MiAaPQ)50030592734
(Au-PeEL)EBL30592734
(OCoLC)1176494182
collection bib_alma
record_format marc
spelling Liu, Zhiyuan.
Representation Learning for Natural Language Processing.
1st ed.
Singapore : Springer Singapore Pte. Limited, 2020.
©2020.
1 online resource (319 pages)
text txt rdacontent
computer c rdamedia
online resource cr rdacarrier
Description based on publisher supplied metadata and other sources.
Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2024. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.
Electronic books.
Lin, Yankai.
Sun, Maosong.
Print version: Liu, Zhiyuan. Representation Learning for Natural Language Processing. Singapore : Springer Singapore Pte. Limited, c2020. 9789811555725
ProQuest (Firm)
https://ebookcentral.proquest.com/lib/oeawat/detail.action?docID=30592734
language English
format eBook
author Liu, Zhiyuan.
spellingShingle Liu, Zhiyuan.
Representation Learning for Natural Language Processing.
author_facet Liu, Zhiyuan.
Lin, Yankai.
Sun, Maosong.
author_variant z l zl
author2 Lin, Yankai.
Sun, Maosong.
author2_variant y l yl
m s ms
author2_role Contributor
Contributor
author_sort Liu, Zhiyuan.
title Representation Learning for Natural Language Processing.
title_full Representation Learning for Natural Language Processing.
title_fullStr Representation Learning for Natural Language Processing.
title_full_unstemmed Representation Learning for Natural Language Processing.
title_auth Representation Learning for Natural Language Processing.
title_new Representation Learning for Natural Language Processing.
title_sort representation learning for natural language processing.
publisher Springer Singapore Pte. Limited,
publishDate 2020
physical 1 online resource (319 pages)
edition 1st ed.
contents Intro -- Preface -- Acknowledgements -- Contents -- Acronyms -- Symbols and Notations -- 1 Representation Learning and NLP -- 1.1 Motivation -- 1.2 Why Representation Learning Is Important for NLP -- 1.3 Basic Ideas of Representation Learning -- 1.4 Development of Representation Learning for NLP -- 1.5 Learning Approaches to Representation Learning for NLP -- 1.6 Applications of Representation Learning for NLP -- 1.7 The Organization of This Book -- References -- 2 Word Representation -- 2.1 Introduction -- 2.2 One-Hot Word Representation -- 2.3 Distributed Word Representation -- 2.3.1 Brown Cluster -- 2.3.2 Latent Semantic Analysis -- 2.3.3 Word2vec -- 2.3.4 GloVe -- 2.4 Contextualized Word Representation -- 2.5 Extensions -- 2.5.1 Word Representation Theories -- 2.5.2 Multi-prototype Word Representation -- 2.5.3 Multisource Word Representation -- 2.5.4 Multilingual Word Representation -- 2.5.5 Task-Specific Word Representation -- 2.5.6 Time-Specific Word Representation -- 2.6 Evaluation -- 2.6.1 Word Similarity/Relatedness -- 2.6.2 Word Analogy -- 2.7 Summary -- References -- 3 Compositional Semantics -- 3.1 Introduction -- 3.2 Semantic Space -- 3.2.1 Vector Space -- 3.2.2 Matrix-Vector Space -- 3.3 Binary Composition -- 3.3.1 Additive Model -- 3.3.2 Multiplicative Model -- 3.4 N-Ary Composition -- 3.4.1 Recurrent Neural Network -- 3.4.2 Recursive Neural Network -- 3.4.3 Convolutional Neural Network -- 3.5 Summary -- References -- 4 Sentence Representation -- 4.1 Introduction -- 4.2 One-Hot Sentence Representation -- 4.3 Probabilistic Language Model -- 4.4 Neural Language Model -- 4.4.1 Feedforward Neural Network Language Model -- 4.4.2 Convolutional Neural Network Language Model -- 4.4.3 Recurrent Neural Network Language Model -- 4.4.4 Transformer Language Model -- 4.4.5 Extensions -- 4.5 Applications -- 4.5.1 Text Classification.
4.5.2 Relation Extraction -- 4.6 Summary -- References -- 5 RETRACTED CHAPTER: Document Representation -- 6 Sememe Knowledge Representation -- 6.1 Introduction -- 6.1.1 Linguistic Knowledge Graphs -- 6.2 Sememe Knowledge Representation -- 6.2.1 Simple Sememe Aggregation Model -- 6.2.2 Sememe Attention over Context Model -- 6.2.3 Sememe Attention over Target Model -- 6.3 Applications -- 6.3.1 Sememe-Guided Word Representation -- 6.3.2 Sememe-Guided Semantic Compositionality Modeling -- 6.3.3 Sememe-Guided Language Modeling -- 6.3.4 Sememe Prediction -- 6.3.5 Other Sememe-Guided Applications -- 6.4 Summary -- References -- 7 World Knowledge Representation -- 7.1 Introduction -- 7.1.1 World Knowledge Graphs -- 7.2 Knowledge Graph Representation -- 7.2.1 Notations -- 7.2.2 TransE -- 7.2.3 Extensions of TransE -- 7.2.4 Other Models -- 7.3 Multisource Knowledge Graph Representation -- 7.3.1 Knowledge Graph Representation with Texts -- 7.3.2 Knowledge Graph Representation with Types -- 7.3.3 Knowledge Graph Representation with Images -- 7.3.4 Knowledge Graph Representation with Logic Rules -- 7.4 Applications -- 7.4.1 Knowledge Graph Completion -- 7.4.2 Knowledge-Guided Entity Typing -- 7.4.3 Knowledge-Guided Information Retrieval -- 7.4.4 Knowledge-Guided Language Models -- 7.4.5 Other Knowledge-Guided Applications -- 7.5 Summary -- References -- 8 Network Representation -- 8.1 Introduction -- 8.2 Network Representation -- 8.2.1 Spectral Clustering Based Methods -- 8.2.2 DeepWalk -- 8.2.3 Matrix Factorization Based Methods -- 8.2.4 Structural Deep Network Methods -- 8.2.5 Extensions -- 8.2.6 Applications -- 8.3 Graph Neural Networks -- 8.3.1 Motivations -- 8.3.2 Graph Convolutional Networks -- 8.3.3 Graph Attention Networks -- 8.3.4 Graph Recurrent Networks -- 8.3.5 Extensions -- 8.3.6 Applications -- 8.4 Summary -- References.
9 Cross-Modal Representation -- 9.1 Introduction -- 9.2 Cross-Modal Representation -- 9.2.1 Visual Word2vec -- 9.2.2 Cross-Modal Representation for Zero-Shot Recognition -- 9.2.3 Cross-Modal Representation for Cross-Media Retrieval -- 9.3 Image Captioning -- 9.3.1 Retrieval Models for Image Captioning -- 9.3.2 Generation Models for Image Captioning -- 9.3.3 Neural Models for Image Captioning -- 9.4 Visual Relationship Detection -- 9.4.1 Visual Relationship Detection with Language Priors -- 9.4.2 Visual Translation Embedding Network -- 9.4.3 Scene Graph Generation -- 9.5 Visual Question Answering -- 9.5.1 VQA and VQA Datasets -- 9.5.2 VQA Models -- 9.6 Summary -- References -- 10 Resources -- 10.1 Open-Source Frameworks for Deep Learning -- 10.1.1 Caffe -- 10.1.2 Theano -- 10.1.3 TensorFlow -- 10.1.4 Torch -- 10.1.5 PyTorch -- 10.1.6 Keras -- 10.1.7 MXNet -- 10.2 Open Resources for Word Representation -- 10.2.1 Word2Vec -- 10.2.2 GloVe -- 10.3 Open Resources for Knowledge Graph Representation -- 10.3.1 OpenKE -- 10.3.2 Scikit-Kge -- 10.4 Open Resources for Network Representation -- 10.4.1 OpenNE -- 10.4.2 GEM -- 10.4.3 GraphVite -- 10.4.4 CogDL -- 10.5 Open Resources for Relation Extraction -- 10.5.1 OpenNRE -- References -- 11 Outlook -- 11.1 Introduction -- 11.2 Using More Unsupervised Data -- 11.3 Utilizing Fewer Labeled Data -- 11.4 Employing Deeper Neural Architectures -- 11.5 Improving Model Interpretability -- 11.6 Fusing the Advances from Other Areas -- References -- Correction to: Z. Liu et al., Representation Learning for Natural Language Processing, https://doi.org/10.1007/978-981-15-5573-2.
isbn 9789811555732
9789811555725
callnumber-first Q - Science
callnumber-subject QA - Mathematics
callnumber-label QA76
callnumber-sort QA 276.9 N38
genre Electronic books.
genre_facet Electronic books.
url https://ebookcentral.proquest.com/lib/oeawat/detail.action?docID=30592734
illustrated Not Illustrated
oclc_num 1176494182
work_keys_str_mv AT liuzhiyuan representationlearningfornaturallanguageprocessing
AT linyankai representationlearningfornaturallanguageprocessing
AT sunmaosong representationlearningfornaturallanguageprocessing
status_str n
ids_txt_mv (MiAaPQ)50030592734
(Au-PeEL)EBL30592734
(OCoLC)1176494182
carrierType_str_mv cr
is_hierarchy_title Representation Learning for Natural Language Processing.
author2_original_writing_str_mv noLinkedField
noLinkedField
marc_error Info : MARC8 translation shorter than ISO-8859-1, choosing MARC8. --- [ 856 : z ]
_version_ 1792331070746656768
fullrecord <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>06942nam a22004213i 4500</leader><controlfield tag="001">50030592734</controlfield><controlfield tag="003">MiAaPQ</controlfield><controlfield tag="005">20240229073850.0</controlfield><controlfield tag="006">m o d | </controlfield><controlfield tag="007">cr cnu||||||||</controlfield><controlfield tag="008">240229s2020 xx o ||||0 eng d</controlfield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9789811555732</subfield><subfield code="q">(electronic bk.)</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="z">9789811555725</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(MiAaPQ)50030592734</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(Au-PeEL)EBL30592734</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(OCoLC)1176494182</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">MiAaPQ</subfield><subfield code="b">eng</subfield><subfield code="e">rda</subfield><subfield code="e">pn</subfield><subfield code="c">MiAaPQ</subfield><subfield code="d">MiAaPQ</subfield></datafield><datafield tag="050" ind1=" " ind2="4"><subfield code="a">QA76.9.N38</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Liu, Zhiyuan.</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Representation Learning for Natural Language Processing.</subfield></datafield><datafield tag="250" ind1=" " ind2=" "><subfield code="a">1st ed.</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">Singapore :</subfield><subfield code="b">Springer Singapore Pte. 
Limited,</subfield><subfield code="c">2020.</subfield></datafield><datafield tag="264" ind1=" " ind2="4"><subfield code="c">©2020.</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">1 online resource (319 pages)</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">computer</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">online resource</subfield><subfield code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="505" ind1="0" ind2=" "><subfield code="a">Intro -- Preface -- Acknowledgements -- Contents -- Acronyms -- Symbols and Notations -- 1 Representation Learning and NLP -- 1.1 Motivation -- 1.2 Why Representation Learning Is Important for NLP -- 1.3 Basic Ideas of Representation Learning -- 1.4 Development of Representation Learning for NLP -- 1.5 Learning Approaches to Representation Learning for NLP -- 1.6 Applications of Representation Learning for NLP -- 1.7 The Organization of This Book -- References -- 2 Word Representation -- 2.1 Introduction -- 2.2 One-Hot Word Representation -- 2.3 Distributed Word Representation -- 2.3.1 Brown Cluster -- 2.3.2 Latent Semantic Analysis -- 2.3.3 Word2vec -- 2.3.4 GloVe -- 2.4 Contextualized Word Representation -- 2.5 Extensions -- 2.5.1 Word Representation Theories -- 2.5.2 Multi-prototype Word Representation -- 2.5.3 Multisource Word Representation -- 2.5.4 Multilingual Word Representation -- 2.5.5 Task-Specific Word Representation -- 2.5.6 Time-Specific Word Representation -- 2.6 Evaluation -- 2.6.1 Word Similarity/Relatedness -- 2.6.2 Word Analogy -- 2.7 Summary -- References -- 3 Compositional Semantics -- 3.1 Introduction -- 3.2 Semantic Space -- 3.2.1 Vector Space -- 3.2.2 Matrix-Vector Space -- 3.3 Binary Composition -- 3.3.1 Additive Model -- 3.3.2 Multiplicative Model -- 3.4 N-Ary Composition -- 3.4.1 Recurrent Neural Network -- 3.4.2 Recursive Neural Network -- 3.4.3 Convolutional Neural Network -- 3.5 Summary -- References -- 4 Sentence Representation -- 4.1 Introduction -- 4.2 One-Hot Sentence Representation -- 4.3 Probabilistic Language Model -- 4.4 Neural Language Model -- 4.4.1 Feedforward Neural Network Language Model -- 4.4.2 Convolutional Neural Network Language Model -- 4.4.3 Recurrent Neural Network Language Model -- 4.4.4 Transformer Language Model -- 4.4.5 Extensions -- 4.5 Applications -- 4.5.1 Text Classification.</subfield></datafield><datafield tag="505" ind1="8" ind2=" "><subfield code="a">4.5.2 Relation Extraction -- 4.6 Summary -- References -- 5 RETRACTED CHAPTER: Document Representation -- 6 Sememe Knowledge Representation -- 6.1 Introduction -- 6.1.1 Linguistic Knowledge Graphs -- 6.2 Sememe Knowledge Representation -- 6.2.1 Simple Sememe Aggregation Model -- 6.2.2 Sememe Attention over Context Model -- 6.2.3 Sememe Attention over Target Model -- 6.3 Applications -- 6.3.1 Sememe-Guided Word Representation -- 6.3.2 Sememe-Guided Semantic Compositionality Modeling -- 6.3.3 Sememe-Guided Language Modeling -- 6.3.4 Sememe Prediction -- 6.3.5 Other Sememe-Guided Applications -- 6.4 Summary -- References -- 7 World Knowledge Representation -- 7.1 Introduction -- 7.1.1 World Knowledge Graphs -- 7.2 Knowledge Graph Representation -- 7.2.1 Notations -- 7.2.2 TransE -- 7.2.3 
Extensions of TransE -- 7.2.4 Other Models -- 7.3 Multisource Knowledge Graph Representation -- 7.3.1 Knowledge Graph Representation with Texts -- 7.3.2 Knowledge Graph Representation with Types -- 7.3.3 Knowledge Graph Representation with Images -- 7.3.4 Knowledge Graph Representation with Logic Rules -- 7.4 Applications -- 7.4.1 Knowledge Graph Completion -- 7.4.2 Knowledge-Guided Entity Typing -- 7.4.3 Knowledge-Guided Information Retrieval -- 7.4.4 Knowledge-Guided Language Models -- 7.4.5 Other Knowledge-Guided Applications -- 7.5 Summary -- References -- 8 Network Representation -- 8.1 Introduction -- 8.2 Network Representation -- 8.2.1 Spectral Clustering Based Methods -- 8.2.2 DeepWalk -- 8.2.3 Matrix Factorization Based Methods -- 8.2.4 Structural Deep Network Methods -- 8.2.5 Extensions -- 8.2.6 Applications -- 8.3 Graph Neural Networks -- 8.3.1 Motivations -- 8.3.2 Graph Convolutional Networks -- 8.3.3 Graph Attention Networks -- 8.3.4 Graph Recurrent Networks -- 8.3.5 Extensions -- 8.3.6 Applications -- 8.4 Summary -- References.</subfield></datafield><datafield tag="505" ind1="8" ind2=" "><subfield code="a">9 Cross-Modal Representation -- 9.1 Introduction -- 9.2 Cross-Modal Representation -- 9.2.1 Visual Word2vec -- 9.2.2 Cross-Modal Representation for Zero-Shot Recognition -- 9.2.3 Cross-Modal Representation for Cross-Media Retrieval -- 9.3 Image Captioning -- 9.3.1 Retrieval Models for Image Captioning -- 9.3.2 Generation Models for Image Captioning -- 9.3.3 Neural Models for Image Captioning -- 9.4 Visual Relationship Detection -- 9.4.1 Visual Relationship Detection with Language Priors -- 9.4.2 Visual Translation Embedding Network -- 9.4.3 Scene Graph Generation -- 9.5 Visual Question Answering -- 9.5.1 VQA and VQA Datasets -- 9.5.2 VQA Models -- 9.6 Summary -- References -- 10 Resources -- 10.1 Open-Source Frameworks for Deep Learning -- 10.1.1 Caffe -- 10.1.2 Theano -- 10.1.3 TensorFlow -- 10.1.4 Torch -- 10.1.5 PyTorch -- 10.1.6 Keras -- 10.1.7 MXNet -- 10.2 Open Resources for Word Representation -- 10.2.1 Word2Vec -- 10.2.2 GloVe -- 10.3 Open Resources for Knowledge Graph Representation -- 10.3.1 OpenKE -- 10.3.2 Scikit-Kge -- 10.4 Open Resources for Network Representation -- 10.4.1 OpenNE -- 10.4.2 GEM -- 10.4.3 GraphVite -- 10.4.4 CogDL -- 10.5 Open Resources for Relation Extraction -- 10.5.1 OpenNRE -- References -- 11 Outlook -- 11.1 Introduction -- 11.2 Using More Unsupervised Data -- 11.3 Utilizing Fewer Labeled Data -- 11.4 Employing Deeper Neural Architectures -- 11.5 Improving Model Interpretability -- 11.6 Fusing the Advances from Other Areas -- References -- Correction to: Z. Liu et al., Representation Learning for Natural Language Processing, https://doi.org/10.1007/978-981-15-5573-2.</subfield></datafield><datafield tag="588" ind1=" " ind2=" "><subfield code="a">Description based on publisher supplied metadata and other sources.</subfield></datafield><datafield tag="590" ind1=" " ind2=" "><subfield code="a">Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2024. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries. 
</subfield></datafield><datafield tag="655" ind1=" " ind2="4"><subfield code="a">Electronic books.</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Lin, Yankai.</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Sun, Maosong.</subfield></datafield><datafield tag="776" ind1="0" ind2="8"><subfield code="i">Print version:</subfield><subfield code="a">Liu, Zhiyuan</subfield><subfield code="t">Representation Learning for Natural Language Processing</subfield><subfield code="d">Singapore : Springer Singapore Pte. Limited,c2020</subfield><subfield code="z">9789811555725</subfield></datafield><datafield tag="797" ind1="2" ind2=" "><subfield code="a">ProQuest (Firm)</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://ebookcentral.proquest.com/lib/oeawat/detail.action?docID=30592734</subfield><subfield code="z">Click to View</subfield></datafield></record></collection>