Foundation Models for Natural Language Processing : Pre-trained Language Models Integrating Media / by Gerhard Paass, Sven Giesselbach.

Bibliographic Details
Superior document: Artificial Intelligence: Foundations, Theory, and Algorithms
Author: Paass, Gerhard
Contributor: Giesselbach, Sven
Place / Publishing House: Cham : Springer International Publishing, Imprint: Springer, 2023.
Year of Publication:2023
Edition:1st ed. 2023.
Language:English
Series:Artificial intelligence (Berlin, Germany)
Physical Description:1 online resource (xviii, 448 pages)
id 993615379504498
ctrlnum (CKB)5580000000544268
(MiAaPQ)EBC30550689
(Au-PeEL)EBL30550689
(DE-He213)978-3-031-23190-2
(PPN)270615679
(OCoLC)1380847755
(EXLCZ)995580000000544268
collection bib_alma
record_format marc
spelling Paass, Gerhard, author.
Foundation Models for Natural Language Processing [electronic resource] : Pre-trained Language Models Integrating Media / by Gerhard Paass, Sven Giesselbach.
1st ed. 2023.
Cham : Springer International Publishing : Imprint: Springer, 2023.
1 online resource (xviii, 448 pages)
text txt rdacontent
computer c rdamedia
online resource cr rdacarrier
Artificial Intelligence: Foundations, Theory, and Algorithms, 2365-306X
Includes bibliographical references and index.
1. Introduction -- 2. Pre-trained Language Models -- 3. Improving Pre-trained Language Models -- 4. Knowledge Acquired by Foundation Models -- 5. Foundation Models for Information Extraction -- 6. Foundation Models for Text Generation -- 7. Foundation Models for Speech, Images, Videos, and Control -- 8. Summary and Outlook.
This open access book provides a comprehensive overview of the state of the art in research and applications of Foundation Models and is intended for readers familiar with basic Natural Language Processing (NLP) concepts. In recent years, a revolutionary new paradigm has been developed for training models for NLP. These models are first pre-trained on large collections of text documents to acquire general syntactic knowledge and semantic information. Then, they are fine-tuned for specific tasks, which they can often solve with superhuman accuracy. When the models are large enough, they can be instructed by prompts to solve new tasks without any fine-tuning. Moreover, they can be applied to a wide range of different media and problem domains, ranging from image and video processing to robot control learning. Because they provide a blueprint for solving many tasks in artificial intelligence, they have been called Foundation Models. After a brief introduction to basic NLP models, the main pre-trained language models BERT, GPT, and the sequence-to-sequence Transformer are described, as well as the concepts of self-attention and context-sensitive embeddings. Then, different approaches to improving these models are discussed, such as expanding the pre-training criteria, increasing the length of input texts, or including extra knowledge. An overview of the best-performing models for about twenty application areas is then presented, e.g., question answering, translation, story generation, dialog systems, and generating images from text. For each application area, the strengths and weaknesses of current models are discussed, and an outlook on further developments is given. In addition, links are provided to freely available program code. A concluding chapter summarizes the economic opportunities, risk mitigation, and potential developments of AI.
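The abstract names self-attention and context-sensitive embeddings as core concepts of the models the book covers. A minimal sketch of scaled dot-product self-attention follows (an illustration under assumed toy dimensions, not code from the book; the variable names and matrix sizes are hypothetical):

import numpy as np

def self_attention(X, W_q, W_k, W_v):
    # Project each token vector in X to query, key, and value vectors.
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    # Pairwise query-key similarities, scaled by sqrt of the key dimension.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Each output row mixes all value vectors, weighted by attention.
    return weights @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))                    # 5 tokens, 8-dim embeddings
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, W_q, W_k, W_v).shape)  # (5, 8)

Because every output row is a weighted mixture over the whole sequence, each token's resulting vector depends on its surroundings, which is what makes the embeddings context-sensitive.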
Open Access
Natural language processing (Computer science).
Computational linguistics.
Artificial intelligence.
Expert systems (Computer science).
Machine learning.
Natural Language Processing (NLP).
Computational Linguistics.
Artificial Intelligence.
Knowledge Based Systems.
Machine Learning.
3-031-23189-9
Giesselbach, Sven, author.
Artificial intelligence (Berlin, Germany)
language English
format Electronic
eBook
author Paass, Gerhard,
Giesselbach, Sven,
spellingShingle Paass, Gerhard,
Giesselbach, Sven,
Foundation Models for Natural Language Processing Pre-trained Language Models Integrating Media /
Artificial Intelligence: Foundations, Theory, and Algorithms,
1. Introduction -- 2. Pre-trained Language Models -- 3. Improving Pre-trained Language Models -- 4. Knowledge Acquired by Foundation Models -- 5. Foundation Models for Information Extraction -- 6. Foundation Models for Text Generation -- 7. Foundation Models for Speech, Images, Videos, and Control -- 8. Summary and Outlook.
author_facet Paass, Gerhard,
Giesselbach, Sven,
Giesselbach, Sven,
author_variant g p gp
s g sg
author_role VerfasserIn
VerfasserIn
author2 Giesselbach, Sven,
author2_role TeilnehmendeR
author_sort Paass, Gerhard,
title Foundation Models for Natural Language Processing Pre-trained Language Models Integrating Media /
title_sub Pre-trained Language Models Integrating Media /
title_full Foundation Models for Natural Language Processing [electronic resource] : Pre-trained Language Models Integrating Media / by Gerhard Paass, Sven Giesselbach.
title_fullStr Foundation Models for Natural Language Processing [electronic resource] : Pre-trained Language Models Integrating Media / by Gerhard Paass, Sven Giesselbach.
title_full_unstemmed Foundation Models for Natural Language Processing [electronic resource] : Pre-trained Language Models Integrating Media / by Gerhard Paass, Sven Giesselbach.
title_auth Foundation Models for Natural Language Processing Pre-trained Language Models Integrating Media /
title_new Foundation Models for Natural Language Processing
title_sort foundation models for natural language processing pre-trained language models integrating media /
series Artificial Intelligence: Foundations, Theory, and Algorithms,
series2 Artificial Intelligence: Foundations, Theory, and Algorithms,
publisher Springer International Publishing : Imprint: Springer,
publishDate 2023
physical 1 online resource (xviii, 448 pages)
edition 1st ed. 2023.
contents 1. Introduction -- 2. Pre-trained Language Models -- 3. Improving Pre-trained Language Models -- 4. Knowledge Acquired by Foundation Models -- 5. Foundation Models for Information Extraction -- 6. Foundation Models for Text Generation -- 7. Foundation Models for Speech, Images, Videos, and Control -- 8. Summary and Outlook.
isbn 978-3-031-23190-2
3-031-23189-9
issn 2365-306X
callnumber-first Q - Science
callnumber-subject QA - Mathematics
callnumber-label QA76
callnumber-sort QA 276.9 N38
illustrated Not Illustrated
dewey-hundreds 000 - Computer science, information & general works
dewey-tens 000 - Computer science, knowledge & systems
dewey-ones 006 - Special computer methods
dewey-full 006.35
dewey-sort 16.35
dewey-raw 006.35
dewey-search 006.35
oclc_num 1380847755
work_keys_str_mv AT paassgerhard foundationmodelsfornaturallanguageprocessingpretrainedlanguagemodelsintegratingmedia
AT giesselbachsven foundationmodelsfornaturallanguageprocessingpretrainedlanguagemodelsintegratingmedia
status_str n
ids_txt_mv (CKB)5580000000544268
(MiAaPQ)EBC30550689
(Au-PeEL)EBL30550689
(DE-He213)978-3-031-23190-2
(PPN)270615679
(OCoLC)1380847755
(EXLCZ)995580000000544268
carrierType_str_mv cr
hierarchy_parent_title Artificial Intelligence: Foundations, Theory, and Algorithms,
is_hierarchy_title Foundation Models for Natural Language Processing Pre-trained Language Models Integrating Media /
container_title Artificial Intelligence: Foundations, Theory, and Algorithms,
author2_original_writing_str_mv noLinkedField
_version_ 1796653315845521408