Modern Optimization Methods / / Qingna LI.

With the rapid development of big data and artificial intelligence, a natural question is how to analyze data more efficiently. One efficient approach is optimization. What is optimization? Optimization exists everywhere. People optimize. As long as you have choices, you are doing optimization....


Bibliographic Details
Author: LI, Qingna
Place / Publishing House: Les Ulis : EDP Sciences, [2023]
2023
Year of Publication:2023
Language:English
Online Access:
Physical Description:1 online resource (157 p.)
id 9782759831753
ctrlnum (DE-B1597)677655
collection bib_alma
record_format marc
spelling LI, Qingna, author. aut http://id.loc.gov/vocabulary/relators/aut
Modern Optimization Methods / Qingna LI.
Les Ulis : EDP Sciences, [2023]
2023
1 online resource (157 p.)
text txt rdacontent
computer c rdamedia
online resource cr rdacarrier
text file PDF rda
Frontmatter -- Preface -- Contents -- Chapter 1. Introduction -- Chapter 2. Fundamentals of Optimization -- Chapter 3. Line Search Methods -- Chapter 4. Trust Region Methods -- Chapter 5. Conjugate Gradient Methods -- Chapter 6. Semismooth Newton's Method -- Chapter 7. Theory of Constrained Optimization -- Chapter 8. Penalty and Augmented Lagrangian Methods -- Chapter 9. Bilevel Optimization and Its Applications -- Bibliography
restricted access http://purl.org/coar/access_right/c_16ec online access with authorization star
With the rapid development of big data and artificial intelligence, a natural question is how to analyze data more efficiently. One efficient approach is optimization. What is optimization? Optimization exists everywhere. People optimize. As long as you have choices, you are doing optimization. Optimization is the key to operations research. This book introduces the basic definitions and theory of numerical optimization, including optimality conditions for unconstrained and constrained optimization, as well as algorithms for unconstrained and constrained problems. It also covers the semismooth Newton's method, which plays an important role in large-scale numerical optimization. Finally, drawing on the author's research experience, several recent applications of optimization are introduced, including optimization algorithms for hypergraph matching, support vector machines, and a bilevel optimization approach to hyperparameter selection in machine learning. With these optimization tools, one can handle data more efficiently.
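Among the topics the contents list covers are line search methods (Chapter 3). As an illustration only, not code taken from the book, a minimal steepest-descent loop with an Armijo backtracking line search, applied to a small convex quadratic, might look like this (all function and parameter names here are this sketch's own choices):

```python
import numpy as np

def backtracking_line_search(f, grad_f, x, p, alpha0=1.0, rho=0.5, c=1e-4):
    """Shrink the step length until the Armijo sufficient-decrease
    condition f(x + a*p) <= f(x) + c*a*grad(x)'p holds."""
    alpha = alpha0
    fx, gx = f(x), grad_f(x)
    while f(x + alpha * p) > fx + c * alpha * gx.dot(p):
        alpha *= rho
    return alpha

def gradient_descent(f, grad_f, x0, tol=1e-8, max_iter=500):
    """Steepest descent with Armijo backtracking line search."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:   # first-order optimality reached
            break
        p = -g                        # steepest-descent direction
        alpha = backtracking_line_search(f, grad_f, x, p)
        x = x + alpha * p
    return x

# Minimize f(x) = 0.5 x'Ax - b'x; the minimizer solves Ax = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad_f = lambda x: A.dot(x) - b
x_star = gradient_descent(f, grad_f, np.zeros(2))
```

For this well-conditioned quadratic the iterates converge quickly to the solution of Ax = b; the same loop structure underlies the more sophisticated descent directions (conjugate gradient, Newton-type) treated in the later chapters.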
Mode of access: Internet via World Wide Web.
In English.
Description based on online resource; title from PDF title page (publisher's Web site, viewed 09 Dec 2023)
MATHEMATICS / Probability & Statistics / Regression Analysis. bisacsh
https://doi.org/10.1051/978-2-7598-3175-3
https://www.degruyter.com/isbn/9782759831753
Cover https://www.degruyter.com/document/cover/isbn/9782759831753/original
language English
format eBook
author LI, Qingna,
LI, Qingna,
spellingShingle LI, Qingna,
LI, Qingna,
Modern Optimization Methods /
Frontmatter --
Preface --
Contents --
Chapter 1. Introduction --
Chapter 2. Fundamentals of Optimization --
Chapter 3. Line Search Methods --
Chapter 4. Trust Region Methods --
Chapter 5. Conjugate Gradient Methods --
Chapter 6. Semismooth Newton's Method --
Chapter 7. Theory of Constrained Optimization --
Chapter 8. Penalty and Augmented Lagrangian Methods --
Chapter 9. Bilevel Optimization and Its Applications --
Bibliography
author_facet LI, Qingna,
LI, Qingna,
author_variant q l ql
q l ql
author_role VerfasserIn
VerfasserIn
author_sort LI, Qingna,
title Modern Optimization Methods /
title_full Modern Optimization Methods / Qingna LI.
title_fullStr Modern Optimization Methods / Qingna LI.
title_full_unstemmed Modern Optimization Methods / Qingna LI.
title_auth Modern Optimization Methods /
title_alt Frontmatter --
Preface --
Contents --
Chapter 1. Introduction --
Chapter 2. Fundamentals of Optimization --
Chapter 3. Line Search Methods --
Chapter 4. Trust Region Methods --
Chapter 5. Conjugate Gradient Methods --
Chapter 6. Semismooth Newton's Method --
Chapter 7. Theory of Constrained Optimization --
Chapter 8. Penalty and Augmented Lagrangian Methods --
Chapter 9. Bilevel Optimization and Its Applications --
Bibliography
title_new Modern Optimization Methods /
title_sort modern optimization methods /
publisher EDP Sciences,
publishDate 2023
physical 1 online resource (157 p.)
contents Frontmatter --
Preface --
Contents --
Chapter 1. Introduction --
Chapter 2. Fundamentals of Optimization --
Chapter 3. Line Search Methods --
Chapter 4. Trust Region Methods --
Chapter 5. Conjugate Gradient Methods --
Chapter 6. Semismooth Newton's Method --
Chapter 7. Theory of Constrained Optimization --
Chapter 8. Penalty and Augmented Lagrangian Methods --
Chapter 9. Bilevel Optimization and Its Applications --
Bibliography
isbn 9782759831753
url https://doi.org/10.1051/978-2-7598-3175-3
https://www.degruyter.com/isbn/9782759831753
https://www.degruyter.com/document/cover/isbn/9782759831753/original
illustrated Not Illustrated
doi_str_mv 10.1051/978-2-7598-3175-3
work_keys_str_mv AT liqingna modernoptimizationmethods
status_str n
ids_txt_mv (DE-B1597)677655
carrierType_str_mv cr
is_hierarchy_title Modern Optimization Methods /
_version_ 1789654385229299712
fullrecord <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>03373nam a22005535i 4500</leader><controlfield tag="001">9782759831753</controlfield><controlfield tag="003">DE-B1597</controlfield><controlfield tag="005">20231209095929.0</controlfield><controlfield tag="006">m|||||o||d||||||||</controlfield><controlfield tag="007">cr || ||||||||</controlfield><controlfield tag="008">231209t20232023fr fo d z eng d</controlfield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9782759831753</subfield></datafield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.1051/978-2-7598-3175-3</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-B1597)677655</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-B1597</subfield><subfield code="b">eng</subfield><subfield code="c">DE-B1597</subfield><subfield code="e">rda</subfield></datafield><datafield tag="041" ind1="0" ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="044" ind1=" " ind2=" "><subfield code="a">fr</subfield><subfield code="c">FR</subfield></datafield><datafield tag="072" ind1=" " ind2="7"><subfield code="a">MAT029030</subfield><subfield code="2">bisacsh</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">LI, Qingna, </subfield><subfield code="e">author.</subfield><subfield code="4">aut</subfield><subfield code="4">http://id.loc.gov/vocabulary/relators/aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Modern Optimization Methods /</subfield><subfield code="c">Qingna LI.</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">Les Ulis : </subfield><subfield code="b">EDP Sciences, </subfield><subfield code="c">[2023]</subfield></datafield><datafield tag="264" ind1=" " ind2="4"><subfield code="c">2023</subfield></datafield><datafield tag="300" ind1=" 
" ind2=" "><subfield code="a">1 online resource (157 p.)</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">computer</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">online resource</subfield><subfield code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="347" ind1=" " ind2=" "><subfield code="a">text file</subfield><subfield code="b">PDF</subfield><subfield code="2">rda</subfield></datafield><datafield tag="505" ind1="0" ind2="0"><subfield code="t">Frontmatter -- </subfield><subfield code="t">Preface -- </subfield><subfield code="t">Contents -- </subfield><subfield code="t">Chapter 1. Introduction -- </subfield><subfield code="t">Chapter 2. Fundamentals of Optimization -- </subfield><subfield code="t">Chapter 3. Line Search Methods -- </subfield><subfield code="t">Chapter 4. Trust Region Methods -- </subfield><subfield code="t">Chapter 5. Conjugate Gradient Methods -- </subfield><subfield code="t">Chapter 6. Semismooth Newton's Method -- </subfield><subfield code="t">Chapter 7. Theory of Constrained Optimization -- </subfield><subfield code="t">Chapter 8. Penalty and Augmented Lagrangian Methods -- </subfield><subfield code="t">Chapter 9. 
Bilevel Optimization and Its Applications -- </subfield><subfield code="t">Bibliography</subfield></datafield><datafield tag="506" ind1="0" ind2=" "><subfield code="a">restricted access</subfield><subfield code="u">http://purl.org/coar/access_right/c_16ec</subfield><subfield code="f">online access with authorization</subfield><subfield code="2">star</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">With the fast development of big data and artificial intelligence, a natural question is how do we analyze data more efficiently? One of the efficient ways is to use optimization. What is optimization? Optimization exists everywhere. People optimize. As long as you have choices, you do optimization. Optimization is the key of operations research. This book introduces the basic definitions and theory about numerical optimization, including optimality conditions for unconstrained and constrained optimization, as well as algorithms for unconstrained and constrained problems. Moreover, it also includes the nonsmooth Newton's method, which plays an important role in large-scale numerical optimization. Finally, based on the author's research experiences, several latest applications about optimization are introduced, including optimization algorithms for hypergraph matching, support vector machine and bilevel optimization approach for hyperparameter selection in machine learning. With these optimization tools, one can deal with data more efficiently.</subfield></datafield><datafield tag="538" ind1=" " ind2=" "><subfield code="a">Mode of access: Internet via World Wide Web.</subfield></datafield><datafield tag="546" ind1=" " ind2=" "><subfield code="a">In English.</subfield></datafield><datafield tag="588" ind1="0" ind2=" "><subfield code="a">Description based on online resource; title from PDF title page (publisher's Web site, viewed 09. 
Dez 2023)</subfield></datafield><datafield tag="650" ind1=" " ind2="7"><subfield code="a">MATHEMATICS / Probability &amp; Statistics / Regression Analysis.</subfield><subfield code="2">bisacsh</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://doi.org/10.1051/978-2-7598-3175-3</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://www.degruyter.com/isbn/9782759831753</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="3">Cover</subfield><subfield code="u">https://www.degruyter.com/document/cover/isbn/9782759831753/original</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">EBA_CL_MTPY</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">EBA_EBKALL</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">EBA_ECL_MTPY</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">EBA_EEBKALL</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">EBA_ESTMALL</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">EBA_PPALL</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">EBA_STMALL</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV-deGruyter-alles</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">PDA12STME</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">PDA13ENGE</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">PDA18STMEE</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">PDA5EBK</subfield></datafield></record></collection>