Boosting : foundations and algorithms / Robert E. Schapire and Yoav Freund.
Boosting is an approach to machine learning based on the idea of creating a highly accurate predictor by combining many weak and inaccurate "rules of thumb." A remarkably rich theory has evolved around boosting, with connections to a range of topics, including statistics, game theory, conv...
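The abstract's core idea, combining many weak "rules of thumb" into one accurate predictor, is what AdaBoost (treated in the book's early chapters) does concretely. The following is a minimal illustrative sketch, not code from the book: it assumes 1-D decision stumps as the weak learners, and all names (`train_adaboost`, `stump_predict`, the toy dataset) are made up for this example.

```python
import math

def stump_predict(threshold, polarity, x):
    # A weak "rule of thumb": predict +1 or -1 by thresholding a single value.
    return polarity if x >= threshold else -polarity

def train_adaboost(X, y, rounds):
    """Minimal AdaBoost sketch on 1-D threshold stumps (labels are +1/-1)."""
    n = len(X)
    w = [1.0 / n] * n                      # start with uniform example weights
    ensemble = []                          # list of (alpha, threshold, polarity)
    candidates = sorted(set(X))
    for _ in range(rounds):
        # Pick the stump with the lowest weighted training error.
        best = None
        for t in candidates:
            for pol in (+1, -1):
                err = sum(wi for wi, xi, yi in zip(w, X, y)
                          if stump_predict(t, pol, xi) != yi)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = min(max(err, 1e-10), 1 - 1e-10)    # guard against log(0)
        alpha = 0.5 * math.log((1 - err) / err)  # this stump's vote weight
        ensemble.append((alpha, t, pol))
        # Re-weight: increase weight on misclassified examples, then normalize.
        w = [wi * math.exp(-alpha * yi * stump_predict(t, pol, xi))
             for wi, xi, yi in zip(w, X, y)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    # Final prediction: sign of the weighted vote of all weak stumps.
    score = sum(a * stump_predict(t, pol, x) for a, t, pol in ensemble)
    return 1 if score >= 0 else -1
```

On a toy labeling like `y = [1, 1, -1, -1, 1, 1]` over `X = [1, 2, 3, 4, 5, 6]`, no single stump can be correct everywhere, but a three-round ensemble classifies every training point correctly, illustrating the "weak rules combine into a strong predictor" claim in the abstract.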
Superior document: Adaptive computation and machine learning series
Author: Robert E. Schapire; Yoav Freund
Place / Publishing House: Cambridge, Massachusetts : MIT Press, c2012; [Piscataway, New Jersey] : IEEE Xplore, [2012]
Year of Publication: 2012
Language: English
Series: Adaptive computation and machine learning
Physical Description: 1 online resource (544 p.)
Notes: Bibliographic Level Mode of Issuance: Monograph
Table of Contents:
- Foundations of machine learning
- Using AdaBoost to minimize training error
- Direct bounds on the generalization error
- The margins explanation for boosting's effectiveness
- Game theory, online learning, and boosting
- Loss minimization and generalizations of boosting
- Boosting, convex optimization, and information geometry
- Using confidence-rated weak predictions
- Multiclass classification problems
- Learning to rank
- Attaining the best possible accuracy
- Optimally efficient boosting
- Boosting in continuous time.