
Foundations of machine learning / Mehryar Mohri, Afshin Rostamizadeh, Ameet Talwalkar.

By: Mohri, Mehryar
Contributor(s): Rostamizadeh, Afshin | Talwalkar, Ameet
Material type: Text
Series: Adaptive computation and machine learning
Publisher: Cambridge, Massachusetts ; London, England : The MIT Press, [2018]
Edition: Second edition
Description: xv, 486 pages : illustrations (some colour) ; 24 cm
Content type: text
Media type: unmediated
Carrier type: volume
ISBN: 9780262039406
DDC classification: 006.31 23
Holdings

Item type: Books
Current library: Marbella International University Centre Library
Call number: 006.31 MOH fou
Status: Available
Barcode: 12222


Includes bibliographical references (pages 461-474) and index.

Contents:

1. Introduction --
2. The PAC Learning Framework --
3. Rademacher Complexity and VC-Dimension --
4. Model Selection --
5. Support Vector Machines --
6. Kernel Methods --
7. Boosting --
8. On-line Learning --
9. Multi-Class Classification --
10. Ranking --
11. Regression --
12. Maximum Entropy --
13. Conditional Maximum Entropy Models --
14. Algorithmic Stability --
15. Dimensionality Reduction --
16. Learning Automata and Languages --
17. Reinforcement Learning --
Conclusion --
A. Linear Algebra Review --
B. Convex Optimization --
C. Probability Review --
D. Concentration Inequalities --
E. Notions of Information Theory --
F. Notation.

Summary:

A new edition of a graduate-level machine learning textbook that focuses on the analysis and theory of algorithms.

This book is a general introduction to machine learning that can serve as a textbook for graduate students and a reference for researchers. It covers fundamental modern topics in machine learning while providing the theoretical basis and conceptual tools needed for the discussion and justification of algorithms. It also describes several key aspects of the application of these algorithms. The authors aim to present novel theoretical tools and concepts while giving concise proofs even for relatively advanced topics.

Foundations of Machine Learning is unique in its focus on the analysis and theory of algorithms. The first four chapters lay the theoretical foundation for what follows; subsequent chapters are mostly self-contained. Topics covered include the Probably Approximately Correct (PAC) learning framework; generalization bounds based on Rademacher complexity and VC-dimension; Support Vector Machines (SVMs); kernel methods; boosting; on-line learning; multi-class classification; ranking; regression; algorithmic stability; dimensionality reduction; learning automata and languages; and reinforcement learning. Each chapter ends with a set of exercises. Appendixes provide additional material including a concise probability review.

This second edition offers three new chapters on model selection, maximum entropy models, and conditional maximum entropy models. New material in the appendixes includes a major section on Fenchel duality, expanded coverage of concentration inequalities, and an entirely new entry on information theory. More than half of the exercises are new to this edition.


