
Measures of Complexity [electronic resource] : Festschrift for Alexey Chervonenkis / edited by Vladimir Vovk, Harris Papadopoulos, Alexander Gammerman.

Contributor(s): Vovk, Vladimir [editor.] | Papadopoulos, Harris [editor.] | Gammerman, Alexander [editor.] | SpringerLink (Online service).
Material type: Book
Publisher: Cham : Springer International Publishing : Imprint: Springer, 2015
Edition: 1st ed. 2015
Description: XXXI, 399 p. 47 illus., 30 illus. in color. online resource
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9783319218526
Subject(s): Computer science | Mathematical statistics | Artificial intelligence | Mathematical optimization | Statistics | Computer Science | Artificial Intelligence (incl. Robotics) | Statistical Theory and Methods | Probability and Statistics in Computer Science | Optimization
Additional physical formats: Printed edition
DDC classification: 006.3
Online resources: Available online via SpringerLink
Contents:
Chervonenkis's Recollections -- A Paper That Created Three New Fields -- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities -- Sketched History: VC Combinatorics, 1826 up to 1975 -- Institute of Control Sciences through the Lens of VC Dimension -- VC Dimension, Fat-Shattering Dimension, Rademacher Averages, and Their Applications -- Around Kolmogorov Complexity: Basic Notions and Results -- Predictive Complexity for Games with Finite Outcome Spaces -- Making Vapnik-Chervonenkis Bounds Accurate -- Comment: Transductive PAC-Bayes Bounds Seen as a Generalization of Vapnik-Chervonenkis Bounds -- Comment: The Two Styles of VC Bounds -- Rejoinder: Making VC Bounds Accurate -- Measures of Complexity in the Theory of Machine Learning -- Classes of Functions Related to VC Properties -- On Martingale Extensions of Vapnik-Chervonenkis Theory with Applications to Online Learning -- Measuring the Capacity of Sets of Functions in the Analysis of ERM -- Algorithmic Statistics Revisited -- Justifying Information-Geometric Causal Inference -- Interpretation of Black-Box Predictive Models -- PAC-Bayes Bounds for Supervised Classification -- Bounding Embeddings of VC Classes into Maximum Classes -- Strongly Consistent Detection for Nonparametric Hypotheses -- On the Version Space Compression Set Size and Its Applications -- Lower Bounds for Sparse Coding -- Robust Algorithms via PAC-Bayes and Laplace Distributions -- Postscript: Tragic Death of Alexey Chervonenkis -- Credits -- Index.
In: Springer eBooks
Summary: This book brings together historical notes, reviews of research developments, fresh ideas on how to make VC (Vapnik-Chervonenkis) guarantees tighter, and new technical contributions in the areas of machine learning, statistical inference, classification, algorithmic statistics, and pattern recognition. The contributors are leading scientists in domains such as statistics, mathematics, and theoretical computer science, and the book will be of interest to researchers and graduate students in these domains.
