
Mathematical Theories of Machine Learning - Theory and Applications [electronic resource] / by Bin Shi, S. S. Iyengar.

By: Shi, Bin [author.].
Contributor(s): Iyengar, S. S [author.] | SpringerLink (Online service).
Material type: Book
Publisher: Cham : Springer International Publishing : Imprint: Springer, 2020
Edition: 1st ed. 2020.
Description: XXI, 133 p. 25 illus., 24 illus. in color. online resource.
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9783030170769.
Subject(s): Telecommunication | Computational intelligence | Data mining | Information storage and retrieval systems | Quantitative research | Communications Engineering, Networks | Computational Intelligence | Data Mining and Knowledge Discovery | Information Storage and Retrieval | Data Analysis and Big Data
Additional physical formats: Printed edition: No title; Printed edition: No title; Printed edition: No title
DDC classification: 621.382
Online resources: Click here to access online
Contents:
Chapter 1. Introduction -- Chapter 2. General Framework of Mathematics -- Chapter 3. Problem Formulation -- Chapter 4. Development of Novel Techniques of CoCoSSC Method -- Chapter 5. Further Discussions of the Proposed Method -- Chapter 6. Related Work on Geometry of Non-Convex Programs -- Chapter 7. Gradient Descent Converges to Minimizers -- Chapter 8. A Conservation Law Method Based on Optimization -- Chapter 9. Improved Sample Complexity in Sparse Subspace Clustering with Noisy and Missing Observations -- Chapter 10. Online Discovery for Stable and Grouping Causalities in Multi-Variate Time Series -- Chapter 11. Conclusion.
In: Springer Nature eBook
Summary: This book studies mathematical theories of machine learning. The first part of the book explores the optimality and adaptivity of choosing step sizes of gradient descent for escaping strict saddle points in non-convex optimization problems. In the second part, the authors propose algorithms to find local minima in non-convex optimization and, to some degree, to obtain global minima, based on Newton's second law without friction. In the third part, the authors study the problem of subspace clustering with noisy and missing data, a problem well motivated by practical applications: data subject to stochastic Gaussian noise and/or incomplete data with uniformly missing entries. In the last part, the authors introduce a novel VAR model with Elastic-Net regularization and its equivalent Bayesian model, allowing for both stable sparsity and group selection. Provides a thorough look into the variety of mathematical theories of machine learning. Presented in four parts, allowing readers to easily navigate the complex theories. Includes extensive empirical studies on both synthetic and real-application time series data.
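
The summary's first claim, that gradient descent with a suitably small fixed step size escapes strict saddle points and converges to minimizers (cf. Chapter 7), is easy to see on a toy problem. The following Python sketch is purely illustrative and not drawn from the book; the objective function, step size, and iteration count are all assumptions chosen for demonstration.

    import numpy as np

    # Illustrative only (not the book's algorithm): gradient descent on a
    # non-convex function with a strict saddle point at the origin.
    # f(x, y) = x**4/4 - x**2/2 + y**2/2 has Hessian eigenvalues -1 and +1
    # at (0, 0) (a strict saddle) and local minimizers at (+1, 0) and (-1, 0).

    def grad(p):
        x, y = p
        return np.array([x**3 - x, y])

    rng = np.random.default_rng(0)
    p = rng.normal(scale=1e-3, size=2)  # random start near the saddle
    eta = 0.1                           # fixed step size, assumed < 1/L locally

    for _ in range(200):
        p = p - eta * grad(p)

    print(p)  # ends near (+1, 0) or (-1, 0), a minimizer, not the saddle

For the VAR model of the last part, a minimal stand-in sketch (again an assumption, not the authors' method) fits a sparse VAR(1) transition matrix with scikit-learn's MultiTaskElasticNet, whose mixed L1/L2 penalty loosely mirrors the stable-sparsity-plus-group-selection behavior described above.

    import numpy as np
    from sklearn.linear_model import MultiTaskElasticNet

    # Illustrative only: simulate a VAR(1) process with a sparse transition
    # matrix, then recover it by Elastic-Net-regularized regression of each
    # observation on its lag.
    rng = np.random.default_rng(1)
    T, d = 500, 5
    A_true = np.diag([0.8, 0.6, 0.0, 0.0, 0.5])  # sparse transition matrix
    X = np.zeros((T, d))
    for t in range(1, T):
        X[t] = X[t - 1] @ A_true.T + 0.1 * rng.normal(size=d)

    # The L1 term encourages sparsity; the L2 term stabilizes the estimates.
    model = MultiTaskElasticNet(alpha=0.01, l1_ratio=0.5, max_iter=5000)
    model.fit(X[:-1], X[1:])         # predict X[t] from X[t-1]
    print(np.round(model.coef_, 2))  # estimate of A_true; zero rows stay sparse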
No physical items for this record
