Parberry, Ian.

Circuit complexity and neural networks / Ian Parberry. - 1 PDF (xxix, 270 pages) : illustrations. - Foundations of computing.

Includes bibliographical references (p. [251]-257) and index.

Restricted to subscribers or individual electronic text purchasers.

Neural networks usually work adequately on small problems but can run into trouble when they are scaled up to problems involving large amounts of input data. Circuit Complexity and Neural Networks addresses the important question of how well neural networks scale - that is, how fast the computation time and number of neurons grow as the problem size increases. It surveys recent research in circuit complexity (a robust branch of theoretical computer science) and applies this work to a theoretical understanding of the problem of scalability.

Most research in neural networks focuses on learning, yet it is important to understand the physical limitations of the network before the resources needed to solve a certain problem can be calculated. One of the aims of this book is to compare the complexity of neural networks and the complexity of conventional computers, looking at the computational ability and resources (neurons and time) that are a necessary part of the foundations of neural network learning.

Circuit Complexity and Neural Networks contains a significant amount of background material on conventional complexity theory that will enable neural network scientists to learn about how complexity theory applies to their discipline, and allow complexity theorists to see how their discipline applies to neural networks.




Mode of access: World Wide Web

ISBN 9780262281249


Logic circuits.
Computational complexity.
Neural networks (Computer science).


Electronic books.

QA76.87 .P38 1994eb