Some structural complexity aspects of neural computation

José L. Balcázar, Ricard Gavaldà, Hava T. Siegelmann, Eduardo D. Sontag

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

12 Scopus citations

Abstract

Recent work by Siegelmann and Sontag has demonstrated that polynomial time on linear saturated recurrent neural networks equals polynomial time on standard computational models: Turing machines if the weights of the net are rationals, and nonuniform circuits if the weights are reals. Here we develop further connections between the languages recognized by such neural nets and other complexity classes. We present connections to space-bounded classes, simulation of parallel computational models such as vector machines, and a discussion of the characterizations of various nonuniform classes in terms of Kolmogorov complexity.
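The model underlying these results is the saturated-linear recurrent net of Siegelmann and Sontag: each neuron applies the piecewise-linear "saturation" activation (identity on [0, 1], clipped outside) to an affine combination of the current state and input. A minimal sketch of one synchronous update step, with illustrative weights not taken from the paper:

```python
import numpy as np

def sigma(v):
    # Saturated-linear activation: identity on [0, 1], clipped outside.
    return np.clip(v, 0.0, 1.0)

def step(x, u, A, B, c):
    # One synchronous update of a saturated-linear recurrent net:
    # x_next = sigma(A x + B u + c), the Siegelmann-Sontag model.
    # Rational entries in A, B, c give Turing power; real entries
    # give nonuniform-circuit power (per the abstract above).
    return sigma(A @ x + B @ u + c)

# Tiny example: 2 neurons, 1 input line (weights are hypothetical).
A = np.array([[0.5, 0.25],
              [0.0, 1.0]])
B = np.array([[1.0],
              [0.0]])
c = np.array([-0.25, 0.0])

x = np.zeros(2)          # initial state
u = np.array([1.0])      # input at this step
x = step(x, u, A, B, c)
```

The saturation is what separates this model from a purely linear system: clipping lets the net latch values and simulate discrete control, which is essential to the Turing-equivalence result.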

Original language: American English
Title of host publication: Proceedings of the Eighth Annual Structure in Complexity Theory Conference
Editors: Anon
Publisher: Publ by IEEE
Pages: 253-265
Number of pages: 13
ISBN (Print): 0818640715
State: Published - 1993
Event: Proceedings of the Eighth Annual Structure in Complexity Theory Conference - San Diego, California
Duration: May 18, 1993 to May 21, 1993

Publication series

Name: Proceedings of the Eighth Annual Structure in Complexity Theory Conference

Other

Other: Proceedings of the Eighth Annual Structure in Complexity Theory Conference
City: San Diego, California
Period: 5/18/93 to 5/21/93

ASJC Scopus subject areas

  • General Engineering

