RANDOM NEURAL NETWORKS IN THE INFINITE WIDTH LIMIT AS GAUSSIAN PROCESSES

Research output: Contribution to journal › Article › peer-review

Abstract

This article gives a new proof that fully connected neural networks with random weights and biases converge to Gaussian processes in the regime where the input dimension, output dimension, and depth are held fixed while the hidden-layer widths tend to infinity. Unlike prior work, convergence is established assuming only moment conditions on the distribution of the weights, and it holds for quite general nonlinearities.
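As an informal illustration of the regime the abstract describes (not taken from the article itself: the random_net helper is hypothetical, and the Gaussian initialization with 1/sqrt(fan_in) scaling is just one choice satisfying the stated moment conditions), the following sketch resamples a fixed input through many independently initialized networks and checks that the scalar output looks Gaussian once the hidden widths are large:

    import numpy as np

    def random_net(x, n_hidden, width, d_out, rng, phi=np.tanh):
        """Fully connected network with i.i.d. Gaussian weights scaled by
        1/sqrt(fan_in), i.i.d. standard Gaussian biases, the elementwise
        nonlinearity phi between layers, and a linear readout."""
        h = x
        for _ in range(n_hidden):
            fan_in = h.shape[0]
            W = rng.normal(size=(width, fan_in)) / np.sqrt(fan_in)
            b = rng.normal(size=width)
            h = phi(W @ h + b)
        fan_in = h.shape[0]
        W = rng.normal(size=(d_out, fan_in)) / np.sqrt(fan_in)
        return W @ h + rng.normal(size=d_out)

    # Fix one input and resample the network: input dimension (3), output
    # dimension (1), and depth (2 hidden layers) stay fixed while the
    # hidden width is large, matching the regime of the theorem.
    rng = np.random.default_rng(0)
    x = np.ones(3) / np.sqrt(3.0)
    samples = np.array(
        [random_net(x, n_hidden=2, width=400, d_out=1, rng=rng)[0]
         for _ in range(1000)]
    )

    # For a Gaussian limit, the standardized third and fourth moments
    # should approach 0 and 3, respectively.
    z = (samples - samples.mean()) / samples.std()
    print(f"skewness {np.mean(z**3):+.3f}  excess kurtosis {np.mean(z**4) - 3:+.3f}")

Per the abstract, replacing rng.normal with any centered weight distribution having enough finite moments (e.g. Rademacher signs) should leave the Gaussian limit unchanged; Gaussian weights are used above only for concreteness.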

Original language: American English
Pages (from-to): 4798-4819
Number of pages: 22
Journal: Annals of Applied Probability
Volume: 33
Issue number: 6A
DOIs
State: Published - Dec 2023

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Keywords

  • Gaussian processes
  • Neural networks
  • limit theorems
