Optimal nonparametric inference via deep neural network

Ruiqi Liu, Ben Boukai, Zuofeng Shang

Research output: Contribution to journal › Article › peer-review

Abstract

Deep neural networks are a state-of-the-art method in modern science and technology. Much statistical literature has been devoted to understanding their performance in nonparametric estimation, but the resulting risk bounds are suboptimal due to a redundant logarithmic factor. In this paper, we show that such log-factors are unnecessary. We derive upper bounds for the L2 minimax risk in nonparametric estimation and provide sufficient conditions on the network architecture under which these bounds become optimal (free of the log-sacrifice). Our proof relies on an explicitly constructed network estimator based on tensor product B-splines. We also derive the asymptotic distribution of the constructed network estimator and a related hypothesis testing procedure. The testing procedure is further shown to be minimax optimal under suitable network architectures.
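The abstract's key construction is a network estimator built from tensor product B-splines. As a point of reference, the following is a minimal sketch of classical tensor product B-spline regression in Python (NumPy/SciPy), not the paper's network construction; the regression function f, sample size n, and basis dimension J are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import BSpline

def bspline_basis(x, n_basis, degree=3):
    """Evaluate an open-uniform (clamped) B-spline basis on [0, 1] at points x.
    Returns a (len(x), n_basis) design matrix; requires n_basis >= degree + 1."""
    n_interior = n_basis - degree - 1
    interior = np.linspace(0.0, 1.0, n_interior + 2)[1:-1]
    knots = np.r_[np.zeros(degree + 1), interior, np.ones(degree + 1)]
    B = np.zeros((len(x), n_basis))
    for j in range(n_basis):
        coef = np.zeros(n_basis)
        coef[j] = 1.0
        B[:, j] = BSpline(knots, coef, degree, extrapolate=False)(x)
    return np.nan_to_num(B)  # map any out-of-domain NaNs to 0

rng = np.random.default_rng(0)
n, J = 2000, 8                       # illustrative sample size and basis dimension
x = rng.uniform(size=(n, 2))         # bivariate covariates on [0, 1]^2
f = lambda u: np.sin(2 * np.pi * u[:, 0]) * np.cos(np.pi * u[:, 1])  # toy truth
y = f(x) + 0.1 * rng.standard_normal(n)

# Tensor product design: row-wise Kronecker product of the two marginal bases.
B1, B2 = bspline_basis(x[:, 0], J), bspline_basis(x[:, 1], J)
D = (B1[:, :, None] * B2[:, None, :]).reshape(n, J * J)

coef, *_ = np.linalg.lstsq(D, y, rcond=None)   # least-squares spline fit
print("in-sample RMSE vs. truth:", np.sqrt(np.mean((D @ coef - f(x)) ** 2)))
```

The paper's contribution is to realize such spline estimators exactly within a deep network architecture, which is how the log-factor-free minimax rates are obtained; the sketch above only illustrates the underlying spline regression step.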

Original language: English (US)
Article number: 125561
Journal: Journal of Mathematical Analysis and Applications
Volume: 505
Issue number: 2
DOIs
State: Published - Jan 15 2022

All Science Journal Classification (ASJC) codes

  • Analysis
  • Applied Mathematics

Keywords

  • Asymptotic distribution
  • Deep neural network
  • Nonparametric inference
  • Nonparametric testing
  • Optimal minimax risk bound
  • Tensor product B-splines
