A cluster-sample approach for Monte Carlo integration using multiple samplers

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

A computational problem in many fields is to estimate simultaneously multiple integrals and expectations, assuming that the data are generated by some Monte Carlo algorithm. Consider two scenarios in which draws are simulated from multiple distributions but the normalizing constants of those distributions may be known or unknown. For each scenario, existing estimators can be classified as using individual samples separately or using all the samples jointly. The latter pooled-sample estimators are statistically more efficient but computationally more costly to evaluate than the separate-sample estimators. We develop a cluster-sample approach to obtain computationally effective estimators, after draws are generated for each scenario. We divide all the samples into mutually exclusive clusters and combine samples from each cluster separately. Furthermore, we exploit a relationship between estimators based on samples from different clusters to achieve variance reduction. The resulting estimators, compared with the pooled-sample estimators, typically yield similar statistical efficiency but have reduced computational cost. We illustrate the value of the new approach by two examples for an Ising model and a censored Gaussian random field.
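As a rough, purely illustrative sketch (not the estimators developed in the paper), the Python snippet below contrasts separate-sample, pooled-sample, and cluster-sample estimators for a single integral in the simplest setting where every sampler's density is fully known (the normalizing-constants-known scenario) and plain importance sampling applies. All names (props, cluster_size, the Gaussian proposals, the target) are invented for the illustration, and the paper's control-variate step linking clusters is not reproduced.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Target density p (standard normal) and integrand f; the true value of
# mu = integral of f(x) p(x) dx is 1 in this toy setup.
p = norm(0.0, 1.0)
def f(x):
    return x ** 2

# m samplers with fully known (normalized) Gaussian densities.
m, n = 8, 2000                       # number of samplers, draws per sampler
props = [norm(loc, 1.2) for loc in np.linspace(-1.5, 1.5, m)]
draws = [q.rvs(size=n, random_state=rng) for q in props]

def separate_sample():
    # Each sampler used on its own: one density evaluation per draw.
    ests = [np.mean(f(x) * p.pdf(x) / q.pdf(x)) for x, q in zip(draws, props)]
    return np.mean(ests)

def pooled_sample():
    # All samplers combined through their mixture: m density evaluations per draw.
    ests = []
    for x in draws:
        mix = np.mean([q.pdf(x) for q in props], axis=0)
        ests.append(np.mean(f(x) * p.pdf(x) / mix))
    return np.mean(ests)

def cluster_sample(cluster_size=2):
    # Samplers split into mutually exclusive clusters; draws are combined only
    # within a cluster, so only cluster_size density evaluations per draw.
    ests = []
    for start in range(0, m, cluster_size):
        cluster = list(range(start, start + cluster_size))
        for j in cluster:
            mix = np.mean([props[k].pdf(draws[j]) for k in cluster], axis=0)
            ests.append(np.mean(f(draws[j]) * p.pdf(draws[j]) / mix))
    return np.mean(ests)

print("separate-sample:", separate_sample())
print("pooled-sample:  ", pooled_sample())
print("cluster-sample: ", cluster_sample())

In this sketch the pooled estimator evaluates every proposal density at every draw (cost growing with the number of samplers), while the cluster version evaluates densities only within each cluster; that is the computational saving the abstract refers to, although the paper's actual estimators and variance-reduction step are more involved.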

Original language: English (US)
Pages (from-to): 151-173
Number of pages: 23
Journal: Canadian Journal of Statistics
Volume: 41
Issue number: 1
DOIs: https://doi.org/10.1002/cjs.11147
State: Published - Mar 1 2013

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Keywords

  • Bridge sampling
  • Control variate
  • Importance sampling
  • Markov chain Monte Carlo
  • Normalizing constant
  • Path sampling

Cite this

@article{39b2b4581b4e4c4b950bd0619ee6d934,
title = "A cluster-sample approach for Monte Carlo integration using multiple samplers",
abstract = "A computational problem in many fields is to estimate simultaneously multiple integrals and expectations, assuming that the data are generated by some Monte Carlo algorithm. Consider two scenarios in which draws are simulated from multiple distributions but the normalizing constants of those distributions may be known or unknown. For each scenario, existing estimators can be classified as using individual samples separately or using all the samples jointly. The latter pooled-sample estimators are statistically more efficient but computationally more costly to evaluate than the separate-sample estimators. We develop a cluster-sample approach to obtain computationally effective estimators, after draws are generated for each scenario. We divide all the samples into mutually exclusive clusters and combine samples from each cluster separately. Furthermore, we exploit a relationship between estimators based on samples from different clusters to achieve variance reduction. The resulting estimators, compared with the pooled-sample estimators, typically yield similar statistical efficiency but have reduced computational cost. We illustrate the value of the new approach by two examples for an Ising model and a censored Gaussian random field.",
keywords = "Bridge sampling, Control variate, Importance sampling, Markov chain Monte Carlo, Normalizing constant, Path sampling",
author = "Zhiqiang Tan",
year = "2013",
month = "3",
day = "1",
doi = "10.1002/cjs.11147",
language = "English (US)",
volume = "41",
pages = "151--173",
journal = "Canadian Journal of Statistics",
issn = "0319-5724",
publisher = "Wiley-Blackwell",
number = "1",

}

A cluster-sample approach for Monte Carlo integration using multiple samplers. / Tan, Zhiqiang.

In: Canadian Journal of Statistics, Vol. 41, No. 1, 01.03.2013, p. 151-173.

Research output: Contribution to journal › Article

TY - JOUR

T1 - A cluster-sample approach for Monte Carlo integration using multiple samplers

AU - Tan, Zhiqiang

PY - 2013/3/1

Y1 - 2013/3/1

N2 - A computational problem in many fields is to estimate simultaneously multiple integrals and expectations, assuming that the data are generated by some Monte Carlo algorithm. Consider two scenarios in which draws are simulated from multiple distributions but the normalizing constants of those distributions may be known or unknown. For each scenario, existing estimators can be classified as using individual samples separately or using all the samples jointly. The latter pooled-sample estimators are statistically more efficient but computationally more costly to evaluate than the separate-sample estimators. We develop a cluster-sample approach to obtain computationally effective estimators, after draws are generated for each scenario. We divide all the samples into mutually exclusive clusters and combine samples from each cluster separately. Furthermore, we exploit a relationship between estimators based on samples from different clusters to achieve variance reduction. The resulting estimators, compared with the pooled-sample estimators, typically yield similar statistical efficiency but have reduced computational cost. We illustrate the value of the new approach by two examples for an Ising model and a censored Gaussian random field.

AB - A computational problem in many fields is to estimate simultaneously multiple integrals and expectations, assuming that the data are generated by some Monte Carlo algorithm. Consider two scenarios in which draws are simulated from multiple distributions but the normalizing constants of those distributions may be known or unknown. For each scenario, existing estimators can be classified as using individual samples separately or using all the samples jointly. The latter pooled-sample estimators are statistically more efficient but computationally more costly to evaluate than the separate-sample estimators. We develop a cluster-sample approach to obtain computationally effective estimators, after draws are generated for each scenario. We divide all the samples into mutually exclusive clusters and combine samples from each cluster separately. Furthermore, we exploit a relationship between estimators based on samples from different clusters to achieve variance reduction. The resulting estimators, compared with the pooled-sample estimators, typically yield similar statistical efficiency but have reduced computational cost. We illustrate the value of the new approach by two examples for an Ising model and a censored Gaussian random field.

KW - Bridge sampling

KW - Control variate

KW - Importance sampling

KW - Markov chain Monte Carlo

KW - Normalizing constant

KW - Path sampling

UR - http://www.scopus.com/inward/record.url?scp=84873991109&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84873991109&partnerID=8YFLogxK

U2 - 10.1002/cjs.11147

DO - 10.1002/cjs.11147

M3 - Article

VL - 41

SP - 151

EP - 173

JO - Canadian Journal of Statistics

JF - Canadian Journal of Statistics

SN - 0319-5724

IS - 1

ER -