Computational Statistics in Data Science. John Wiley & Sons. ISBN 9781119561088.
$$\mathrm{Gamma}\!\left(\frac{n}{2},\; \frac{1}{2}\sum_{i=1}^{8}\bigl(y_i - \beta_1\bigl(1 - e^{-\beta_2 x_i}\bigr)\bigr)^2\right)$$

      and

$$\pi(\beta \mid y) \propto \left|V^{T} V\right|^{1/2} \left(\sum_{i=1}^{8}\bigl(y_i - \beta_1\bigl(1 - e^{-\beta_2 x_i}\bigr)\bigr)^2\right)^{-n/2}$$
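The two conditionals above suggest how a linchpin-variable sampler proceeds: update $\beta$ with a Metropolis step targeting the marginal $\pi(\beta \mid y)$, then draw the precision directly from its Gamma full conditional. The following is a minimal sketch, not the authors' implementation: the data `x`, `y` are hypothetical stand-ins for the eight observations, and $V$ is assumed to denote the Jacobian of the mean function $\beta_1(1 - e^{-\beta_2 x_i})$ with respect to $\beta$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data standing in for the 8 observations of the example.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
y = np.array([0.47, 0.74, 0.95, 1.06, 1.14, 1.17, 1.19, 1.20])
n = len(y)

def ssr(beta):
    """Sum of squared residuals for the nonlinear mean function."""
    b1, b2 = beta
    return np.sum((y - b1 * (1.0 - np.exp(-b2 * x))) ** 2)

def log_marginal(beta):
    """log pi(beta | y) up to a constant; V is assumed to be the
    Jacobian of the mean function (an assumption about the notation)."""
    b1, b2 = beta
    if b1 <= 0 or b2 <= 0:
        return -np.inf
    e = np.exp(-b2 * x)
    V = np.column_stack([1.0 - e, b1 * x * e])
    sign, logdet = np.linalg.slogdet(V.T @ V)
    if sign <= 0:
        return -np.inf
    return 0.5 * logdet - (n / 2.0) * np.log(ssr(beta))

def linchpin_sampler(iters, beta0, step=0.05):
    beta = np.asarray(beta0, dtype=float)
    lp = log_marginal(beta)
    draws = np.empty((iters, 3))  # columns: beta1, beta2, lambda
    for t in range(iters):
        # Random-walk Metropolis update for beta on the marginal.
        prop = beta + step * rng.standard_normal(2)
        lp_prop = log_marginal(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            beta, lp = prop, lp_prop
        # Exact draw: lambda | beta, y ~ Gamma(n/2, rate = SSR/2),
        # so NumPy's scale parameter is 2/SSR.
        lam = rng.gamma(shape=n / 2.0, scale=2.0 / ssr(beta))
        draws[t] = (beta[0], beta[1], lam)
    return draws

draws = linchpin_sampler(2000, beta0=(1.2, 0.5))
print(draws.mean(axis=0))
```

Because the precision is drawn exactly rather than by another Metropolis step, only the two-dimensional $\beta$ update contributes rejection noise, which is one reason such samplers tend to mix better than a full Metropolis scheme.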

      Repeating the previous procedure with the linchpin sampler, the estimated ESS at n* = 8123 is 652, and the sequential stopping rule terminates at n = 183,122. The resulting estimates of the posterior mean and quantiles are similar. Thus, using a more efficient sampler requires substantially fewer iterations to obtain estimates of similar quality.
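The procedure described above can be sketched generically: estimate ESS from the current output, and keep extending the chain until the estimate reaches a target. This is a minimal illustration, not the chapter's code; it uses a simple nonoverlapping batch-means estimator of the asymptotic variance and, in place of real MCMC output, a hypothetical AR(1) chain whose autocorrelation mimics a slowly mixing sampler.

```python
import numpy as np

def ess_batch_means(chain):
    """Univariate ESS = n * lambda^2 / sigma^2, with sigma^2 from
    nonoverlapping batch means of size roughly sqrt(n)."""
    n = len(chain)
    b = max(2, int(np.sqrt(n)))            # batch size
    a = n // b                             # number of batches
    means = chain[:a * b].reshape(a, b).mean(axis=1)
    sigma2 = b * np.var(means, ddof=1)     # asymptotic variance estimate
    lam2 = np.var(chain, ddof=1)           # stationary variance estimate
    return n * lam2 / sigma2

def run_until_ess(sample_more, target_ess, start=1000, chunk=1000,
                  max_n=200_000):
    """Sequential stopping: extend the chain until estimated ESS
    reaches target_ess (or a hard iteration cap is hit)."""
    chain = sample_more(start)
    while ess_batch_means(chain) < target_ess and len(chain) < max_n:
        chain = np.concatenate([chain, sample_more(chunk)])
    return chain

def make_ar1(rho=0.9, seed=0):
    """AR(1) generator standing in for correlated MCMC output."""
    rng = np.random.default_rng(seed)
    state = 0.0
    def sample_more(m):
        nonlocal state
        out = np.empty(m)
        for i in range(m):
            state = rho * state + rng.standard_normal()
            out[i] = state
        return out
    return sample_more

chain = run_until_ess(make_ar1(), target_ess=500)
print(len(chain))
```

A more autocorrelated chain yields a smaller ESS per iteration, so the stopping rule runs longer, which matches the comparison in the text: the less efficient sampler needed far more iterations to reach comparable precision.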

