Korobilis, Dimitris and Pettenuzzo, Davide (2020): Machine Learning Econometrics: Bayesian algorithms and methods.
Abstract
As the amount of economic and other data generated worldwide grows rapidly, a challenge for future generations of econometricians will be to master efficient algorithms for inference in empirical models with large information sets. This chapter reviews popular estimation algorithms for Bayesian inference in econometrics and surveys alternative algorithms developed in machine learning and computer science that allow for efficient computation in high-dimensional settings. The focus is on the scalability and parallelizability of each algorithm, as well as on their applicability to various empirical settings in economics and finance.
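To make the abstract's themes of MCMC and parallel computation concrete, below is a minimal, purely illustrative Python sketch (not taken from the chapter) that runs several independent random-walk Metropolis-Hastings chains in parallel. The toy target density, step size, chain count, and all names used are assumptions chosen only for this example.

```python
# Illustrative sketch (not from the chapter): a random-walk Metropolis-Hastings
# sampler for a toy posterior, with several independent chains run in parallel.
# The target density, step size, and chain settings are arbitrary choices
# made purely for demonstration.
import numpy as np
from multiprocessing import Pool


def log_posterior(theta):
    """Toy log-posterior: standard normal, up to an additive constant."""
    return -0.5 * theta ** 2


def rw_metropolis(seed, n_draws=5000, step=1.0):
    """One random-walk Metropolis-Hastings chain, started at zero."""
    rng = np.random.default_rng(seed)
    draws = np.empty(n_draws)
    theta, logp = 0.0, log_posterior(0.0)
    for i in range(n_draws):
        proposal = theta + step * rng.standard_normal()
        logp_prop = log_posterior(proposal)
        # Accept with probability min(1, posterior ratio), on the log scale.
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = proposal, logp_prop
        draws[i] = theta
    return draws


if __name__ == "__main__":
    # Independent chains are embarrassingly parallel: one worker per chain.
    with Pool(processes=4) as pool:
        chains = pool.map(rw_metropolis, range(4))
    print("posterior mean estimate:", np.mean(chains))
```

Running independent chains is the simplest embarrassingly parallel strategy; the separate chains can also be compared against one another to monitor convergence.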
| Item Type: | MPRA Paper |
|---|---|
| Original Title: | Machine Learning Econometrics: Bayesian algorithms and methods |
| Language: | English |
| Keywords: | MCMC; approximate inference; scalability; parallel computation |
| Subjects: | C11 - Bayesian Analysis: General; C15 - Statistical Simulation Methods: General; C49 - Econometric and Statistical Methods, Special Topics: Other; C88 - Other Computer Software |
| Item ID: | 100165 |
| Depositing User: | Dimitris Korobilis |
| Date Deposited: | 06 May 2020 14:14 |
| Last Modified: | 06 May 2020 14:14 |
| URI: | https://mpra.ub.uni-muenchen.de/id/eprint/100165 |