Munich Personal RePEc Archive

Relevant States and Memory in Markov Chain Bootstrapping and Simulation

Cerqueti, Roy and Falbo, Paolo and Pelizzari, Cristian (2010): Relevant States and Memory in Markov Chain Bootstrapping and Simulation.


Abstract

Markov chain theory is proving to be a powerful approach to bootstrapping highly nonlinear time series. In this work we provide a method to estimate the memory of a Markov chain (i.e., its order) and to identify its relevant states. In particular, the choice of memory lags and the aggregation of irrelevant states are obtained by looking for regularities in the transition probabilities. Our approach is based on an optimization model. More specifically, we consider two competing objectives that a researcher will in general pursue when bootstrapping: preserving the “structural” similarity between the original and the simulated series, and ensuring a controlled diversification of the latter. A discussion based on information theory is developed to define the desirable properties of such optimal criteria. Two numerical tests verify the effectiveness of the proposed method.
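As a rough illustration of the two steps described above, the Python sketch below estimates the order of a chain with a BIC-style penalty on the number of transition parameters and then groups histories whose empirical transition rows are close in total variation distance. This is not the authors' optimization model: the BIC criterion, the tolerance tol, and the toy two-state chain are illustrative assumptions standing in for the paper's trade-off between structural similarity and diversification.

# Minimal sketch (not the authors' exact model): order estimation and
# aggregation of states with similar transition-probability rows.
from collections import Counter, defaultdict

import numpy as np


def transition_counts(series, order):
    """Count transitions from each length-`order` history to the next symbol."""
    counts = defaultdict(Counter)
    for t in range(order, len(series)):
        history = tuple(series[t - order:t])
        counts[history][series[t]] += 1
    return counts


def log_likelihood(series, order):
    """Log-likelihood of the series under the empirical order-`order` chain."""
    counts = transition_counts(series, order)
    ll = 0.0
    for _, nxt in counts.items():
        total = sum(nxt.values())
        for n in nxt.values():
            ll += n * np.log(n / total)
    return ll


def estimate_order(series, max_order=4):
    """Choose the memory by a BIC-style penalty (an illustrative criterion)."""
    k = len(set(series))
    best_order, best_score = 1, -np.inf
    for order in range(1, max_order + 1):
        n_params = (k ** order) * (k - 1)
        score = log_likelihood(series, order) - 0.5 * n_params * np.log(len(series))
        if score > best_score:
            best_order, best_score = order, score
    return best_order


def aggregate_states(series, order, tol=0.1):
    """Merge histories whose empirical transition rows differ by less than `tol`
    in total variation distance -- a crude stand-in for the paper's search for
    regularities in the transition probabilities."""
    symbols = sorted(set(series))
    counts = transition_counts(series, order)
    rows = {}
    for history, nxt in counts.items():
        total = sum(nxt.values())
        rows[history] = np.array([nxt[s] / total for s in symbols])

    groups = []  # each group collects histories sharing one representative row
    for history, row in rows.items():
        for group in groups:
            if 0.5 * np.abs(row - rows[group[0]]).sum() < tol:
                group.append(history)
                break
        else:
            groups.append([history])
    return groups


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy series: a 2-state chain where only the most recent symbol matters.
    P = np.array([[0.8, 0.2], [0.3, 0.7]])
    series = [0]
    for _ in range(5000):
        series.append(rng.choice(2, p=P[series[-1]]))

    order = estimate_order(series)
    print("estimated order:", order)
    print("aggregated state groups:", aggregate_states(series, order))

On this toy chain the sketch should recover an order of one and keep the two histories in separate groups, since their transition rows differ substantially; the paper replaces the fixed tolerance with an optimization over the two competing objectives.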

