By: |
Roy Cerqueti (University of Macerata);
Paolo Falbo (University of Brescia);
Cristian Pelizzari (University of Brescia) |
Abstract: |
While a large portion of the literature on Markov chain (possibly of order higher than one) bootstrap methods has focused on the correct estimation of the transition probabilities, little or no attention has been devoted to the problem of estimating the dimension of the transition probability matrix. Indeed, it is usual to assume that the Markov chain has a one-step memory property and that the state space cannot be clustered and coincides with the distinct observed values. In this paper we question the appropriateness of such a standard approach. In particular, we advance a method to jointly estimate the order of the Markov chain and identify a suitable clustering of the states. Indeed, in several real-life applications the "memory" of many processes extends well beyond the last observation; in those cases a correct representation of past trajectories requires a significantly richer set than the state space. On the contrary, it can sometimes happen that some distinct values do not correspond to really "different" states of a process; this is a common conclusion whenever, for example, a process assuming two distinct values at time t is not affected in its distribution at time t+1. Such a situation would suggest reducing the dimension of the transition probability matrix. Our method is based on solving two optimization problems. More specifically, we consider two competing objectives that a researcher will in general pursue when dealing with bootstrapping: preserving the similarity between the observed and the bootstrap series, and reducing the probability of obtaining a perfect replication of the original sample. A brief axiomatic discussion is developed to define the desirable properties of such optimal criteria. Two numerical examples are presented to illustrate the method. |
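For orientation, the standard setup the abstract questions can be sketched as a first-order Markov chain bootstrap: estimate one-step transition probabilities over the distinct observed values, then simulate a replicate series from the fitted chain. This is a minimal illustrative sketch only; the paper's actual contribution, jointly optimizing the chain's order and a clustering of the states, is not reproduced here, and all function names are hypothetical.

```python
import random

def estimate_transition_matrix(series):
    """Estimate one-step transition probabilities from an observed series.

    States are taken to be the distinct observed values, as in the
    standard approach the paper questions.
    """
    states = sorted(set(series))
    counts = {s: {t: 0 for t in states} for s in states}
    for a, b in zip(series, series[1:]):
        counts[a][b] += 1
    probs = {}
    for s in states:
        total = sum(counts[s].values())
        # Uniform fallback for a state with no observed successor.
        probs[s] = {t: (counts[s][t] / total if total else 1 / len(states))
                    for t in states}
    return probs

def bootstrap_series(series, length, seed=0):
    """Generate one bootstrap replicate by simulating the fitted chain."""
    rng = random.Random(seed)
    probs = estimate_transition_matrix(series)
    state = rng.choice(series)
    out = [state]
    for _ in range(length - 1):
        row = probs[state]
        state = rng.choices(list(row), weights=list(row.values()))[0]
        out.append(state)
    return out

observed = [0, 1, 1, 0, 2, 1, 0, 0, 1, 2]
replicate = bootstrap_series(observed, len(observed))
```

The tension the abstract describes shows up directly here: a richer state space (higher order, or finer states) makes replicates more similar to the observed series, but also raises the chance of reproducing the original sample exactly.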
Keywords: |
order of Markov chains, similarity of time series, transition probability matrices, multiplicity of time series, partition of states of Markov chains, Markov chains, bootstrap methods
JEL: |
C14 C15 C61 |
Date: |
2009–04 |
URL: |
http://d.repec.org/n?u=RePEc:mcr:wpdief:wpaper00053&r=ets |