NEP: New Economics Papers on Econometrics
Issue of 2006‒08‒19, three papers chosen by Sune Karlsson, Örebro University

1.  By: BARRIENTOS-MARÍN, Jorge Abstract: This work presents a tool for testing additivity. The additive model is widely used for parametric and semiparametric modeling of economic data. The additivity hypothesis is of interest because it is easy to interpret and produces reasonably fast convergence rates for nonparametric estimators. Another advantage of additive models is that they mitigate the curse of dimensionality that arises in nonparametric estimation. Hypothesis testing is based on the well-known residual bootstrap. In the nonparametric testing literature, the dominant idea is that the bandwidth used to produce the bootstrap sample should be larger than the bandwidth used to estimate the model under the null hypothesis. However, there is so far no practical guidance on how to choose such a bandwidth. We discuss a first step towards a rule of thumb for choosing the bandwidth in this context. Our suggestions are accompanied by simulation studies. Date: 2005–04–01 URL: http://d.repec.org/n?u=RePEc:col:001065:002618&r=ecm
2.  By: Stefano Iacus (Department of Economics, Business and Statistics, University of Milan, IT); Alessandro De Gregorio (Department of Statistics, University of Padova) Abstract: The telegraph process $X(t)$, $t>0$, (Goldstein, 1951) and the geometric telegraph process $S(t) = s_0 \exp\{(\mu -\frac12\sigma^2)t + \sigma X(t)\}$ with $\mu$ a known constant and $\sigma>0$ a parameter are supposed to be observed at $n+1$ equidistant time points $t_i=i\Delta_n,i=0,1,\ldots, n$. For both models $\lambda$, the underlying rate of the Poisson process, is a parameter to be estimated. In the geometric case, $\sigma>0$ also has to be estimated. We propose different estimators of the parameters and we investigate their performance under the high-frequency asymptotics, i.e. $\Delta_n \to 0$, $n\Delta_n = T<\infty$ as $n \to \infty$, with $T>0$ fixed. The process $X(t)$ is non-Markovian, non-stationary and non-ergodic, so we use approximation arguments to derive estimators. Given the complexity of the equations involved, only the estimators for the first model can be studied analytically. Therefore, we run an extensive Monte Carlo analysis to study the performance of the proposed estimators, also for small sample sizes $n$. Keywords: telegraph process, discretely observed process, inference for stochastic processes Date: 2006–07–25 URL: http://d.repec.org/n?u=RePEc:bep:unimip:1033&r=ecm
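The high-frequency setting in this abstract can be illustrated with a short simulation. The sketch below (my own, not the authors' estimators) simulates a telegraph path by flipping a ±1 velocity at Poisson(λ) event times, samples it on the grid $t_i = i\Delta_n$, and estimates λ by counting sign changes of the observed increments divided by T; when $\Delta_n$ is small, a sign change in the increments almost surely corresponds to one Poisson switch.

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 2.0       # Poisson switching rate (the parameter to be estimated)
T = 100.0       # fixed observation horizon
n = 100_000     # number of observation intervals
dt = T / n      # Delta_n, the high-frequency step

# Poisson event (switch) times on [0, T] via exponential inter-arrival times
events = []
t = rng.exponential(1 / lam)
while t < T:
    events.append(t)
    t += rng.exponential(1 / lam)
events = np.array(events)

# velocity on the observation grid: (-1)^{N(t)}, starting at +1
t_grid = np.arange(n + 1) * dt
N_t = np.searchsorted(events, t_grid)   # Poisson counts N(t_i)
v = (-1.0) ** N_t

# telegraph path X(t_i) by cumulative (Euler) integration of the velocity
X = np.concatenate([[0.0], np.cumsum(v[:-1]) * dt])

# estimator from the discretely observed path only:
# count sign changes of increments, divide by the horizon T
incr = np.diff(X)
switches = np.sum(np.sign(incr[1:]) != np.sign(incr[:-1]))
lam_hat = switches / T
print(lam_hat)
```

This naive counting estimator misses the (rare, for small $\Delta_n$) intervals containing two switches, which is one reason the asymptotic regime $\Delta_n \to 0$ with $n\Delta_n = T$ fixed matters.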
3.  By: Maxim S. Finkelstein (Max Planck Institute for Demographic Research, Rostock, Germany); Veronica Esaulova Abstract: A bivariate competing risks problem is considered for a rather general class of survival models. The lifetime distribution of each component is indexed by a frailty parameter. Under the assumption of conditional independence of the components, the correlated frailty model is considered. An explicit asymptotic formula for the mixture failure rate of a system is derived. It is proved that, in a sense defined in the paper, the remaining lifetimes of the components tend asymptotically to be independent. Some simple examples are discussed. JEL: J1 Z0 Date: 2006–08 URL: http://d.repec.org/n?u=RePEc:dem:wpaper:wp-2006-023&r=ecm

This nep-ecm issue is ©2006 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.