New Economics Papers on Econometrics
By: | Stan Hurn (QUT); Andrew McClelland (QUT); Kenneth Lindsay (University of Glasgow) |
Abstract: | This paper develops a quasi-maximum likelihood (QML) procedure for estimating the parameters of multi-dimensional stochastic differential equations. The transitional density is taken to be a time-varying multivariate Gaussian whose first two moments approximate the true moments of the unknown transitional density. For affine drift and diffusion functions, the moments are shown to be exactly those of the true transitional density, and for nonlinear drift and diffusion functions the approximation is extremely good. The estimation procedure generalizes easily to models with latent factors, such as the stochastic volatility class of models. The QML method is as effective as alternative methods when proxy variables are used for unobserved states. A conditioning estimation procedure is also developed that allows parameter estimation in the absence of proxies. |
Keywords: | stochastic differential equations, parameter estimation, quasi-maximum likelihood, moments |
JEL: | C22 C52 |
Date: | 2010–10–28 |
URL: | http://d.repec.org/n?u=RePEc:qut:auncer:2010_12&r=ecm |
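A minimal sketch of the kind of moment-based Gaussian QML the abstract describes, using a one-dimensional Ornstein-Uhlenbeck process, for which the drift and diffusion are affine and the conditional mean and variance are therefore exact. The parameter values and simulated data are illustrative assumptions, not the authors' code.

```python
import numpy as np
from scipy.optimize import minimize

def ou_qml_negloglik(params, x, dt):
    """Gaussian quasi-likelihood for dX = kappa*(mu - X) dt + sigma dW.
    For this affine SDE the conditional Gaussian moments are exact."""
    kappa, mu, sigma = params
    if kappa <= 0 or sigma <= 0:
        return np.inf
    x0, x1 = x[:-1], x[1:]
    mean = mu + (x0 - mu) * np.exp(-kappa * dt)                    # conditional mean
    var = sigma**2 * (1 - np.exp(-2 * kappa * dt)) / (2 * kappa)   # conditional variance
    return 0.5 * np.sum(np.log(2 * np.pi * var) + (x1 - mean) ** 2 / var)

# Illustrative use on simulated data (true kappa=0.5, mu=0.0, sigma=0.2)
rng = np.random.default_rng(0)
dt, n = 1.0 / 252, 5000
x = np.zeros(n)
for t in range(1, n):
    m = 0.0 + (x[t - 1] - 0.0) * np.exp(-0.5 * dt)
    v = 0.2**2 * (1 - np.exp(-2 * 0.5 * dt)) / (2 * 0.5)
    x[t] = m + np.sqrt(v) * rng.standard_normal()

fit = minimize(ou_qml_negloglik, x0=[1.0, 0.1, 0.1], args=(x, dt), method="Nelder-Mead")
print(fit.x)  # estimates of (kappa, mu, sigma)
```

For nonlinear drift or diffusion, the same likelihood would be evaluated with approximate rather than exact conditional moments, which is the case the paper addresses.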
By: | Raquel Carrasco (Department of Economics, Universidad Carlos III de Madrid); José Ignacio García Pérez (Department of Economics, Universidad Pablo de Olavide) |
Abstract: | This paper considers the estimation of discrete-time duration models. We highlight the enhanced identification opportunities embedded in multiple-spell data, which allow the effects of duration dependence and individual time-invariant unobserved heterogeneity to be separately identified. We consider two types of models: (i) random effects models specifying a mass-point distribution for the unobserved heterogeneity; and (ii) fixed effects models in which the distribution of the effects is left unrestricted. The availability of multiple-spell data allows us to consider the latter type of model, in the spirit of fixed effects discrete choice panel data models. We study the finite sample properties of different estimators for these models by means of Monte Carlo simulations. Finally, as an empirical illustration, we estimate unemployment duration models using Spanish administrative data with information on the entire labor history of the individuals. |
Keywords: | Duration models; Discrete choice; Multiple spells; Unobserved heterogeneity; Unemployment. |
JEL: | J64 J61 C23 C41 J65 |
Date: | 2010–11 |
URL: | http://d.repec.org/n?u=RePEc:pab:wpaper:10.11&r=ecm |
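A hypothetical sketch of the random-effects variant described in the abstract: a discrete-time logit hazard with piecewise-constant duration dependence and a two-mass-point heterogeneity distribution, with the likelihood marginalized over types for each individual's multiple spells. The parameterization, the toy data, and the assumption that all spells are completed are illustrative choices, not the authors' estimator.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit, logsumexp

def neg_loglik(theta, spells):
    """Two-type mass-point logit hazard with duration-dependence intercepts
    gamma for t = 1, 2, 3+. spells: list of lists of completed durations
    per individual (all spells assumed to end in an exit)."""
    gamma = theta[:3]        # duration-dependence intercepts
    v2 = theta[3]            # second mass point (first normalized to 0)
    p2 = expit(theta[4])     # probability of the second type
    ll = 0.0
    for durations in spells:
        log_contrib = np.zeros(2)      # per-type log-likelihood of this person's spells
        for k, v in enumerate((0.0, v2)):
            for d in durations:
                for t in range(1, d + 1):
                    h = expit(gamma[min(t, 3) - 1] + v)
                    log_contrib[k] += np.log(h) if t == d else np.log1p(-h)
        ll += logsumexp(log_contrib, b=np.array([1 - p2, p2]))
    return -ll

# Illustrative call on a toy multiple-spell sample
spells = [[2, 4], [1], [3, 2, 5], [6]]
fit = minimize(neg_loglik, x0=np.zeros(5), args=(spells,), method="BFGS")
print(fit.x)
```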
By: | Elena Andreou; Bas J.M. Werker |
Abstract: | This paper presents an alternative method to derive the limiting distribution of residual-based statistics. Our method does not impose an explicit assumption of (asymptotic) smoothness of the statistic of interest with respect to the model's parameters and, thus, is especially useful in cases where such smoothness is difficult to establish. Instead, we use a locally uniform convergence in distribution condition, which is automatically satisfied by residual-based specification test statistics. To illustrate, we derive the limiting distribution of a new functional form specification test for discrete choice models, as well as a runs-based test for conditional symmetry in dynamic volatility models. |
Keywords: | Le Cam's third lemma, Local Asymptotic Normality (LAN) |
Date: | 2010–11 |
URL: | http://d.repec.org/n?u=RePEc:ucy:cypeua:8-2010&r=ecm |
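To fix ideas on the second illustration, here is a sketch of a generic Wald-Wolfowitz runs statistic applied to the signs of standardized residuals from a simulated GARCH(1,1); this is a stand-in for the runs-based test discussed in the abstract, not the authors' statistic, and the parameter values are arbitrary.

```python
import numpy as np
from scipy.stats import norm

def runs_test(signs):
    """Wald-Wolfowitz runs test on a boolean sign sequence."""
    n1, n2 = float(signs.sum()), float((~signs).sum())
    n = n1 + n2
    runs = 1 + np.sum(signs[1:] != signs[:-1])
    mean = 1 + 2 * n1 * n2 / n
    var = 2 * n1 * n2 * (2 * n1 * n2 - n) / (n**2 * (n - 1))
    z = (runs - mean) / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))

# Standardized residuals from a simulated GARCH(1,1); in practice these would
# come from the fitted volatility model.
rng = np.random.default_rng(1)
omega, alpha, beta, n = 0.05, 0.05, 0.9, 2000
eps, sig2 = np.zeros(n), np.full(n, omega / (1 - alpha - beta))
for t in range(1, n):
    sig2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sig2[t - 1]
    eps[t] = np.sqrt(sig2[t]) * rng.standard_normal()

z, pval = runs_test(eps / np.sqrt(sig2) > 0)
print(z, pval)
```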
By: | Miguel J. Marín; M. Teresa Rodríguez-Bernal |
Abstract: | Multiple testing analysis based on clustering methodologies is usually applied in microarray data analysis for comparisons between pairs of groups. In this paper, we generalize this methodology to deal with multiple comparisons among more than two groups obtained from microarray expressions of genes. Assuming normal data, we define a statistic which depends on sample means and sample variances and is distributed as a non-central t-distribution. As we consider multiple comparisons among groups, a mixture of non-central t-distributions is derived. The components of the mixture are estimated via a Bayesian approach, and the model is applied to a multiple comparison problem from a microarray experiment on gorilla, bonobo and human cultured fibroblasts. |
Keywords: | Clustering, MCMC computation, Microarray analysis, Mixture distributions, Multiple hypothesis testing, Non-central t-distribution |
Date: | 2010–11 |
URL: | http://d.repec.org/n?u=RePEc:cte:wsrepe:ws104427&r=ecm |
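As a rough illustration of the inputs to such a mixture model, the sketch below computes per-gene Welch-type statistics for all pairwise group comparisons; under a mean shift such a statistic is approximately non-central t. The group labels, sample sizes and simulated expression matrix are hypothetical, and the actual statistic and Bayesian mixture estimation in the paper differ in detail.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)
# Illustrative expression matrix: 100 genes, three groups of 5 arrays each
groups = {"gorilla": rng.normal(0, 1, (100, 5)),
          "bonobo":  rng.normal(0, 1, (100, 5)),
          "human":   rng.normal(0.5, 1, (100, 5))}

def welch_stats(a, b):
    """Per-gene statistic built from sample means and variances; under a
    mean shift it is approximately non-central t."""
    ma, mb = a.mean(axis=1), b.mean(axis=1)
    va, vb = a.var(axis=1, ddof=1), b.var(axis=1, ddof=1)
    return (ma - mb) / np.sqrt(va / a.shape[1] + vb / b.shape[1])

stats = {f"{g1} vs {g2}": welch_stats(groups[g1], groups[g2])
         for g1, g2 in combinations(groups, 2)}
print({k: v[:3] for k, v in stats.items()})
```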
By: | Liu-Evans, Gareth |
Abstract: | A new methodology is presented for approximating the moments of least squares coefficient estimators in situations where endogeneity and dynamics are present. The OLS estimator is the focus here, but the method, which is valid under a simple set of smoothness and moment conditions, can be applied to related estimators. An O(T^{-1}) approximation is presented for the bias in OLS estimation of a general ARX(p) model. |
Keywords: | moment approximation; bias; finite sample |
JEL: | C13 C01 |
Date: | 2010–11–09 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:26550&r=ecm |
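The kind of O(T^{-1}) bias the paper approximates can be illustrated with a Monte Carlo of OLS in a stationary AR(1) with intercept, compared against the classic Kendall approximation -(1+3*rho)/T; this benchmark is a textbook special case, not the paper's general ARX(p) formula.

```python
import numpy as np

rng = np.random.default_rng(3)
rho, T, reps = 0.8, 50, 10000
est = np.empty(reps)
for r in range(reps):
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = rho * y[t - 1] + rng.standard_normal()
    X = np.column_stack([np.ones(T - 1), y[:-1]])      # intercept + lagged level
    est[r] = np.linalg.lstsq(X, y[1:], rcond=None)[0][1]

print("Monte Carlo bias:      ", est.mean() - rho)
print("Kendall O(1/T) approx.:", -(1 + 3 * rho) / T)
```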
By: | Elena Krasnokutskaya (Department of Economics, University of Pennsylvania) |
Abstract: | This paper investigates the empirical importance of allowing for multi-dimensional sources of unobserved heterogeneity in auction models with private information. It then develops an estimation procedure that recovers the distribution of private information in the presence of two distinct sources of unobserved heterogeneity. It is shown that this estimation procedure identifies the components of the model and produces uniformly consistent estimators of these components. The procedure is applied to data from highway procurement auctions. The results indicate that allowing for two-dimensional unobserved heterogeneity may significantly affect the estimates as well as policy-relevant instruments derived from the estimated distributions of bidders' costs. |
Keywords: | unobserved auction heterogeneity, procurement auctions, reserve price |
JEL: | L0 L1 L2 L8 M3 |
Date: | 2010–05–03 |
URL: | http://d.repec.org/n?u=RePEc:pen:papers:10-036&r=ecm |
By: | Tinkl, Fabian |
Abstract: | We generalize the results for statistical functionals given by [Fernholz, 1983] and [Serfling, 1980] to M-estimates for samples drawn from an ergodic and stationary martingale sequence. In a first step, we take advantage of some recent results on the uniform convergence of the empirical distribution given by [Adams & Nobel, 2010] to prove consistency of M-estimators, before we assume Hadamard differentiability of our estimators to prove their asymptotic normality. Further, we apply the results to the LAD estimator of [Peng & Yao, 2003] and the maximum likelihood estimator for GARCH processes to show the wide field of possible applications of this method. |
Keywords: | Hadamard differentiability, M-estimator, von Mises calculus, martingale differences, GARCH models |
Date: | 2010 |
URL: | http://d.repec.org/n?u=RePEc:zbw:iwqwdp:092010&r=ecm |
By: | Jouchi Nakajima (Department of Statistical Science, Duke University); Yasuhiro Omori (Faculty of Economics, University of Tokyo) |
Abstract: | This paper presents an empirical study of stochastic volatility (SV) models with a generalized hyperbolic (GH) skew Student's t-error distribution, which embeds both asymmetric heavy-tailedness and leverage effects for financial time series. An efficient Markov chain Monte Carlo estimation method is described and the model is fit to daily S&P500 stock returns. The practical importance of the proposed model is highlighted through model comparison based on the marginal likelihood, Value at Risk (VaR) and expected shortfall. The empirical results show that incorporating leverage and asymmetric heavy-tailedness improves both the model fit and the prediction of expected shortfall. |
Date: | 2010–11 |
URL: | http://d.repec.org/n?u=RePEc:tky:jseres:2010cj228&r=ecm |
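A simulation-only sketch, assuming the usual normal variance-mean mixture representation of the GH skew Student's t (inverse-gamma mixing), of an SV model with leverage, together with empirical VaR and expected shortfall from the simulated returns. All parameter values are hypothetical, and this is not the MCMC estimation procedure described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 100000
phi, mu_h, sig_eta, rho = 0.97, -9.0, 0.15, -0.5   # illustrative SV parameters
nu, beta = 12.0, -0.4                              # skew Student's t parameters

h, ret = mu_h, np.empty(T)
for t in range(T):
    # Skew Student's t via a normal variance-mean mixture:
    # w ~ InvGamma(nu/2, nu/2), z = beta*(w - E[w]) + sqrt(w)*eps, E[w] = nu/(nu-2)
    w = 1.0 / rng.gamma(nu / 2.0, 2.0 / nu)
    eps = rng.standard_normal()
    z = beta * (w - nu / (nu - 2.0)) + np.sqrt(w) * eps
    ret[t] = np.exp(h / 2.0) * z
    # Leverage: volatility innovation correlated with the Gaussian return shock
    eta = rho * eps + np.sqrt(1.0 - rho**2) * rng.standard_normal()
    h = mu_h + phi * (h - mu_h) + sig_eta * eta

alpha = 0.01
var_1pct = -np.quantile(ret, alpha)                     # Value at Risk
es_1pct = -ret[ret <= np.quantile(ret, alpha)].mean()   # expected shortfall
print(var_1pct, es_1pct)
```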
By: | Steve Gibbons; Henry G. Overman |
Abstract: | We argue that identification problems bedevil most applied spatial research. Spatial econometrics solves these problems by deriving estimators assuming that functional forms are known and by using model comparison techniques to let the data choose between competing specifications. We argue that in most situations of interest this, at best, achieves only very weak identification. Worse, in most cases, such an approach will simply be uninformative about the economic processes at work, rendering much applied spatial econometric research 'pointless', unless the main aim is simply description of the data. We advocate an alternative approach based on the 'experimental paradigm' which puts issues of identification and causality at centre stage. |
Keywords: | statistical methods, spatial, modeling |
JEL: | C1 C12 C21 R00 R15 |
Date: | 2010–10 |
URL: | http://d.repec.org/n?u=RePEc:cep:sercdp:0061&r=ecm |
By: | Renee Fry (ANU); Adrian Pagan (QUT/UTS) |
Abstract: | The paper provides a review of the estimation of structural VARs with sign restrictions. It is shown how sign restrictions solve the parametric identification problem present in structural systems but leave the model identification problem unresolved. A market and a macro model are used to illustrate these points. Suggestions that have been made on how to find a unique model are reviewed, along with some of the difficulties that can arise in using the impulse responses found with sign restrictions. |
Keywords: | Structural Vector Autoregressions, New Keynesian Model, Sign Restrictions |
JEL: | E32 C51 C32 |
Date: | 2010–07–28 |
URL: | http://d.repec.org/n?u=RePEc:qut:auncer:2010_04&r=ecm |
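A minimal sketch of the standard sign-restriction algorithm the paper reviews: estimate a reduced-form VAR, draw Haar-distributed orthogonal rotations of a Cholesky factor of the residual covariance, and keep the impact matrices whose responses satisfy the imposed signs. The bivariate VAR(1), the simulated data and the particular restriction are illustrative assumptions.

```python
import numpy as np

def sign_restricted_impacts(y, n_draws=1000, rng=None):
    """Collect impact matrices consistent with a simple sign restriction:
    the first shock moves both variables up on impact."""
    if rng is None:
        rng = np.random.default_rng(5)
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])   # intercept + first lag
    B, *_ = np.linalg.lstsq(X, y[1:], rcond=None)        # reduced-form VAR(1) by OLS
    U = y[1:] - X @ B                                     # reduced-form residuals
    P = np.linalg.cholesky(np.cov(U.T))                   # one candidate impact matrix
    kept = []
    for _ in range(n_draws):
        Q, R = np.linalg.qr(rng.standard_normal((y.shape[1], y.shape[1])))
        Q = Q @ np.diag(np.sign(np.diag(R)))               # normalize to a Haar draw
        A0 = P @ Q
        if np.all(A0[:, 0] > 0):                           # sign restriction at impact
            kept.append(A0)
    return kept

# Illustrative use on simulated bivariate data
rng = np.random.default_rng(5)
y = rng.standard_normal((500, 2))
impacts = sign_restricted_impacts(y, rng=rng)
print(len(impacts), "admissible impact matrices")
```

The paper's point is that every retained rotation corresponds to a different structural model, so this set identification leaves the model identification problem open.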
By: | Fabio Sigrist; Werner A. Stahel |
Abstract: | Regression models for limited continuous dependent variables having a non-negligible probability of attaining exactly their limits are presented. The models differ in the number of parameters and in their flexibility. It is shown how to fit these models and they are applied to a Loss Given Default dataset from insurance to which they provide a good fit. |
Date: | 2010–11 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1011.1796&r=ecm |
By: | Jennifer L. Castle; Jurgen A. Doornik; David F. Hendry; Ragnar Nymoen |
Abstract: | The new-Keynesian Phillips curve (NKPC) includes expected future inflation as a major feedforward variable to explain current inflation. Models of this type are regularly estimated by replacing the expected value with the actual future outcome, then using Instrumental Variables or Generalized Method of Moments methods to estimate the parameters. However, the underlying theory does not allow for various forms of non-stationarity in the data, despite the fact that crises, breaks and regime shifts are relatively common. We investigate the consequences for NKPC estimation of breaks in data processes using the new technique of impulse-indicator saturation, and apply the resulting methods to salient published studies to check their viability. |
Keywords: | New Keynesian Phillips curve, inflation expectations, structural breaks, impulse-indicator saturation |
JEL: | C51 C22 |
Date: | 2010 |
URL: | http://d.repec.org/n?u=RePEc:oxf:wpaper:510&r=ecm |
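A split-half sketch of impulse-indicator saturation: an impulse dummy is added for every observation in blocks, dummies significant at a tight level are retained, and the model is re-estimated with the retained dummies. This is an illustrative simplification, not the Autometrics-based implementation used in the paper; the regressors, critical value and toy outliers are assumptions.

```python
import numpy as np
import statsmodels.api as sm

def impulse_indicator_saturation(y, X, crit=2.56):
    """Split-half impulse-indicator saturation for an OLS regression of y on X."""
    n = len(y)
    retained = []
    for block in (range(0, n // 2), range(n // 2, n)):
        D = np.zeros((n, len(block)))
        for j, t in enumerate(block):
            D[t, j] = 1.0                                  # one impulse dummy per observation
        res = sm.OLS(y, np.column_stack([X, D])).fit()
        tvals = res.tvalues[X.shape[1]:]
        retained += [t for j, t in enumerate(block) if abs(tvals[j]) > crit]
    D_final = np.zeros((n, len(retained)))
    for j, t in enumerate(retained):
        D_final[t, j] = 1.0
    return sm.OLS(y, np.column_stack([X, D_final])).fit(), retained

# Illustrative use: three outliers in an otherwise constant-mean series
rng = np.random.default_rng(6)
y = rng.standard_normal(100)
y[[20, 45, 80]] += 5.0
fit, outliers = impulse_indicator_saturation(y, np.ones((100, 1)))
print(sorted(outliers))
```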
By: | Weron, Rafal; Janczura, Joanna |
Abstract: | In this paper we discuss the calibration issues of models built on mean-reverting processes combined with Markov switching. Due to the unobservable switching mechanism, estimation of Markov regime-switching (MRS) models requires inferring not only the model parameters but also the state process values at the same time. The situation becomes more complicated when the individual regimes are independent of each other and at least one of them exhibits temporal dependence (like mean reversion in electricity spot prices). Then the temporal latency of the dynamics in the regimes has to be taken into account. In this paper we propose a method that greatly reduces the computational burden induced by the introduction of independent regimes in MRS models. We perform a simulation study to test the efficiency of the proposed method and apply it to a sample series of wholesale electricity spot prices from the German EEX market. The proposed 3-regime MRS model fits this data well and also contains unique features that allow for useful interpretations of the price dynamics. |
Keywords: | Markov regime-switching; heteroskedasticity; EM algorithm; independent regimes; electricity spot price |
JEL: | C51 C13 Q40 |
Date: | 2010–11 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:26628&r=ecm |
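A sketch of the filtering (E-step) half of EM calibration for a simple two-regime MRS model, with a mean-reverting base regime and an i.i.d. spike regime. Here the lagged observation is used in place of the latent lagged base-regime value, exactly the simplification that independent regimes make problematic and that the paper's method is designed to handle better; all parameters and the toy series are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def hamilton_filter(x, params):
    """Filtered regime probabilities for a two-regime MRS model:
    regime 1 mean-reverting, regime 2 an i.i.d. Gaussian spike regime."""
    alpha, beta, sigma1, mu2, sigma2, p11, p22 = params
    P = np.array([[p11, 1 - p11], [1 - p22, p22]])       # transition matrix
    prob = np.array([0.5, 0.5])
    filtered = np.zeros((len(x), 2))
    filtered[0] = prob
    for t in range(1, len(x)):
        dens = np.array([
            norm.pdf(x[t], loc=x[t-1] + beta * (alpha - x[t-1]), scale=sigma1),
            norm.pdf(x[t], loc=mu2, scale=sigma2),
        ])
        pred = P.T @ prob                                  # predicted regime probabilities
        post = pred * dens
        prob = post / post.sum()
        filtered[t] = prob
    return filtered

# Illustrative call on a toy price series with a single spike
rng = np.random.default_rng(7)
prices = np.concatenate([40 + rng.normal(0, 1, 60), [120.0], 40 + rng.normal(0, 1, 39)])
probs = hamilton_filter(prices, params=(40.0, 0.3, 1.0, 100.0, 30.0, 0.95, 0.5))
print(probs[58:64])
```

In a full EM routine the M-step would re-estimate the parameters from these (smoothed) probabilities and the two steps would iterate to convergence.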
By: | Bell, Peter N |
Abstract: | The new methodology to study the impact of corporate events on bonds comprises a sampling technique and a regression model. The method differs from standard approaches, motivated by the belief that event impact should be reflected in levels of yield premium. The regression tests for a change in average bond price after an event; statistical inference is based on estimates of a dummy variable. A new sampling method is described to accommodate the irregular spacing of bond trades in time. |
Keywords: | Event Study; Bonds; TRACE; ANOVA |
JEL: | G14 G3 |
Date: | 2010–11–15 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:26694&r=ecm |
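A minimal sketch of the dummy-variable test described in the abstract: regress traded bond prices on a constant and a post-event indicator, so that the dummy's t-statistic tests for a change in the average price after the event. The irregularly spaced trade times, price process and event effect are simulated assumptions, not the paper's TRACE sample.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
trade_times = np.sort(rng.uniform(-30, 30, 200))        # days relative to the event
prices = 100 + 0.5 * (trade_times > 0) + rng.normal(0, 1, 200)

X = sm.add_constant((trade_times > 0).astype(float))    # constant + post-event dummy
res = sm.OLS(prices, X).fit(cov_type="HC1")
print(res.params, res.tvalues)
```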
By: | Kevin E. Staub (Socioeconomic Institute, University of Zurich) |
Abstract: | The usual decomposition of effects in corner solution models into extensive and intensive margins is generally incompatible with a causal interpretation. This paper proposes a decomposition based on the joint distribution of potential outcomes which is meaningful in a causal sense. The difference between decompositions can be substantial and yield diametrically opposed results, as shown in a standard Tobit model example. In a generalized Tobit application exploring the effect of reducing firm entry regulation on bilateral trade flows between countries, estimates suggest that using the usual decomposition would overstate the contribution of the extensive margin by around 15%. |
Keywords: | Limited dependent variables, potential outcomes, causality, conditional-on-positives effect, Tobit, two-part model, country margins of trade |
JEL: | C24 C34 F14 |
Date: | 2010–11 |
URL: | http://d.repec.org/n?u=RePEc:soz:wpaper:1012&r=ecm |
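For reference, the "usual" decomposition the abstract contrasts with its causal alternative is the McDonald-Moffitt split of a Tobit marginal effect into intensive and extensive parts. The sketch below evaluates it at hypothetical parameter values; the paper's own potential-outcomes decomposition is not reproduced here.

```python
import numpy as np
from scipy.stats import norm

def mcdonald_moffitt(beta_j, xb, sigma):
    """Usual decomposition of the Tobit marginal effect d E[y|x] / d x_j,
    evaluated at index xb, into intensive and extensive margins."""
    z = xb / sigma
    lam = norm.pdf(z) / norm.cdf(z)                     # inverse Mills ratio
    cond_mean = xb + sigma * lam                        # E[y | x, y > 0]
    intensive = norm.cdf(z) * beta_j * (1 - lam * (z + lam))
    extensive = cond_mean * norm.pdf(z) * beta_j / sigma
    return intensive, extensive, intensive + extensive  # total equals Phi(z) * beta_j

# Illustrative values (hypothetical Tobit estimates)
print(mcdonald_moffitt(beta_j=0.8, xb=0.5, sigma=1.2))
```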
By: | Timo Mitze |
Abstract: | For spatial data with a sufficiently long time dimension, the concept of global cointegration has recently been included in the econometrics research agenda. Global cointegration arises when non-stationary time series are cointegrated both within and between spatial units. In this paper, we analyze the role of globally cointegrated variable relationships using German regional data (NUTS 1 level) for GDP, trade, and FDI activity during the period 1976–2005. Applying various homogeneous and heterogeneous panel data estimators to a Spatial Panel Error Correction Model (SpECM) for regional output growth allows us to analyze the short- and long-run impacts of internationalization activities. For the long-run cointegration equation, the empirical results support the hypothesis of export- and FDI-led growth. We also show that positive cross-regional effects are at work for export and outward FDI activity. Likewise, in the short-run SpECM specification, direct and indirect spatial externalities are found to be present. As a sensitivity analysis, we use a spatial weighting matrix based on interregional goods transport flows rather than geographical distances. This scheme allows us to address more soundly the role of positive and negative effects of trade/FDI on output activity for a system of interconnected regions. |
Keywords: | Cointegration; Spatial Durbin model; growth; trade; FDI |
JEL: | C21 C22 C23 F43 |
Date: | 2010–11 |
URL: | http://d.repec.org/n?u=RePEc:rwi:repape:0222&r=ecm |
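A stripped-down sketch of the spatial error-correction idea: build a row-standardized weight matrix, form the spatial lag of the lagged long-run residual, and estimate a pooled single-equation ECM by OLS. The weight matrix, the assumed long-run coefficient and the simulated regional panel are placeholders; the paper uses heterogeneous panel estimators and trade-flow-based weights.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
N, T = 16, 30                                   # regions x years (illustrative)
W = rng.random((N, N)) * (1 - np.eye(N))        # placeholder weights (distance or transport flows)
W /= W.sum(axis=1, keepdims=True)               # row-standardization

x = rng.standard_normal((T, N)).cumsum(axis=0)  # log exports: illustrative I(1) series
y = 0.8 * x + rng.standard_normal((T, N))       # log GDP, cointegrated with x

# Pooled ECM with a spatial lag of the lagged long-run residual:
# dy_it = a + b*dx_it + c*ect_{i,t-1} + d*(W ect)_{i,t-1} + e_it
ect = y - 0.8 * x                                # long-run residual (coefficient assumed known here)
dy, dx = np.diff(y, axis=0), np.diff(x, axis=0)
X = np.column_stack([np.ones(dy[1:].size),
                     dx[1:].ravel(),
                     ect[1:-1].ravel(),
                     (ect[1:-1] @ W.T).ravel()])
res = sm.OLS(dy[1:].ravel(), X).fit()
print(res.params)
```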