on Econometrics
By: | Bertille Antoine (Simon Fraser University); Eric Renault |
Abstract: | This paper extends the asymptotic theory of GMM inference to allow sample counterparts of the estimating equations to converge at (multiple) rates different from the usual square root of the sample size. In this setting, we provide consistent estimation of the structural parameters. In addition, we define a convenient rotation in the parameter space (or reparametrization) to disentangle the different rates of convergence. More precisely, we identify special linear combinations of the structural parameters associated with a specific rate of convergence. Finally, we demonstrate the validity of usual inference procedures, like the overidentification test and the Wald test, with standard formulas. It is important to stress that both estimation and testing work without requiring knowledge of the various rates. However, the assessment of these rates is crucial for (asymptotic) power considerations. Possible applications include econometric problems with two dimensions of asymptotics, due to trimming, tail estimation, infill asymptotics, social interactions, kernel smoothing or any kind of regularization.
Keywords: | GMM; Mixed-rates asymptotics; Kernel estimation; Rotation in the coordinate system |
JEL: | C32 C12 C13 C51 |
Date: | 2012–03 |
URL: | http://d.repec.org/n?u=RePEc:sfu:sfudps:dp12-03&r=ecm |
By: | Bertille Antoine (Simon Fraser University); Eric Renault |
Abstract: | We consider a general framework where weaker patterns of identification may arise: typically, the data generating process is allowed to depend on the sample size. However, contrary to what is usually done in the literature on weak identification, we do not give up the efficiency goal of statistical inference: even fragile information should be processed optimally for the purpose of both efficient estimation and powerful testing. Our main contribution is to consider that several patterns of identification may arise simultaneously. This heterogeneity of identification schemes paves the way for the design of optimal strategies for the inferential use of information of poor quality. More precisely, we focus on a case where the asymptotic efficiency of estimators is well-defined through the variance of asymptotically normal distributions. Standard efficient estimation procedures still hold, albeit with rates of convergence slower than usual. We stress that these are feasible without requiring prior knowledge of the identification schemes.
Keywords: | Instrumental variable; Weak instrument; GMM |
JEL: | C51 C13 C12 C32 |
Date: | 2012–03 |
URL: | http://d.repec.org/n?u=RePEc:sfu:sfudps:dp12-04&r=ecm |
By: | Eo, Yunjong |
Abstract: | I propose a Bayesian approach to making inferences about complicated patterns of structural breaks in time series. Structural break models in the literature mainly consider a simple case in which all the parameters subject to structural change are restricted to break at the same dates. Unlike the existing literature, the proposed method allows multiple parameters such as intercept, persistence, and/or residual variance to undergo mutually independent structural breaks at different dates, with different numbers of breaks across parameters. To estimate the complex structural break models considered in this paper, structural breaks in the multiple parameters are interpreted as regime transitions as in Chib (1998). The regime for each parameter is then indicated by a corresponding discrete latent variable which follows a first-order Markov process. A Markov-chain Monte Carlo scheme is developed to estimate and compare the complex structural break models, which are potentially non-nested, in an efficient and tractable way. I apply this approach to postwar U.S. inflation and find strong support for an autoregressive model with two structural breaks in residual variance and no break in intercept and persistence.
Keywords: | Inflation Dynamics; Multiple-Parameter Change-point; Structural Breaks; Bayesian Analysis |
Date: | 2012–02 |
URL: | http://d.repec.org/n?u=RePEc:syd:wpaper:2123/8149&r=ecm |
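To illustrate the data feature this model targets (not Eo's MCMC sampler itself), the following sketch simulates an AR(1) series whose intercept and residual standard deviation break at different dates; all parameter values and break dates are hypothetical.

```python
import random

random.seed(0)

# Hypothetical break dates: the intercept breaks at t = 100, the residual
# standard deviation at t = 200, independently of each other.
T = 300
mu = lambda t: 0.5 if t < 100 else 1.5     # intercept regime
sd = lambda t: 1.0 if t < 200 else 0.3     # residual-variance regime
phi = 0.6                                  # persistence, held constant here

y = [0.0]
for t in range(1, T):
    y.append(mu(t) + phi * y[-1] + random.gauss(0.0, sd(t)))
```

A change-point method that forces both parameters to break at one common date would have to compromise between t = 100 and t = 200 on data like this.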
By: | Khmaladze, E.V. (Tilburg University, Center for Economic Research) |
Abstract: | The paper presents an extension of K. Pearson's approach to testing via his chi-square statistics.
Keywords: | Components of chi-square statistics; unitary transformations; projected empirical processes; empirical processes in R^m
JEL: | C12 C14 |
Date: | 2012 |
URL: | http://d.repec.org/n?u=RePEc:dgr:kubcen:2012028&r=ecm |
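For reference, the classical statistic that the paper extends is Pearson's chi-square, a minimal computation of which is:

```python
# Pearson's chi-square statistic: sum over cells of (O - E)^2 / E,
# where O are observed counts and E the counts expected under the null.
def pearson_chi2(observed, expected):
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

obs = [18, 22, 20, 40]          # illustrative cell counts
exp = [25, 25, 25, 25]          # expected counts under a uniform null
stat = pearson_chi2(obs, exp)   # compare to a chi-square(k - 1) critical value
```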
By: | Chalabi, Yohan; Scott, David J.; Wuertz, Diethelm
Abstract: | The generalized lambda distribution (GLD) is a versatile distribution that can accommodate a wide range of shapes, including fat-tailed and asymmetric distributions. It is defined by its quantile function. We introduce a more intuitive parameterization of the GLD that expresses the location and scale parameters directly as the median and inter-quartile range of the distribution. The remaining two shape parameters characterize the asymmetry and steepness of the distribution, respectively. This contrasts with previous parameterizations, where the asymmetry and steepness are described by a combination of the two tail indices. The estimation of the GLD parameters is notoriously difficult. With our parameterization, the fitting of the GLD to empirical data can be reduced to a two-parameter estimation problem, with the location and scale parameters estimated by their robust sample estimators. This approach also works when the moments of the GLD do not exist. Moreover, the new parameterization can be used to compare data sets in a convenient asymmetry and steepness shape plot. In this paper, we derive the new formulation, as well as the conditions of the various distribution shape regions and moment conditions. We illustrate the use of the asymmetry and steepness shape plot by comparing equities from the NASDAQ-100 stock index.
Keywords: | Quantile distributions; generalized lambda distribution; shape plot representation |
JEL: | C16 |
Date: | 2012 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:37814&r=ecm |
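A rough sketch of the idea of anchoring the GLD's location and scale at the median and inter-quartile range: the tail terms below follow an FKML-style quantile form, and the exact formulas in the paper may differ, so treat this as an illustrative re-anchoring rather than the authors' parameterization.

```python
import math

def gld_quantile(u, median, iqr, lam3, lam4):
    """GLD-style quantile function anchored so that, by construction,
    Q(0.5) = median and Q(0.75) - Q(0.25) = iqr.
    lam3 and lam4 are the two tail/shape parameters (FKML-style form;
    the paper's CSW parameterization may differ)."""
    def s(p):  # standard shape part of the quantile function
        a = math.log(p) if lam3 == 0 else (p ** lam3 - 1.0) / lam3
        b = math.log(1.0 - p) if lam4 == 0 else ((1.0 - p) ** lam4 - 1.0) / lam4
        return a - b
    scale = iqr / (s(0.75) - s(0.25))
    return median + scale * (s(u) - s(0.5))
```

Because the location and scale enter as the median and IQR, they can be estimated directly by their robust sample counterparts, leaving only the two shape parameters to fit.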
By: | Georgiadis, Georgios |
Abstract: | In the panel conditionally homogeneous vector autoregressive model, the cross-sectional units' dynamics are generally heterogeneous, but homogeneous if units share the same structural characteristics. The model thus makes it possible (i) to account for heterogeneity in dynamic panel data sets, (ii) to nevertheless exploit the panel nature of the data, and (iii) to analyze the relationship between the units' observed heterogeneities and their structural characteristics. I show how standard least squares estimation can be applied, how impulse responses can be computed, how multivariate conditioning is implemented, and how polynomial order restrictions can be incorporated. Finally, I present an easy-to-use Matlab routine which can be used to estimate the panel conditionally homogeneous vector autoregressive model and produce impulse responses as well as forecast error variance decompositions.
Keywords: | Panel VAR; Heterogeneity; Conditional Pooling |
JEL: | C51 C33 |
Date: | 2012 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:37755&r=ecm |
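The pooling idea — common coefficients within a group of units sharing a characteristic, heterogeneous coefficients across groups — can be sketched with a scalar AR(1) in place of the full VAR. All names and data below are hypothetical; this is not the paper's Matlab routine.

```python
from collections import defaultdict

def pooled_ar1(y, group):
    """OLS slope of y_t on y_{t-1}, pooled across units that share a
    characteristic. `y` maps unit -> time series; `group` maps unit ->
    characteristic value. A scalar stand-in for conditional pooling."""
    num, den = defaultdict(float), defaultdict(float)
    for unit, series in y.items():
        g = group[unit]
        for t in range(1, len(series)):
            num[g] += series[t - 1] * series[t]
            den[g] += series[t - 1] ** 2
    return {g: num[g] / den[g] for g in num}

# Deterministic illustration: units "a", "b" share phi = 0.8; unit "c" has 0.3.
def ar_path(phi, n=6, start=1.0):
    s = [start]
    for _ in range(n - 1):
        s.append(phi * s[-1])
    return s

coeffs = pooled_ar1(
    {"a": ar_path(0.8), "b": ar_path(0.8, start=2.0), "c": ar_path(0.3)},
    {"a": "high", "b": "high", "c": "low"},
)
```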
By: | Yuta Kurose (Graduate School of Economics, University of Tokyo); Yasuhiro Omori (Faculty of Economics, University of Tokyo) |
Abstract: | A smoothing spline is used to propose a novel model for the time-varying quantile of a univariate time series within a state space framework. A correlation is further incorporated between the dependent variable and its one-step-ahead quantile. Using a Bayesian approach, an efficient Markov chain Monte Carlo algorithm is described in which we use a multi-move sampler that generates the latent time-varying quantiles simultaneously. Numerical examples show its high sampling efficiency in comparison with a simple algorithm that generates one latent quantile at a time given the other latent quantiles. Furthermore, an empirical analysis of Japanese inflation rate data is provided, together with model comparison.
Date: | 2012–03 |
URL: | http://d.repec.org/n?u=RePEc:tky:fseres:2012cf845&r=ecm |
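As a hedged stand-in for what "tracking a time-varying quantile" means (the paper's Bayesian smoothing-spline MCMC is far more sophisticated), here is a simple stochastic-gradient recursion on the quantile check loss; the step size is an arbitrary choice.

```python
import random

random.seed(1)

def track_quantile(y, tau, step=0.05):
    """Online estimate q_t of the tau-quantile: q_t moves up by step*tau when
    y_t >= q_t and down by step*(1 - tau) otherwise. This is the gradient of
    the check loss -- an illustration, not the paper's state-space model."""
    q, path = 0.0, []
    for obs in y:
        q += step * (tau - (1.0 if obs < q else 0.0))
        path.append(q)
    return path

data = [random.gauss(0.0, 1.0) for _ in range(5000)]
path = track_quantile(data, tau=0.9)  # should settle near the N(0,1) 0.9-quantile
```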
By: | Gerd Ronning |
Abstract: | Empirical research using micro data via remote access has recently been advocated by statistical offices, since confidentiality is more easily ensured under this approach. However, disclosure of single values and units cannot be completely avoided. Binary regressors (dummy variables) bear a high risk of disclosure, especially if their interactions are considered, as is done by definition in saturated models. However, contrary to views expressed in earlier publications, the risk exists only if, besides parameter estimates, predicted values are also reported to the researcher. The paper considers saturated specifications of the most popular linear and nonlinear microeconometric models and shows that in all cases the disclosure risk is high if some design points are represented by a (very) small number of observations. For two of the models not belonging to the exponential family (the probit model and the negative binomial regression model) we show that the same estimates of the conditional expectations arise even though the parameter estimates are defined by a modified equation. In the last section we draw attention to the fact that interactions of binary regressors can be used to construct "strategic dummy variables", which lead to high disclosure risk as shown, for example, in Bleninger et al. (2010) for the linear model. In this paper we extend the analysis to the set of established nonlinear models, in particular logit, probit and count data models.
Keywords: | logit model; probit model; Poisson regression; negative binomial regression model; strategic dummy variable; tabular data
Date: | 2011–06 |
URL: | http://d.repec.org/n?u=RePEc:iaw:iawdip:72&r=ecm |
By: | Peter Reinhard Hansen; Allan Timmermann
Abstract: | Out-of-sample tests of forecast performance depend on how a given data set is split into estimation and evaluation periods, yet no guidance exists on how to choose the split point. Empirical forecast evaluation results can therefore be difficult to interpret, particularly when several values of the split point might have been considered. When the sample split is viewed as a choice variable, rather than being fixed ex ante, we show that very large size distortions can occur for conventional tests of predictive accuracy. Spurious rejections are most likely to occur with a short evaluation sample, while conversely the power of forecast evaluation tests is strongest with long out-of-sample periods. To deal with size distortions, we propose a test statistic that is robust to the effect of considering multiple sample split points. Empirical applications to predictability of stock returns and inflation demonstrate that out-of-sample forecast evaluation results can critically depend on how the sample split is determined.
JEL: | C12 C53
Date: | 2012 |
URL: | http://d.repec.org/n?u=RePEc:eui:euiwps:eco2012/10&r=ecm |
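A toy illustration of the split-point dependence the paper addresses: the out-of-sample MSE of even a trivial recursive-mean forecast varies with where the sample is split. This is only the phenomenon, not Hansen and Timmermann's robust test statistic.

```python
import random
import statistics

random.seed(7)
y = [random.gauss(0.0, 1.0) for _ in range(400)]  # simulated series

def oos_mse(series, split):
    """Out-of-sample MSE of the recursively estimated mean forecast,
    evaluated on observations after `split`."""
    errs = []
    for t in range(split, len(series)):
        forecast = statistics.fmean(series[:t])   # estimated on data up to t-1
        errs.append((series[t] - forecast) ** 2)
    return statistics.fmean(errs)

# The evaluation outcome varies with the chosen split point; searching over
# split points and reporting the most favorable one induces size distortions.
mses = {split: oos_mse(y, split) for split in (50, 100, 200, 300)}
```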
By: | Mardi Dungey; Jan PAM Jacobs; Jing Tian; Simon van Norden |
Abstract: | This paper places the data revision model of Jacobs and van Norden (2011) within a class of trend-cycle decompositions relating directly to the Beveridge-Nelson decomposition. In both approaches, identifying restrictions on the covariance matrix may, under simple and realistic conditions, produce a smoothed estimate of the underlying series that is more volatile than the observed series.
JEL: | C22 C53 C82 |
Date: | 2012–03 |
URL: | http://d.repec.org/n?u=RePEc:acb:camaaa:2012-16&r=ecm |
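For reference, the Beveridge-Nelson decomposition to which the paper relates its model class has a closed form in the simplest case, where the first difference of the series is AR(1); the sketch below implements that textbook case, not the paper's data revision model.

```python
def bn_trend(y, phi, mu):
    """Beveridge-Nelson permanent component when the first difference of y
    follows an AR(1): dy_t = mu + phi * (dy_{t-1} - mu) + eps_t.
    Then trend_t = y_t + phi / (1 - phi) * (dy_t - mu), and the cycle is
    y_t - trend_t."""
    trend = [y[0]]  # dy_0 is unobserved, so anchor the trend at y[0]
    for t in range(1, len(y)):
        dy = y[t] - y[t - 1]
        trend.append(y[t] + phi / (1 - phi) * (dy - mu))
    return trend

levels = [0.0, 1.0, 3.0, 4.0]               # illustrative levels
trend = bn_trend(levels, phi=0.5, mu=1.0)   # here phi/(1-phi) = 1
```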
By: | Coroneo, Laura (University of Manchester, Economics - School of Social Sciences); Corradi, Valentina (University of Warwick, Department of Economics); Santos Monteiro, Paulo (University of Warwick, Department of Economics) |
Abstract: | The specification of an optimizing model of the monetary transmission mechanism requires selecting a policy regime, commonly commitment or discretion. In this paper we propose a new procedure for testing optimal monetary policy, relying on moment inequalities that nest commitment and discretion as two special cases. The approach is based on the derivation of bounds for inflation that are consistent with optimal policy under either policy regime. We derive testable implications that allow for specification tests and discrimination between the two alternative regimes. The proposed procedure is implemented to examine the conduct of monetary policy in the United States economy.
Keywords: | Bootstrap; GMM; Moment Inequalities; Optimal Monetary Policy
JEL: | C12 C52 E52 E58
Date: | 2012 |
URL: | http://d.repec.org/n?u=RePEc:wrk:warwec:985&r=ecm |
By: | Klein, Ingo; Ardelean, Vlad |
Abstract: | Li, Fang & Tian (1994) assert that special quasi-linear means should be preferred to the simple arithmetic mean for their robustness properties. The strategy used to show robustness, however, is completely detached from the concepts well known in the theory of robust statistics. Robustness of estimators can be verified with tools from robust statistics, e.g. the influence function or the breakdown point. On the other hand, it seems that robust statistics has not been interested in quasi-linear means. Therefore, we compute influence functions and breakdown points for quasi-linear means and show that these means are not robust in the sense of robust statistics if the generator is unbounded. As special cases we consider the Laspeyres, Paasche and Fisher indices.
Keywords: | quasi-linear mean; robustness; influence function; breakdown point; Laspeyres index; Paasche index; Fisher index
Date: | 2012 |
URL: | http://d.repec.org/n?u=RePEc:zbw:faucse:882010&r=ecm |
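A minimal sketch of the object under study: a quasi-linear mean is M_g(x) = g^{-1}(mean of g(x_i)), and when the generator g is unbounded (as with g = log below), a single outlier can move the mean without bound, which is the non-robustness the paper formalizes via influence functions and breakdown points.

```python
import math

def quasi_linear_mean(x, g, g_inv):
    """Quasi-linear mean M_g(x) = g^{-1}( (1/n) * sum_i g(x_i) ).
    g = log, g_inv = exp gives the geometric mean."""
    return g_inv(sum(g(v) for v in x) / len(x))

sample = [1.0, 4.0, 16.0]
geo = quasi_linear_mean(sample, math.log, math.exp)   # geometric mean

# An unbounded generator transmits a single outlier without bound:
polluted = quasi_linear_mean(sample[:-1] + [1e12], math.log, math.exp)
```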
By: | G.M. Gallo; Edoardo Otranto |
Abstract: | The empirical evidence on the dynamics of high-frequency-based measures of volatility is that they exhibit persistence and, at times, abrupt changes in average level by subperiod. Over the past ten years, this pattern has a clear interpretation in reference to the dot-com bubble, the quiet period of credit expansion, and then the harsh times after the burst of the subprime mortgage crisis. We conjecture that the inadequacy of many econometric volatility models (a very high level of estimated persistence, serially correlated residuals) can be solved with an adequate representation of such a pattern. We insert Markovian dynamics in a Multiplicative Error Model to represent the conditional expectation of realized volatility, allowing us to address the issues of a slowly moving average level of volatility and of different dynamics across regimes. We apply the model to the realized volatility of the S&P500 index and gauge the usefulness of such an approach by a more interpretable persistence, better residual properties, and an increased goodness of fit.
Keywords: | MEM models; regime switching; realized volatility; volatility persistence |
JEL: | C22 C24 |
Date: | 2012 |
URL: | http://d.repec.org/n?u=RePEc:cns:cnscwp:201205&r=ecm |
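The structure being proposed — a Multiplicative Error Model whose intercept, and hence average volatility level, is driven by a Markov regime — can be sketched as a simulation; all parameter values here are hypothetical and the error distribution (unit-mean exponential) is one convenient choice, not necessarily the authors'.

```python
import random

random.seed(3)

def simulate_ms_mem(T, omega=(0.05, 0.5), p_stay=0.98, alpha=0.2, beta=0.6):
    """Two-regime Multiplicative Error Model for volatility, x_t = mu_t * eps_t,
    with mu_t = omega[s_t] + alpha * x_{t-1} + beta * mu_{t-1}: the Markov
    regime s_t shifts the average volatility level, alpha/beta the persistence."""
    s = 0
    mu = x_prev = omega[0] / (1.0 - alpha - beta)   # start at regime-0 mean
    vols, states = [], []
    for _ in range(T):
        if random.random() > p_stay:                # Markov regime switch
            s = 1 - s
        mu = omega[s] + alpha * x_prev + beta * mu
        x_prev = mu * random.expovariate(1.0)       # positive, unit-mean error
        vols.append(x_prev)
        states.append(s)
    return vols, states

vols, states = simulate_ms_mem(2000)
```

Fitting a single-regime MEM to such data would attribute the regime shifts in the level to near-unit persistence, which is the misattribution the paper addresses.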
By: | Li Su (MRC Biostatistics Unit, Cambridge, UK); Sarah Brown (Department of Economics, The University of Sheffield); Pulak Ghosh (Department of Quantitative Methods and Information Systems, Indian Institute of Management at Bangalore, India); Karl Taylor (Department of Economics, The University of Sheffield) |
Abstract: | In this paper, we contribute to the empirical literature on household finances by introducing a Bayesian bivariate two-part model. With correlated random effects, the proposed approach allows for potential interdependence between the holding of assets and debt at the household level. It also encompasses a two-part process that allows the influences of the independent variables to differ between the decision to hold debt or assets and the amount of debt or assets held. Finally, we incorporate joint modelling of household size into the framework to allow for the fact that the debt and asset information is collected at the household level, so household size may be strongly correlated with household debt and assets. Our findings endorse our joint modelling approach and, furthermore, confirm that certain explanatory variables exert different influences on the binary and continuous parts of the model.
Keywords: | Assets; Bayesian approach; bridge distribution; debt; two-part model
JEL: | C11 C33 D14 |
Date: | 2012 |
URL: | http://d.repec.org/n?u=RePEc:shf:wpaper:2012009&r=ecm |
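The two-part structure with a shared household effect can be illustrated by simulation: one equation for whether debt is held, another for the (log) amount if held, linked by a common random effect. This is a data-generating sketch with hypothetical coefficients, not the authors' bridge-distribution Bayesian estimator.

```python
import math
import random

random.seed(11)

def simulate_two_part(n_households):
    """Part 1: probability of holding debt (logit); part 2: log-normal amount
    if held. A common household effect b enters both parts, inducing the
    interdependence that the correlated random effects capture."""
    data = []
    for _ in range(n_households):
        b = random.gauss(0.0, 1.0)                       # household effect
        p_hold = 1.0 / (1.0 + math.exp(-(-0.5 + b)))     # part 1: logit
        holds = random.random() < p_hold
        # part 2: amount only observed when the household holds debt
        amount = math.exp(1.0 + 0.8 * b + random.gauss(0.0, 0.5)) if holds else 0.0
        data.append((holds, amount))
    return data

sample = simulate_two_part(5000)
share_holding = sum(h for h, _ in sample) / len(sample)
```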
By: | Eo, Yunjong; Kim, Chang-Jin |
Abstract: | In this paper, we relax the assumption of constant regime-specific mean growth rates in Hamilton's (1989) two-state Markov-switching model of the business cycle. We first present a benchmark model, in which each regime-specific mean growth rate evolves according to a random walk process over different episodes of booms or recessions. We then present a model with vector error correction dynamics for the regime-specific mean growth rates, by deriving and imposing a condition for the existence of a long-run equilibrium growth rate for real output. In the Bayesian Markov Chain Monte Carlo (MCMC) approach developed in this paper, the counterfactual priors, as well as the hierarchical priors for the regime-specific parameters, play critical roles. By applying the proposed model and approach to the postwar real GDP growth data (1947Q4-2011Q3), we uncover the evolving nature of the regime-specific mean growth rates of real output in the U.S. business cycle. An additional feature of the postwar U.S. business cycle that we uncover is a steady decline in the long-run equilibrium output growth. The decline started in the mid-1950s and ended in the mid-1980s, coinciding with the beginning of the Great Moderation. Our empirical results also provide partial, if not decisive, evidence that the central bank has been more successful in restoring the economy to its long-run equilibrium growth path after unusually severe recessions than after unusually good booms.
Keywords: | State-Space Model; MCMC; Hamilton Model; Markov Switching; Hierarchical Prior; Evolving Regime-Specific Parameters; Counterfactual Prior; Business Cycle; Bayesian Approach
Date: | 2012–02 |
URL: | http://d.repec.org/n?u=RePEc:syd:wpaper:2123/8150&r=ecm |
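The benchmark idea — each new boom or recession episode draws its mean growth rate as a random-walk step from the previous same-regime episode — can be sketched as a simulation; episode lengths, starting means, and innovation sizes below are all hypothetical (the paper estimates these objects by MCMC rather than fixing them).

```python
import random

random.seed(5)

def simulate_evolving_ms(n_episodes, episode_len=20):
    """Alternating boom/recession episodes. Each episode's regime-specific
    mean growth rate equals the previous same-regime mean plus a random-walk
    innovation, relaxing Hamilton's constant-mean assumption."""
    means = {"boom": 1.0, "recession": -0.5}      # hypothetical starting values
    growth, regimes = [], []
    regime = "boom"
    for _ in range(n_episodes):
        means[regime] += random.gauss(0.0, 0.1)   # mean evolves across episodes
        for _ in range(episode_len):
            growth.append(means[regime] + random.gauss(0.0, 0.3))
            regimes.append(regime)
        regime = "recession" if regime == "boom" else "boom"
    return growth, regimes

growth, regimes = simulate_evolving_ms(6)
```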