nep-ecm New Economics Papers
on Econometrics
Issue of 2011‒10‒22
fifteen papers chosen by
Sune Karlsson
Örebro University

  1. Maximum Likelihood Estimation and Uniform Inference with Sporadic Identification Failure By Donald W. K. Andrews; Xu Cheng
  2. Multivariate trend comparisons between autocorrelated climate series with general trend regressors By Ross McKitrick; Timothy Vogelsang
  3. Parametric Conditional Monte Carlo Density Estimation By Yin Liao; John Stachurski
  4. Incorporating theoretical restrictions into forecasting by projection methods By Giacomini, Raffaella; Ragusa, Giuseppe
  5. On the finite-sample properties of conditional empirical likelihood estimators By Crudu, Federico; Sándor, Zsolt
  6. Identification and Inference with Many Invalid Instruments By Michal Kolesár; Raj Chetty; John N. Friedman; Edward L. Glaeser; Guido W. Imbens
  7. The Hodrick-Prescott filter with priors: linear restrictions on HP filters By Julio, Juan Manuel
  8. An Order-Theoretic Mixing Condition for Monotone Markov Chains By Takashi Kamihigashi; John Stachurski
  9. A Goodness of Fit Test for Ergodic Markov Processes By Vance Martin; Yoshihiko Nishiyama; John Stachurski
  10. Markov Switching Models in Empirical Finance By Massimo Guidolin
  11. Generalized Look-Ahead Methods for Computing Stationary Densities By R. Anton Braun; Huiyu Li; John Stachurski
  12. On Bartlett Correctability of Empirical Likelihood in Generalized Power Divergence Family By Lorenzo Camponovo; Taisuke Otsu
  13. Stability of Stationary Distributions in Monotone Economies By Takashi Kamihigashi; John Stachurski
  14. Modeling Financial Crises Mutation By Elena-Ivona Dumitrescu; Bertrand Candelon; Christophe Hurlin; Franz C. Palm
  15. From Wald to Savage: homo economicus becomes a Bayesian statistician By Giocoli, Nicola

  1. By: Donald W. K. Andrews (Cowles Foundation, Yale University); Xu Cheng (Dept. of Economics, University of Pennsylvania)
    Abstract: This paper analyzes the properties of a class of estimators, tests, and confidence sets (CS's) when the parameters are not identified in parts of the parameter space. Specifically, we consider estimator criterion functions that are sample averages and are smooth functions of a parameter theta. This includes log likelihood, quasi-log likelihood, and least squares criterion functions. We determine the asymptotic distributions of estimators under lack of identification and under weak, semi-strong, and strong identification. We determine the asymptotic size (in a uniform sense) of standard t and quasi-likelihood ratio (QLR) tests and CS's. We provide methods of constructing QLR tests and CS's that are robust to the strength of identification. The results are applied to two examples: a nonlinear binary choice model and the smooth transition threshold autoregressive (STAR) model.
    Keywords: Asymptotic size, Binary choice, Confidence set, Estimator, Identification, Likelihood, Nonlinear models, Test, Smooth transition threshold autoregression, Weak identification
    JEL: C12 C15
    Date: 2011–10
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1824&r=ecm
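    Illustration: a stylized model (an assumption for exposition, not drawn from the paper) in which the identification of one parameter hinges on the value of another, the situation the uniform results above are designed to handle:

      \[ Y_i = \zeta' Z_i + \beta \, h(X_i, \pi) + U_i , \]

      Here \pi enters only through the product \beta\,h(X_i,\pi), so \pi is unidentified when \beta = 0, weakly identified when \beta is local to zero (e.g. \beta_n = b/\sqrt{n}), and strongly identified when \beta is fixed and nonzero.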
  2. By: Ross McKitrick (Department of Economics and Finance, University of Guelph); Timothy Vogelsang (Department of Economics, Michigan State University)
    Abstract: Inference regarding trends in climatic data series, including comparisons across different data sets as well as univariate trend significance tests, is complicated by the presence of serial correlation and step-changes in the mean. We review recent developments in the estimation of heteroskedasticity and autocorrelation robust (HAC) covariance estimators as they have been applied to linear trend inference, with a focus on the Vogelsang-Franses (2005) nonparametric approach, which provides a unified framework for trend covariance estimation robust to unknown forms of autocorrelation up to but not including unit roots, making it especially useful for climatic data applications. We extend the Vogelsang-Franses approach to allow general deterministic regressors, including the case where a step-change in the mean occurs at a known date. Additional regressors change the critical values of the Vogelsang-Franses statistic. We derive an asymptotic approximation that can be used to simulate critical values. We also outline a simple bootstrap procedure that generates valid critical values and p-values. The motivation for extending the Vogelsang-Franses approach is an application that compares climate-model-generated and observational global temperature data in the tropical lower- and mid-troposphere from 1958 to 2010. Inclusion of a mean shift regressor to capture the Pacific Climate Shift of 1977 causes apparently significant observed trends to become statistically insignificant, and the equivalence between model-generated and observed trends is rejected at much smaller significance levels (i.e. is more strongly rejected).
    Keywords: Autocorrelation; trend estimation; HAC variance matrix; global warming; model comparisons
    JEL: C14 C32 C52 Q54
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:gue:guelph:2011-09.&r=ecm
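    Illustration: a minimal sketch (in Python) of a trend regression with a known-date mean-shift regressor and conventional HAC standard errors. It shows the regression setup only; the Vogelsang-Franses statistic discussed above requires its own fixed-b critical values rather than standard HAC t-statistics, and the series below is simulated, not the temperature data used in the paper.

      # Trend regression with a known-date mean-shift regressor and HAC (Newey-West)
      # standard errors; the data below are simulated AR(1) noise around a trend.
      import numpy as np
      import statsmodels.api as sm

      def trend_with_break(y, break_index, maxlags=12):
          """OLS of y on [constant, linear trend, step dummy] with HAC covariance."""
          t = np.arange(len(y))
          step = (t >= break_index).astype(float)      # mean shift at a known date
          X = sm.add_constant(np.column_stack([t, step]))
          return sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": maxlags})

      # Simulated monthly series, 1958-2010, with a level shift at the 1977 position
      rng = np.random.default_rng(0)
      n, rho = 53 * 12, 0.7
      e = np.zeros(n)
      for i in range(1, n):
          e[i] = rho * e[i - 1] + rng.normal(scale=0.1)
      y = 0.001 * np.arange(n) + 0.2 * (np.arange(n) >= 19 * 12) + e
      print(trend_with_break(y, break_index=19 * 12).summary())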
  3. By: Yin Liao; John Stachurski
    Abstract: In applied density estimation problems, one often has data not only on the target variable, but also on a collection of covariates. In this paper, we study a density estimator that incorporates this additional information by combining parametric estimation and conditional Monte Carlo. We prove an approximate functional asymptotic normality result that illustrates convergence rates and the asymptotic variance of the estimator. Through simulation, we illustrate the strength of its finite sample properties in a number of standard econometric and financial applications.
    Date: 2011–10
    URL: http://d.repec.org/n?u=RePEc:acb:cbeeco:2011-562&r=ecm
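    Illustration: a minimal sketch of the conditional Monte Carlo idea, assuming a Gaussian linear conditional model (an illustrative choice, not the specification used in the paper): fit f(y|x; theta) parametrically and average the fitted conditional density over the observed covariates.

      # Conditional Monte Carlo density estimate: fit a parametric conditional
      # density f(y | x; theta) and average it over the observed covariates,
      #   f_hat(y) = (1/n) * sum_i f(y | x_i; theta_hat).
      import numpy as np
      from scipy.stats import norm

      def cmc_density(y_grid, y, X):
          """Averaged Gaussian-linear conditional density evaluated on y_grid."""
          X1 = np.column_stack([np.ones(len(y)), X])        # add an intercept
          beta, *_ = np.linalg.lstsq(X1, y, rcond=None)     # OLS fit of y on x
          resid = y - X1 @ beta
          sigma = resid.std(ddof=X1.shape[1])
          mu = X1 @ beta                                    # fitted conditional means
          # average the conditional N(mu_i, sigma^2) densities over the sample
          return norm.pdf(y_grid[:, None], loc=mu, scale=sigma).mean(axis=1)

      # Example: the covariate information sharpens the estimate of the density of y
      rng = np.random.default_rng(1)
      x = rng.normal(size=500)
      y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=500)
      grid = np.linspace(y.min() - 1, y.max() + 1, 200)
      density = cmc_density(grid, y, x[:, None])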
  4. By: Giacomini, Raffaella; Ragusa, Giuseppe
    Abstract: We propose a method for modifying a given density forecast in a way that incorporates the information contained in theory-based moment conditions. An example is "improving" the forecasts from atheoretical econometric models, such as factor models or Bayesian VARs, by ensuring that they satisfy theoretical restrictions given for example by Euler equations or Taylor rules. The method yields a new density (and thus point-) forecast which has a simple and convenient analytical expression and which by construction satisfies the theoretical restrictions. The method is flexible and can be used in the realistic situation in which economic theory does not specify a likelihood for the variables of interest, and thus cannot be readily used for forecasting.
    Keywords: Bayesian VAR; Euler conditions; Exponential tilting; Forecast comparisons
    JEL: C53
    Date: 2011–10
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:8604&r=ecm
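    Illustration: a sketch of exponential tilting, the device named in the keywords, applied to a density forecast represented by Monte Carlo draws. The moment restriction used here (a zero mean) and the Monte Carlo representation are illustrative assumptions; the paper derives an analytical expression for the tilted density.

      # Exponentially tilt draws from a base density forecast so that a moment
      # restriction E[g(Y)] = 0 holds under the reweighted (tilted) forecast.
      import numpy as np
      from scipy.optimize import minimize

      def exponential_tilt(draws, g):
          """Weights w_i on the draws, proportional to exp(lambda' g(y_i)),
          with lambda chosen so that sum_i w_i * g(y_i) = 0."""
          G = np.asarray([np.atleast_1d(g(y)) for y in draws], dtype=float)  # (n, k)
          # Dual problem: minimize the log of the empirical moment generating function;
          # its gradient is the tilted mean of g, so the minimizer sets it to zero.
          def dual(lam):
              return np.log(np.mean(np.exp(G @ lam)))
          lam = minimize(dual, np.zeros(G.shape[1]), method="BFGS").x
          w = np.exp(G @ lam)
          return w / w.sum()

      # Example: tilt a N(0.3, 1) base forecast towards the restriction E[Y] = 0
      rng = np.random.default_rng(2)
      draws = rng.normal(loc=0.3, scale=1.0, size=5000)
      w = exponential_tilt(draws, g=lambda y: y)
      print(np.sum(w * draws))     # approximately zero after tilting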
  5. By: Crudu, Federico; Sándor, Zsolt
    Abstract: We provide Monte Carlo evidence on the finite-sample behavior of the conditional empirical likelihood (CEL) estimator of Kitamura, Tripathi, and Ahn (2004) and the conditional Euclidean empirical likelihood (CEEL) estimator of Antoine, Bonnal, and Renault (2007) in the context of a heteroskedastic linear model with an endogenous regressor. We compare these estimators with three heteroskedasticity-consistent instrument-based estimators in terms of various performance measures. Our results suggest that the CEL and CEEL estimators with fixed bandwidths may suffer from the no-moment problem, similarly to the unconditional generalized empirical likelihood estimators studied by Guggenberger (2008). We also study the CEL and CEEL estimators with automatic bandwidths selected through cross-validation, and find no evidence that these suffer from the no-moment problem. When the instruments are weak, we find CEL and CEEL to have poorer finite-sample properties (in terms of mean squared error and coverage probability of confidence intervals) than the heteroskedasticity-consistent Fuller (HFUL) estimator. In the strong-instruments case the CEL and CEEL estimators with automatic bandwidths tend to outperform HFUL in terms of mean squared error, while the reverse holds in terms of the coverage probability, although the differences in numerical performance are rather small.
    Keywords: Conditional empirical likelihood; conditional Euclidean likelihood; heteroskedasticity; weak instruments; cross-validation
    JEL: C13 C14 C15 C30
    Date: 2011–09–23
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:34116&r=ecm
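    Illustration: a sketch of the kind of simulation design described above: a linear model with one endogenous regressor, instruments whose strength can be dialed down, and conditional heteroskedasticity. The functional forms and parameter values are illustrative assumptions, not the authors' exact design.

      # Simulate a heteroskedastic linear model with one endogenous regressor and
      # instruments whose strength is governed by pi (small pi = weak instruments).
      import numpy as np

      def simulate(n=200, n_instruments=3, pi=0.1, rho=0.5, seed=0):
          rng = np.random.default_rng(seed)
          Z = rng.normal(size=(n, n_instruments))                   # instruments
          u = rng.normal(size=n)                                    # structural error
          v = rho * u + np.sqrt(1 - rho**2) * rng.normal(size=n)    # first-stage error
          x = Z @ (pi * np.ones(n_instruments)) + v                 # endogenous regressor
          scale = np.sqrt(1 + Z[:, 0] ** 2)                         # conditional heteroskedasticity
          y = 1.0 + 0.5 * x + scale * u                             # structural equation
          return y, x, Z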
  6. By: Michal Kolesár; Raj Chetty; John N. Friedman; Edward L. Glaeser; Guido W. Imbens
    Abstract: We analyze linear models with a single endogenous regressor in the presence of many instrumental variables. We weaken a key assumption typically made in this literature by allowing all the instruments to have direct effects on the outcome. We consider restrictions on these direct effects that allow for point identification of the effect of interest. The setup leads to new insights concerning the properties of conventional estimators, novel identification strategies, and new estimators to exploit those strategies. A key assumption underlying the main identification strategy is that the product of the direct effects of the instruments on the outcome and the effects of the instruments on the endogenous regressor has expectation zero. We argue in the context of two specific examples with a group structure that this assumption has substantive content.
    JEL: C01 C2
    Date: 2011–10
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:17519&r=ecm
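    Illustration: one way to formalize the setup and the key condition described above (the notation is an assumption, not taken from the paper):

      \[ Y_i = \beta X_i + Z_i'\gamma + \varepsilon_i, \qquad X_i = Z_i'\pi + \eta_i, \]

      where the direct effects \gamma are not restricted to zero. Treating the instrument-specific coefficients as draws from some population, the condition that the product of the direct effects and the first-stage effects of the instruments has expectation zero reads E[\gamma_j \pi_j] = 0.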
  7. By: Julio, Juan Manuel
    Abstract: A closed formula for the Hodrick-Prescott (HP) filter subject to linear restrictions is derived. This filter is also known as the HP filter with priors. When the formula is applied to the ordinary HP filter, linear restrictions apply only within the sample. However, when it is applied to the extended HP filter and to extensions that correct for GDP revisions and delays, linear restrictions also apply out of sample.
    Keywords: Business Cycles, Hodrick-Prescott Filter
    JEL: E32 C22
    Date: 2011–10–19
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:34202&r=ecm
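    Illustration: a sketch of an HP trend forced to satisfy linear restrictions R·tau = r via the textbook constrained least squares construction. It illustrates the idea of restricting the filter; it is not claimed to reproduce the paper's closed formula or its treatment of GDP revisions and delays.

      # HP trend subject to linear restrictions R @ tau = r, via constrained least squares.
      import numpy as np

      def hp_restricted(y, lam=1600.0, R=None, r=None):
          """HP trend of y; if R and r are supplied, the trend satisfies R @ tau = r."""
          n = len(y)
          D = np.diff(np.eye(n), n=2, axis=0)          # (n-2, n) second-difference matrix
          A = np.eye(n) + lam * D.T @ D
          tau = np.linalg.solve(A, y)                  # ordinary HP trend
          if R is None:
              return tau
          AinvRT = np.linalg.solve(A, R.T)
          correction = AinvRT @ np.linalg.solve(R @ AinvRT, r - R @ tau)
          return tau + correction                      # restricted trend

      # Example: force the trend to pass through the final observation
      rng = np.random.default_rng(3)
      y = np.cumsum(rng.normal(size=120)) + 0.05 * np.arange(120)
      R = np.zeros((1, 120))
      R[0, -1] = 1.0
      tau = hp_restricted(y, lam=1600.0, R=R, r=np.array([y[-1]]))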
  8. By: Takashi Kamihigashi; John Stachurski
    Abstract: We discuss stability of discrete-time Markov chains satisfying monotonicity and an order-theoretic mixing condition that can be seen as an alternative to irreducibility. A chain satisfying these conditions has at most one stationary distribution. Moreover, if there is a stationary distribution, then the chain is stable in an order-theoretic sense.
    Date: 2011–10
    URL: http://d.repec.org/n?u=RePEc:acb:cbeeco:2011-559&r=ecm
  9. By: Vance Martin (Department of Economics, The University of Melbourne); Yoshihiko Nishiyama (Institute of Economic Research, Kyoto University); John Stachurski (Research School of Economics, Australian National University)
    Abstract: We introduce a goodness of fit test for ergodic Markov processes. Our test compares the data against the set of stationary densities implied by the class of models specified in the null hypothesis, and rejects if no model in the class yields a stationary density that matches the data. No alternative needs to be specified in order to implement the test. Although our test compares densities, it involves no smoothing parameters, and is powerful against 1/√n local alternatives.
    Keywords: Specification test, goodness of fit, Markov processes.
    Date: 2011–10
    URL: http://d.repec.org/n?u=RePEc:kyo:wpaper:787&r=ecm
  10. By: Massimo Guidolin
    Abstract: I review the burgeoning literature on applications of Markov regime switching models in empirical finance. In particular, distinct attention is devoted to the ability of Markov switching models to fit the data, to filter unknown regimes and states on the basis of the data, to provide a powerful tool for testing hypotheses formulated in the light of financial theories, and to their forecasting performance with reference to both point and density predictions. The review covers papers concerning a multiplicity of sub-fields in financial economics, ranging from empirical analyses of stock returns and the term structure of default-free interest rates to the dynamics of exchange rates and the joint process of stock and bond returns.
    Keywords: Markov switching, Regimes, Regime shifts, Nonlinearities, Predictability, Autoregressive Conditional Heteroskedasticity
    JEL: G00 C00
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:igi:igierp:415&r=ecm
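    Illustration: a sketch of the Hamilton filter for a two-state Gaussian Markov switching model, the basic device behind the regime filtering referred to above. Transition probabilities, means and volatilities below are illustrative assumptions.

      # Hamilton filter: recursively compute P(regime_t = j | data up to t) for a
      # two-state Markov switching model with Gaussian observations.
      import numpy as np
      from scipy.stats import norm

      def hamilton_filter(y, P, mus, sigmas):
          """P[i, j] = Prob(s_t = j | s_{t-1} = i); returns filtered probabilities and log-likelihood."""
          n, k = len(y), len(mus)
          filt = np.zeros((n, k))
          evals, evecs = np.linalg.eig(P.T)            # start from the stationary distribution
          pi0 = np.real(evecs[:, np.argmax(np.real(evals))])
          pred = pi0 / pi0.sum()
          loglik = 0.0
          for t in range(n):
              joint = norm.pdf(y[t], loc=mus, scale=sigmas) * pred   # f(y_t, s_t | past)
              loglik += np.log(joint.sum())
              filt[t] = joint / joint.sum()                          # filtered regime probabilities
              pred = filt[t] @ P                                     # one-step-ahead regime probabilities
          return filt, loglik

      # Example: simulated calm/turbulent return regimes
      rng = np.random.default_rng(4)
      P = np.array([[0.95, 0.05], [0.10, 0.90]])
      states = [0]
      for _ in range(499):
          states.append(rng.choice(2, p=P[states[-1]]))
      states = np.array(states)
      y = rng.normal(loc=np.where(states == 0, 0.001, -0.002),
                     scale=np.where(states == 0, 0.01, 0.03))
      filtered, ll = hamilton_filter(y, P, mus=np.array([0.001, -0.002]),
                                     sigmas=np.array([0.01, 0.03]))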
  11. By: R. Anton Braun; Huiyu Li; John Stachurski
    Abstract: The look-ahead estimator is used to compute densities associated with Markov processes via simulation. We study a framework that extends the look-ahead estimator to a much broader range of applications. We provide a general asymptotic theory for the estimator, where both L1 consistency and L2 asymptotic normality are established.
    Date: 2011–10
    URL: http://d.repec.org/n?u=RePEc:acb:cbeeco:2011-558&r=ecm
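    Illustration: the basic look-ahead estimator of a stationary density, psi_hat(y) = (1/n) * sum_t p(X_t, y), where p is the one-step transition density and (X_t) a simulated trajectory. The AR(1) example is an illustrative assumption whose stationary density is known in closed form; the paper's generalization goes well beyond this baseline case.

      # Look-ahead estimator: simulate a trajectory of the Markov process and average
      # the one-step transition density out of each visited state over a grid of y values.
      import numpy as np
      from scipy.stats import norm

      def look_ahead(path, transition_density, y_grid):
          """psi_hat(y) = (1/n) * sum_t p(X_t, y), averaged over the simulated path."""
          return np.array([transition_density(x, y_grid) for x in path]).mean(axis=0)

      # Example: AR(1), X_{t+1} = rho * X_t + sigma * eps, so p(x, y) = N(y; rho * x, sigma^2)
      rho, sigma, n = 0.8, 1.0, 5000
      rng = np.random.default_rng(5)
      path = np.zeros(n)
      for t in range(1, n):
          path[t] = rho * path[t - 1] + sigma * rng.normal()
      grid = np.linspace(-6, 6, 200)
      psi_hat = look_ahead(path, lambda x, y: norm.pdf(y, loc=rho * x, scale=sigma), grid)
      psi_true = norm.pdf(grid, scale=sigma / np.sqrt(1 - rho**2))   # exact stationary density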
  12. By: Lorenzo Camponovo (Dept. of Economics, University of Lugano); Taisuke Otsu (Cowles Foundation, Yale University)
    Abstract: Baggerly (1998) showed that empirical likelihood is the only member in the Cressie-Read power divergence family to be Bartlett correctable. This paper strengthens Baggerly's result by showing that in a generalized class of the power divergence family, which includes the Cressie-Read family and other nonparametric likelihoods such as Schennach's (2005, 2007) exponentially tilted empirical likelihood, empirical likelihood is still the only member to be Bartlett correctable.
    Keywords: Bartlett correction, Empirical likelihood, Cressie-Read power divergence family
    JEL: C12 C14
    Date: 2011–10
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1825&r=ecm
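    Illustration: for reference, one common parameterization of the Cressie-Read power divergence over implied probabilities p_1, ..., p_n (index conventions vary across papers, so this is illustrative rather than the paper's own definition):

      \[ \mathrm{CR}_\gamma(p) = \frac{2}{\gamma(\gamma+1)} \sum_{i=1}^{n} p_i \left[ (n p_i)^{\gamma} - 1 \right], \]

      with empirical likelihood, exponentially tilted (Kullback-Leibler) likelihood and Euclidean likelihood obtained at \gamma \to -1, \gamma \to 0 and \gamma = 1, respectively.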
  13. By: Takashi Kamihigashi; John Stachurski
    Abstract: This paper generalizes the sufficient conditions for stability of monotone economies and time series models due to Hopenhayn and Prescott (Econometrica, 60, pp. 1387–1406, 1992). We introduce a new order-theoretic mixing condition and characterize stability for monotone economies satisfying this condition. We also provide a range of results that can be used to verify our mixing condition in applications, as well as the other components of our main stability theorem. Through this approach, we extend Hopenhayn and Prescott’s method to a significantly larger class of problems, and develop new perspectives on the causes of instability and stability.
    Date: 2011–10
    URL: http://d.repec.org/n?u=RePEc:acb:cbeeco:2011-561&r=ecm
  14. By: Elena-Ivona Dumitrescu (LEO - Laboratoire d'économie d'Orleans - CNRS : UMR6221 - Université d'Orléans); Bertrand Candelon (Economics - Maastricht University); Christophe Hurlin (LEO - Laboratoire d'économie d'Orleans - CNRS : UMR6221 - Université d'Orléans); Franz C. Palm (Maastricht University - univ. Maastricht)
    Abstract: The recent financial turmoil in Latin America and Europe has led to a succession of currency, banking and sovereign debt crises. This paper proposes a multivariate dynamic probit model that encompasses the three types of crises (currency, banking and sovereign debt) and allows us to investigate the potential causality between them. To achieve this objective, we propose a methodological novelty: an exact maximum likelihood method to estimate this multivariate dynamic probit model, thus extending Huguenin, Pelgrin and Holly (2009). Using a large sample of data for emerging countries which experienced financial crises, we find that mutations from banking to currency crises (and vice versa) are quite common. More importantly, the trivariate model turns out to be more parsimonious in the case of the two countries which suffered from all three types of crises. These findings are strongly confirmed by a conditional probability and impulse-response function analysis, highlighting the interaction between the different types of crises and hence advocating the implementation of trivariate models whenever feasible.
    Keywords: Financial crisis, Multivariate dynamic probit models, Emerging countries
    Date: 2011–08
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:halshs-00630036&r=ecm
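    Illustration: a heavily simplified sketch of a dynamic multivariate probit log-likelihood, written for two crisis indicators rather than the trivariate case treated above. Each latent index depends on both lagged indicators and the shocks are jointly normal; this shows the kind of likelihood involved, not the authors' exact maximum likelihood procedure.

      # Log-likelihood of a bivariate dynamic probit: each latent index depends on both
      # lagged 0/1 indicators, and the two shocks are standard bivariate normal with
      # correlation rho. The per-period contribution uses the bivariate normal CDF.
      import numpy as np
      from scipy.stats import multivariate_normal

      def bivariate_dynamic_probit_loglik(params, Y):
          """Y: (T, 2) array of 0/1 crisis indicators. params: c1, a11, a12, c2, a21, a22, rho."""
          c1, a11, a12, c2, a21, a22, rho = params
          ll = 0.0
          for t in range(1, len(Y)):
              m1 = c1 + a11 * Y[t - 1, 0] + a12 * Y[t - 1, 1]      # latent index, equation 1
              m2 = c2 + a21 * Y[t - 1, 0] + a22 * Y[t - 1, 1]      # latent index, equation 2
              q1, q2 = 2 * Y[t, 0] - 1, 2 * Y[t, 1] - 1            # sign flips for 0/1 outcomes
              cov = [[1.0, q1 * q2 * rho], [q1 * q2 * rho, 1.0]]
              ll += np.log(multivariate_normal.cdf([q1 * m1, q2 * m2], mean=[0.0, 0.0], cov=cov))
          return ll

      # Example evaluation on arbitrary 0/1 data with illustrative parameter values
      rng = np.random.default_rng(6)
      Y = (rng.random((200, 2)) < 0.15).astype(int)
      print(bivariate_dynamic_probit_loglik([-1.0, 0.8, 0.3, -1.2, 0.2, 0.9, 0.4], Y))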
  15. By: Giocoli, Nicola
    Abstract: Bayesian rationality is the paradigm of rational behavior in neoclassical economics. A rational agent in an economic model is one who maximizes her subjective expected utility and consistently revises her beliefs according to Bayes’s rule. The paper raises the question of how, when and why this characterization of rationality came to be endorsed by mainstream economists. Though no definitive answer is provided, it is argued that the question is far from trivial and of great historiographic importance. The story begins with Abraham Wald’s behaviorist approach to statistics and culminates with Leonard J. Savage’s elaboration of subjective expected utility theory in his 1954 classic The Foundations of Statistics. It is the latter’s acknowledged failure to achieve its planned goal, the reinterpretation of traditional inferential techniques along subjectivist and behaviorist lines, which raises the puzzle of how a failed project in statistics could turn into such a tremendous hit in economics. A couple of tentative answers are also offered, involving the role of the consistency requirement in neoclassical analysis and the impact of the postwar transformation of US business schools.
    Keywords: Savage; Wald; rational behavior; Bayesian decision theory; subjective probability; minimax rule; statistical decision functions; neoclassical economics
    JEL: B31 B21 D81
    Date: 2011–10–14
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:34117&r=ecm

This nep-ecm issue is ©2011 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.