nep-ecm New Economics Papers
on Econometrics
Issue of 2012‒07‒14
fourteen papers chosen by
Sune Karlsson
Orebro University

  1. Nonlinear Regression with Harris Recurrent Markov Chains By Degui Li; Dag Tjøstheim; Jiti Gao
  2. Asymptotic Theory for Regressions with Smoothly Changing Parameters By Eric Hillebrand; Marcelo C. Medeiros; Junyue Xu
  3. Skew mixture models for loss distributions: a Bayesian approach By Bernardi, Mauro; Maruotti, Antonello; Lea, Petrella
  4. Testing for Nonparametric Identification of Causal Effects in the Presence of a Quasi-Instrument By de Luna, Xavier; Johansson, Per
  5. A simple method to visualize results in nonlinear regression models By Daniel J. Henderson; Subal C. Kumbhakar; Christopher F. Parmeter
  6. Nonlinearity, Breaks, and Long-Range Dependence in Time-Series Models By Eric Hillebrand; Marcelo C. Medeiros
  7. Fitting semiparametric Markov regime-switching models to electricity spot prices By Eichler Michael; Tuerk Dennis
  8. Signal extraction for nonstationary multivariate time series with illustrations for trend inflation By Tucker S. McElroy; Thomas M. Trimbur
  9. Small sample properties of matching with caliper By Paweł Strawiński
  10. A Generalized Missing-Indicator Approach to Regression with Imputed Covariates By Valentino Dardanoni; Giuseppe De Luca; Salvatore Modica; Franco Peracchi
  11. A Note on Particle Filters Applied to DSGE Models By Angelo Marsiglia Fasolo
  12. Mis-specification Testing: Non-Invariance of Expectations Models of Inflation By Jennifer L. Castle; Jurgen A. Doornik; David F. Hendry; Ragnar Nymoen
  13. Risk measures for Skew Normal mixtures By Bernardi, Mauro
  14. Econometric methods and Reichenbach's principle By Sean Muller

  1. By: Degui Li; Dag Tjøstheim; Jiti Gao
    Abstract: In this paper, we study parametric nonlinear regression under the Harris recurrent Markov chain framework. We first consider the nonlinear least squares estimators of the parameters in the homoskedastic case, and establish asymptotic theory for the proposed estimators. Our results show that the convergence rates for the estimators rely not only on the properties of the nonlinear regression function, but also on the number of regenerations for the Harris recurrent Markov chain. We also discuss the estimation of the parameter vector in a conditional volatility function and its asymptotic theory. Furthermore, we apply our results to the nonlinear regression with I(1) processes and establish an asymptotic distribution theory which is comparable to that obtained by Park and Phillips (2001). Some simulation studies are provided to illustrate the proposed approaches and results.
    Keywords: Asymptotic distribution, asymptotically homogeneous functions, β-null recurrent Markov chains, Harris recurrence, integrable functions, least squares estimation, nonlinear regression.
    JEL: C13 C22
    Date: 2012–07
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2012-14&r=ecm
  2. By: Eric Hillebrand (Aarhus University and CREATES); Marcelo C. Medeiros (PONTIFICAL CATHOLIC UNIVERSITY OF RIO DE JANEIRO); Junyue Xu (LOUISIANA STATE UNIVERSITY)
    Abstract: We derive asymptotic properties of the quasi maximum likelihood estimator of smooth transition regressions when time is the transition variable. The consistency of the estimator and its asymptotic distribution are examined. It is shown that the estimator converges at the usual square-root-of-T rate and has an asymptotically normal distribution. Finite sample properties of the estimator are explored in simulations. We illustrate with an application to US inflation and output data.
    Keywords: Regime switching, smooth transition regression, asymptotic theory.
    JEL: C22
    Date: 2012–06–12
    URL: http://d.repec.org/n?u=RePEc:aah:create:2012-31&r=ecm
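    The smooth transition regression with time as the transition variable studied in paper 2 can be illustrated with a toy sketch. The authors analyze the quasi maximum likelihood estimator; the snippet below instead uses a simple profile least-squares grid search over the transition parameters, and every function name and parameter value is hypothetical:

```python
import math
import random

def logistic_transition(t, gamma, c):
    """Smooth logistic transition G(t; gamma, c) in [0, 1]."""
    return 1.0 / (1.0 + math.exp(-gamma * (t - c)))

def simulate_str(T, beta0, beta1, gamma, c, sigma, seed=0):
    """Simulate y_t = beta0 + beta1 * G(t/T; gamma, c) + noise."""
    rng = random.Random(seed)
    return [beta0 + beta1 * logistic_transition(t / T, gamma, c)
            + rng.gauss(0.0, sigma) for t in range(1, T + 1)]

def fit_str(y, gamma_grid, c_grid):
    """Profile least squares: for each (gamma, c) on a grid, solve the
    linear part (beta0, beta1) by OLS and keep the best-fitting point."""
    T = len(y)
    best = None
    for gamma in gamma_grid:
        for c in c_grid:
            g = [logistic_transition(t / T, gamma, c) for t in range(1, T + 1)]
            # closed-form OLS of y on a constant and g
            sg = sum(g); sgg = sum(gi * gi for gi in g)
            sy = sum(y); sgy = sum(gi * yi for gi, yi in zip(g, y))
            det = T * sgg - sg * sg
            if abs(det) < 1e-12:
                continue
            b0 = (sgg * sy - sg * sgy) / det
            b1 = (T * sgy - sg * sy) / det
            ssr = sum((yi - b0 - b1 * gi) ** 2 for yi, gi in zip(y, g))
            if best is None or ssr < best[0]:
                best = (ssr, b0, b1, gamma, c)
    return best

y = simulate_str(T=400, beta0=1.0, beta1=2.0, gamma=20.0, c=0.5, sigma=0.3)
ssr, b0, b1, gamma, c = fit_str(y, gamma_grid=[5, 10, 20, 40],
                                c_grid=[0.3, 0.4, 0.5, 0.6, 0.7])
print(round(b0, 2), round(b1, 2), c)
```

Profiling out the linear coefficients keeps the grid search two-dimensional, which is why smooth transition models are cheap to explore even before a full (Q)ML optimization.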
  3. By: Bernardi, Mauro; Maruotti, Antonello; Lea, Petrella
    Abstract: The derivation of loss distributions from insurance data is a very interesting research topic but at the same time not an easy task. Seeking an analytic solution for the loss distribution may be misleading, although this approach is frequently adopted in the actuarial literature. Moreover, it is well recognized that the loss distribution is strongly skewed with heavy tails and presents small, medium and large size claims, which can hardly be fitted by a single analytic parametric distribution. Here we propose a finite mixture of Skew Normal distributions that provides a better characterization of insurance data. We adopt a Bayesian approach to estimate the model, providing the likelihood and the priors for all the unknown parameters, and we implement an adaptive Markov chain Monte Carlo algorithm to approximate the posterior distribution. We apply our approach to the well-known Danish fire loss data, and relevant risk measures, such as Value-at-Risk and Expected Shortfall probability, are evaluated as well.
    Keywords: Markov chain Monte Carlo; Bayesian analysis; mixture model; Skew-Normal distributions; Loss distribution; Danish data
    JEL: C52 C11 C01
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:39826&r=ecm
  4. By: de Luna, Xavier (Umeå University); Johansson, Per (IFAU)
    Abstract: The identification of average causal effects of a treatment in observational studies is typically based either on the unconfoundedness assumption or on the availability of an instrument. When available, instruments may also be used to test for the unconfoundedness assumption (exogeneity of the treatment). In this paper, we define variables which we call quasi-instruments because they allow us to test for the unconfoundedness assumption although they do not necessarily yield nonparametric identification of the average causal effect. A quasi-instrument is defined as an instrument except that its relation to the treatment is allowed to be confounded by unobservables, thereby resulting in a wider range of potential applications. We propose a test for the unconfoundedness assumption based on a quasi-instrument, and give conditions under which the test has power. We perform a simulation study and apply the results to a case study where the interest lies in evaluating the effect of job practice on employment.
    Keywords: testing, endogeneity, monotonicity, potential outcomes
    JEL: C26 C52
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp6692&r=ecm
  5. By: Daniel J. Henderson (Department of Economics, Finance and Legal Studies, University of Alabama); Subal C. Kumbhakar (Department of Economics, State University of New York); Christopher F. Parmeter (Department of Economics, University of Miami)
    Abstract: A simple graphical approach to presenting results from nonlinear regression models is described. In the face of multiple covariates, `partial mean' plots may be unattractive. The approach here is portable to a variety of settings and can be tailored to the specific application at hand. A simple four variable nonparametric regression example is provided to illustrate the technique.
    Keywords: Gradient Estimation;Dimensionality; Kernel Smoothing; Least Squares Cross Validation
    JEL: C1 C13 C14
    Date: 2012–04–30
    URL: http://d.repec.org/n?u=RePEc:mia:wpaper:2012-4&r=ecm
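    The `partial mean' idea behind the visualization technique in paper 5 can be sketched numerically: fit a nonparametric surface and average the fitted values over the remaining covariates while one covariate sweeps a grid. This is a minimal Nadaraya-Watson illustration, not the authors' procedure; the function names, bandwidth, and data-generating process are all hypothetical:

```python
import math
import random

def nw_fit(X, y, x0, h):
    """Nadaraya-Watson estimate of m(x0) with a product Gaussian kernel."""
    wsum = wy = 0.0
    for xi, yi in zip(X, y):
        w = math.exp(-sum((a - b) ** 2 for a, b in zip(xi, x0)) / (2 * h * h))
        wsum += w
        wy += w * yi
    return wy / wsum

def partial_mean(X, y, dim, grid, h):
    """Average the fitted surface over the other covariates while the
    covariate `dim` sweeps `grid` -- the 'partial mean' curve."""
    curve = []
    for v in grid:
        vals = []
        for xi in X:
            x0 = list(xi)
            x0[dim] = v          # pin one covariate, keep the rest observed
            vals.append(nw_fit(X, y, x0, h))
        curve.append(sum(vals) / len(vals))
    return curve

rng = random.Random(1)
X = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(200)]
y = [x[0] ** 2 + 0.5 * x[1] + rng.gauss(0, 0.1) for x in X]
grid = [-0.8, -0.4, 0.0, 0.4, 0.8]
curve = partial_mean(X, y, dim=0, grid=grid, h=0.3)
# the partial-mean curve in the first covariate should be roughly U-shaped
print([round(c, 2) for c in curve])
```

Plotting `curve` against `grid` (one panel per covariate) is the kind of one-dimensional summary that remains readable when the regression has many covariates.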
  6. By: Eric Hillebrand (Aarhus University and CREATES); Marcelo C. Medeiros (PONTIFICAL CATHOLIC UNIVERSITY OF RIO DE JANEIRO)
    Abstract: We study the simultaneous occurrence of long memory and nonlinear effects, such as parameter changes and threshold effects, in ARMA time series models and apply our modeling framework to daily realized volatility. Asymptotic theory for parameter estimation is developed and two model building procedures are proposed. The methodology is applied to stocks of the Dow Jones Industrial Average during the period 2000 to 2009. We find strong evidence of nonlinear effects.
    Keywords: Smooth transitions, long memory, forecasting, realized volatility.
    JEL: C22
    Date: 2012–06–12
    URL: http://d.repec.org/n?u=RePEc:aah:create:2012-30&r=ecm
  7. By: Eichler Michael; Tuerk Dennis (METEOR)
    Abstract: Recently, regime-switching models have become the standard tool for modeling electricity prices. These models capture the main properties of electricity spot prices well, but estimation of the model parameters requires computer-intensive methods. Moreover, the distribution of the price spikes must be assumed given, although the high volatility of the spikes makes it difficult to check this assumption. Consequently, there are a number of competing proposals. As an alternative, we propose the use of a semiparametric Markov regime-switching model that does not specify the distribution under the spike regime. To estimate the model we use robust estimation techniques as an alternative to commonly applied estimation approaches. The model, in combination with the estimation framework, is easier to estimate and needs less computation time and fewer distributional assumptions. To show its advantages we compare the proposed model with a well-established Markov-switching model in a simulation study. Further, we apply the model to Australian log prices. The results are in accordance with those from the simulation study, indicating that the proposed model might be advantageous whenever the distribution of the spike process is not sufficiently known. The results are thus encouraging and suggest the use of our approach when modeling electricity prices and pricing derivatives.
    Keywords: econometrics
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:dgr:umamet:2012036&r=ecm
  8. By: Tucker S. McElroy; Thomas M. Trimbur
    Abstract: This paper advances the theory and methodology of signal extraction by introducing asymptotic and finite sample formulas for optimal estimators of signals in nonstationary multivariate time series. Previous literature has considered only univariate or stationary models. However, in current practice and research, econometricians, macroeconomists, and policy-makers often combine related series, which may have stochastic trends, to attain more informed assessments of basic signals like underlying inflation and business cycle components. Here, we use a very general model structure, of widespread relevance for time series econometrics, including flexible kinds of nonstationarity and correlation patterns and specific relationships like cointegration and other common factor forms. First, we develop and prove the generalization of the well-known Wiener-Kolmogorov formula that maps signal-noise dynamics into optimal estimators for bi-infinite series. Second, this paper gives the first explicit treatment of finite-length multivariate time series, providing a new method for computing signal vectors at any time point, unrelated to Kalman filter techniques; this opens the door to systematic study of near end-point estimators/filters, by revealing how they jointly depend on a function of signal location and parameters. As an illustration we present econometric measures of the trend in total inflation that make optimal use of the signal content in core inflation.
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2012-45&r=ecm
  9. By: Paweł Strawiński (University of Warsaw, Faculty of Economic Sciences)
    Abstract: A caliper mechanism is a common tool used to prevent inexact matches. The existing literature discusses asymptotic properties of matching with a caliper. In this simulation study we investigate its properties in small and medium-sized samples. We show that the caliper causes a significant bias in the ATT estimator and raises its variance in comparison to one-to-one matching.
    Keywords: propensity score matching, caliper, Monte Carlo experiment, finite sample properties
    JEL: C14 C21 C52
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:war:wpaper:2012-13&r=ecm
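    Matching with a caliper, as studied in paper 9, can be sketched as one-to-one nearest-neighbour matching on the propensity score that discards treated units with no control inside the caliper, then averages outcome differences over the retained pairs to estimate the ATT. This toy version is not the authors' simulation design; the matching-without-replacement rule, names, and parameter values are all hypothetical:

```python
import random

def att_caliper(ps_treated, y_treated, ps_control, y_control, caliper):
    """One-to-one nearest-neighbour matching (without replacement) on the
    propensity score; treated units with no control within the caliper
    are dropped, which is the source of the bias studied in the paper."""
    effects = []
    used = set()
    for p, y1 in zip(ps_treated, y_treated):
        best_j, best_d = None, caliper
        for j, q in enumerate(ps_control):
            if j in used:
                continue
            d = abs(p - q)
            if d <= best_d:
                best_j, best_d = j, d
        if best_j is not None:
            used.add(best_j)
            effects.append(y1 - y_control[best_j])
    if not effects:
        return None, 0
    return sum(effects) / len(effects), len(effects)

rng = random.Random(7)
# toy data: true treatment effect 2.0; outcomes depend on the score
ps_c = [rng.random() for _ in range(300)]
y_c = [3 * p + rng.gauss(0, 0.2) for p in ps_c]
ps_t = [rng.random() ** 0.7 for _ in range(100)]   # treated skew to higher scores
y_t = [2.0 + 3 * p + rng.gauss(0, 0.2) for p in ps_t]
att, n_matched = att_caliper(ps_t, y_t, ps_c, y_c, caliper=0.05)
print(round(att, 2), n_matched)
```

Re-running this with smaller samples or a tighter caliper is a quick way to see the finite-sample bias/variance trade-off the paper quantifies.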
  10. By: Valentino Dardanoni (University of Palermo); Giuseppe De Luca (ISFOL); Salvatore Modica (University of Palermo); Franco Peracchi (Tor Vergata University and EIEF)
    Abstract: This paper considers estimation of a linear regression model using data where some covariate values are missing but imputations are available to fill in the missing values. The availability of imputations generates a trade-off between bias and precision in the estimators of the regression parameters: the complete cases are often too few, so precision is lost, but filling in the missing values with imputations may lead to bias. We provide the new Stata command gmi which allows handling such bias-precision trade-off using either model reduction or model averaging techniques in the context of the generalized missing-indicator approach recently proposed by Dardanoni et al. (2011). If multiple imputations are available, our gmi command can also be combined with the built-in Stata prefix mi estimate to account for the extra variability due to the imputation process. The gmi command is illustrated with an empirical application which investigates the relationship between an objective health indicator and a set of socio-demographic and economic covariates affected by substantial item nonresponse.
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:eie:wpaper:1111&r=ecm
  11. By: Angelo Marsiglia Fasolo
    Abstract: This paper compares the properties of two particle filters – the Bootstrap Filter and the Auxiliary Particle Filter – applied to the computation of the likelihood of artificial data simulated from a basic DSGE model with nominal and real rigidities. Particle filters are compared in terms of speed, quality of the approximation of the probability density function of data and tracking of state variables. Results show that there is a case for the use of the Auxiliary Particle Filter only when the researcher uses a large number of observable variables and the number of particles used to characterize the likelihood is relatively low. Simulations also show that the largest gains in tracking state variables in the model are found when the number of particles is between 20,000 and 30,000, suggesting a boundary for this number.
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:bcb:wpaper:281&r=ecm
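    The Bootstrap Filter compared in paper 11 can be sketched for a simple linear-Gaussian state-space model, where its likelihood estimate could in principle be checked against the exact Kalman filter. This is a generic textbook sketch, not the paper's DSGE application; the model, names, and parameter values are all hypothetical:

```python
import math
import random

def bootstrap_filter(ys, n_particles, rho, sx, sy, seed=0):
    """Bootstrap particle filter for x_t = rho*x_{t-1} + e_t, y_t = x_t + v_t.
    Returns a log-likelihood estimate and the filtered state means."""
    rng = random.Random(seed)
    # initialize from the stationary distribution of the state
    parts = [rng.gauss(0.0, sx / math.sqrt(1 - rho * rho))
             for _ in range(n_particles)]
    loglik = 0.0
    means = []
    for y in ys:
        # propagate from the transition density (the 'bootstrap' proposal)
        parts = [rho * p + rng.gauss(0.0, sx) for p in parts]
        # weight by the Gaussian observation density
        ws = [math.exp(-(y - p) ** 2 / (2 * sy * sy)) for p in parts]
        wsum = sum(ws)
        loglik += math.log(wsum / n_particles / (sy * math.sqrt(2 * math.pi)))
        ws = [w / wsum for w in ws]
        means.append(sum(w * p for w, p in zip(ws, parts)))
        # multinomial resampling at every step
        parts = rng.choices(parts, weights=ws, k=n_particles)
    return loglik, means

# simulate data from the model, then filter it
rng = random.Random(1)
x, ys = 0.0, []
for _ in range(100):
    x = 0.9 * x + rng.gauss(0, 0.5)
    ys.append(x + rng.gauss(0, 0.5))
loglik, means = bootstrap_filter(ys, n_particles=2000, rho=0.9, sx=0.5, sy=0.5)
print(round(loglik, 1))
```

The Auxiliary Particle Filter differs in that particles are pre-selected using the next observation before propagation, which is where the paper's speed/accuracy comparison comes in.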
  12. By: Jennifer L. Castle (Institute for New Economic Thinking, Oxford Martin School, University of Oxford, UK); Jurgen A. Doornik (Institute for New Economic Thinking, Oxford Martin School, University of Oxford, UK); David F. Hendry (Institute for New Economic Thinking, Oxford Martin School, University of Oxford, UK); Ragnar Nymoen (Economics Department, Oslo University, Norway)
    Abstract: Many economic models (such as the new-Keynesian Phillips curve, NKPC) include expected future values, often estimated after replacing the expected value by the actual future outcome, using Instrumental Variables or Generalized Method of Moments. Although crises, breaks and regime shifts are relatively common, the underlying theory does not allow for their occurrence. We show the consequences for such models of breaks in data processes, and propose an impulse-indicator saturation test of such specifications, applied to USA and Euro-area NKPCs.
    Keywords: Testing invariance; Structural breaks; Expectations; Impulse-indicator saturation; New-Keynesian Phillips curve
    JEL: C5 E3
    Date: 2012–07
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:50_12&r=ecm
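    The flavour of impulse-indicator saturation proposed in paper 12 can be conveyed with a toy location model: estimate the mean and standard deviation on one half of the sample, flag observations in the other half with large standardized residuals, then swap the halves. This is a deliberately simplified sketch, not the authors' NKPC test; the function name and threshold are hypothetical:

```python
import random
import statistics

def iis_location(y, crit=2.5):
    """Split-sample impulse-indicator saturation for a location model:
    fit on one half, flag standardized residuals above `crit` in the
    other half, then swap the roles of the two halves."""
    n = len(y)
    half = n // 2
    flagged = set()
    for train, test_idx in [(range(half), range(half, n)),
                            (range(half, n), range(half))]:
        sample = [y[i] for i in train]
        mu = statistics.fmean(sample)
        sd = statistics.stdev(sample)
        for i in test_idx:
            if abs(y[i] - mu) / sd > crit:
                flagged.add(i)
    return flagged

rng = random.Random(3)
y = [rng.gauss(0, 1) for _ in range(100)]
y[40] += 8.0   # inject a single large break/outlier
flagged = iis_location(y)
print(sorted(flagged))
```

Retained indicators mark observations the constant-parameter model cannot explain, which is how saturation tests detect the breaks and regime shifts the abstract emphasizes.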
  13. By: Bernardi, Mauro
    Abstract: Finite mixtures of Skew distributions have become increasingly popular in the last few years as a flexible tool for handling data displaying several different characteristics such as multimodality, asymmetry and fat tails. Examples of such data can be found in financial and actuarial applications as well as biological and epidemiological analyses. In this paper we show that a convex linear combination of multivariate Skew Normal mixtures can be represented as a finite mixture of univariate Skew Normal distributions. This result can be useful in modeling portfolio returns where the evaluation of extremal events is of great interest. We provide analytical formulas for different risk measures such as Value-at-Risk and the Expected Shortfall probability.
    Keywords: Finite mixtures; Skew Normal distributions; Value-at-Risk; Expected Shortfall probability
    JEL: C16
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:39828&r=ecm
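    The risk measures discussed in paper 13 can also be approximated by simulation for a skew-normal mixture, even though the paper derives analytical formulas. The sketch below draws losses from a two-component mixture via the Azzalini stochastic representation and computes empirical VaR and Expected Shortfall; the weights and parameter values are hypothetical:

```python
import math
import random

def rskewnorm(rng, xi, omega, alpha):
    """Draw from SN(xi, omega, alpha) via the Azzalini representation
    Z = delta*|U0| + sqrt(1 - delta^2)*U1, with delta = alpha/sqrt(1+alpha^2)."""
    delta = alpha / math.sqrt(1 + alpha * alpha)
    u0, u1 = rng.gauss(0, 1), rng.gauss(0, 1)
    z = delta * abs(u0) + math.sqrt(1 - delta * delta) * u1
    return xi + omega * z

def var_es(losses, level):
    """Empirical Value-at-Risk and Expected Shortfall at `level`."""
    s = sorted(losses)
    k = int(level * len(s))
    var = s[k]
    es = sum(s[k:]) / len(s[k:])   # mean loss beyond the VaR quantile
    return var, es

rng = random.Random(11)
# two-component skew-normal mixture for losses: (weight, (xi, omega, alpha))
comps = [(0.8, (1.0, 0.5, 2.0)), (0.2, (4.0, 1.5, 3.0))]
losses = []
for _ in range(50_000):
    _, (xi, omega, alpha) = comps[0] if rng.random() < comps[0][0] else comps[1]
    losses.append(rskewnorm(rng, xi, omega, alpha))
var99, es99 = var_es(losses, 0.99)
print(round(var99, 2), round(es99, 2))
```

The second, heavier component dominates the 99% tail, which is exactly the regime where closed-form expressions like those in the paper pay off over simulation.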
  14. By: Sean Muller (SALDRU, School of Economics, University of Cape Town)
    Abstract: Reichenbach's 'principle of the common cause' is a foundational assumption of some important recent contributions to quantitative social science methodology, but no similar principle appears in econometrics. Reiss (2005) has argued that the principle is necessary for instrumental variables methods in econometrics, and Pearl (2009) builds a framework using it that he proposes as a means of resolving an important methodological dispute among econometricians. We aim to show, through analysis of the main problem instrumental variables methods are used to resolve, that the relationship of the principle to econometric methods is more nuanced than implied by previous work, but that the principle may nevertheless make a valuable contribution to the coherence and validity of existing methods.
    Keywords: Reichenbach's principle, econometrics, causality
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:ldr:wpaper:85&r=ecm

This nep-ecm issue is ©2012 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.