
on Econometrics 
By:  Degui Li; Dag Tjøstheim; Jiti Gao 
Abstract:  In this paper, we study parametric nonlinear regression under the Harris recurrent Markov chain framework. We first consider the nonlinear least squares estimators of the parameters in the homoskedastic case, and establish asymptotic theory for the proposed estimators. Our results show that the convergence rates for the estimators rely not only on the properties of the nonlinear regression function, but also on the number of regenerations for the Harris recurrent Markov chain. We also discuss the estimation of the parameter vector in a conditional volatility function and its asymptotic theory. Furthermore, we apply our results to the nonlinear regression with I(1) processes and establish an asymptotic distribution theory which is comparable to that obtained by Park and Phillips (2001). Some simulation studies are provided to illustrate the proposed approaches and results. 
Keywords:  Asymptotic distribution, asymptotically homogeneous functions, β-null recurrent Markov chains, Harris recurrence, integrable functions, least squares estimation, nonlinear regression. 
JEL:  C13 C22 
Date:  2012–07 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:201214&r=ecm 
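The nonlinear least squares machinery this abstract refers to can be illustrated in miniature. The following is a hedged sketch, not the paper's estimator: the exponential regression function, the data, and the starting value are all invented for this example. It runs a one-parameter Gauss-Newton iteration for the model y_i ≈ exp(θ·x_i).

```python
import math

def nls_gauss_newton(x, y, theta0, iters=100):
    """One-parameter nonlinear least squares for y ~ exp(theta * x),
    fitted by Gauss-Newton iterations (first derivatives only)."""
    theta = theta0
    for _ in range(iters):
        resid = [yi - math.exp(theta * xi) for xi, yi in zip(x, y)]
        jac = [xi * math.exp(theta * xi) for xi in x]  # d/dtheta of the model
        den = sum(j * j for j in jac)
        if den == 0.0:
            break
        step = sum(j * r for j, r in zip(jac, resid)) / den
        theta += step
        if abs(step) < 1e-12:
            break
    return theta

# Noiseless data generated from theta = 0.5 (hypothetical values)
x = [0.1 * i for i in range(20)]
y = [math.exp(0.5 * xi) for xi in x]
theta_hat = nls_gauss_newton(x, y, theta0=0.1)
```

Gauss-Newton approximates the Hessian by the outer product of the Jacobian, so each update needs only first derivatives of the regression function.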
By:  Eric Hillebrand (Aarhus University and CREATES); Marcelo C. Medeiros (PONTIFICAL CATHOLIC UNIVERSITY OF RIO DE JANEIRO); Junyue Xu (LOUISIANA STATE UNIVERSITY) 
Abstract:  We derive asymptotic properties of the quasi maximum likelihood estimator of smooth transition regressions when time is the transition variable. The consistency of the estimator and its asymptotic distribution are examined. It is shown that the estimator converges at the usual √T rate and has an asymptotically normal distribution. Finite sample properties of the estimator are explored in simulations. We illustrate with an application to US inflation and output data. 
Keywords:  Regime switching, smooth transition regression, asymptotic theory. 
JEL:  C22 
Date:  2012–06–12 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201231&r=ecm 
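A minimal sketch of a smooth transition regression with time as the transition variable (illustrative only, not the paper's QML procedure: the logistic transition, the grid, and all parameter values are assumptions). The slope parameters are profiled out by OLS while the transition parameters (γ, c) are found by grid search.

```python
import math

def transition(t, T, gamma, c):
    """Logistic transition function of rescaled time, G in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-gamma * (t / T - c)))

def fit_str_in_time(y, gammas, cs):
    """Profile grid search: for each (gamma, c), solve conditional OLS for
    (beta1, beta2) in y_t = beta1*(1-G_t) + beta2*G_t + e_t, and keep the
    combination with the smallest sum of squared errors."""
    T = len(y)
    best = None
    for g in gammas:
        for c in cs:
            G = [transition(t + 1, T, g, c) for t in range(T)]
            # 2x2 normal equations for the two regime levels
            s11 = sum((1 - gt) ** 2 for gt in G)
            s12 = sum((1 - gt) * gt for gt in G)
            s22 = sum(gt * gt for gt in G)
            r1 = sum((1 - gt) * yt for gt, yt in zip(G, y))
            r2 = sum(gt * yt for gt, yt in zip(G, y))
            det = s11 * s22 - s12 * s12
            b1 = (r1 * s22 - r2 * s12) / det
            b2 = (r2 * s11 - r1 * s12) / det
            sse = sum((yt - b1 * (1 - gt) - b2 * gt) ** 2
                      for gt, yt in zip(G, y))
            if best is None or sse < best[0]:
                best = (sse, g, c, b1, b2)
    return best

# Noiseless data: level shifts smoothly from 1 to 3 around mid-sample
T = 200
y = [1.0 + 2.0 * transition(t + 1, T, 10.0, 0.5) for t in range(T)]
sse, g_hat, c_hat, b1, b2 = fit_str_in_time(
    y, gammas=[5.0, 10.0, 20.0], cs=[0.3, 0.4, 0.5, 0.6, 0.7])
```

Profiling out the linear parameters is a standard device in smooth transition modeling, since the objective is linear in the regime levels once (γ, c) are fixed.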
By:  Bernardi, Mauro; Maruotti, Antonello; Lea, Petrella 
Abstract:  The derivation of a loss distribution from insurance data is a very interesting research topic but at the same time not an easy task. Seeking an analytic solution for the loss distribution may be misleading, although this approach is frequently adopted in the actuarial literature. Moreover, it is well recognized that the loss distribution is strongly skewed with heavy tails and presents small, medium and large size claims, which can hardly be fitted by a single analytic and parametric distribution. Here we propose a finite mixture of Skew Normal distributions that provides a better characterization of insurance data. We adopt a Bayesian approach to estimate the model, providing the likelihood and the priors for all the unknown parameters; we implement an adaptive Markov chain Monte Carlo algorithm to approximate the posterior distribution. We apply our approach to the well-known Danish fire loss data, and relevant risk measures, such as Value-at-Risk and Expected Shortfall probability, are evaluated as well. 
Keywords:  Markov chain Monte Carlo; Bayesian analysis; mixture model; Skew-Normal distributions; Loss distribution; Danish data 
JEL:  C52 C11 C01 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:39826&r=ecm 
By:  de Luna, Xavier (Umeå University); Johansson, Per (IFAU) 
Abstract:  The identification of average causal effects of a treatment in observational studies is typically based either on the unconfoundedness assumption or on the availability of an instrument. When available, instruments may also be used to test for the unconfoundedness assumption (exogeneity of the treatment). In this paper, we define variables which we call quasi-instruments because they allow us to test for the unconfoundedness assumption although they do not necessarily yield nonparametric identification of the average causal effect. A quasi-instrument is defined as an instrument, except that its relation to the treatment is allowed to be confounded by unobservables, thereby resulting in a wider range of potential applications. We propose a test for the unconfoundedness assumption based on a quasi-instrument, and give conditions under which the test has power. We perform a simulation study and apply the results to a case study where the interest lies in evaluating the effect of job practice on employment. 
Keywords:  testing, endogeneity, monotonicity, potential outcomes 
JEL:  C26 C52 
Date:  2012–06 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp6692&r=ecm 
By:  Daniel J. Henderson (Department of Economics, Finance and Legal Studies, University of Alabama); Subal C. Kumbhakar (Department of Economics, State University of New York); Christopher F. Parmeter (Department of Economics, University of Miami) 
Abstract:  A simple graphical approach to presenting results from nonlinear regression models is described. In the face of multiple covariates, `partial mean' plots may be unattractive. The approach here is portable to a variety of settings and can be tailored to the specific application at hand. A simple four variable nonparametric regression example is provided to illustrate the technique. 
Keywords:  Gradient Estimation; Dimensionality; Kernel Smoothing; Least Squares Cross-Validation 
JEL:  C1 C13 C14 
Date:  2012–04–30 
URL:  http://d.repec.org/n?u=RePEc:mia:wpaper:20124&r=ecm 
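To make the gradient-plot idea concrete, here is a hedged sketch (not the authors' code; the Gaussian kernel, the bandwidth, and the test function are all invented): a Nadaraya-Watson fit whose numerical derivative, evaluated over a grid, is the kind of object one would plot instead of a partial mean.

```python
import math

def nw_estimate(xs, ys, x0, h):
    """Nadaraya-Watson (local constant) kernel regression at x0,
    Gaussian kernel with bandwidth h."""
    ws = [math.exp(-0.5 * ((x0 - x) / h) ** 2) for x in xs]
    return sum(w * y for w, y in zip(ws, ys)) / sum(ws)

def nw_gradient(xs, ys, x0, h, eps=1e-4):
    """Numerical gradient of the fitted regression surface at x0:
    the quantity a gradient plot would display."""
    return (nw_estimate(xs, ys, x0 + eps, h)
            - nw_estimate(xs, ys, x0 - eps, h)) / (2 * eps)

# Noiseless test function m(x) = x^2 on [-1, 1]; true gradient is 2x
xs = [i / 50 for i in range(-50, 51)]
ys = [x * x for x in xs]
grad_at_half = nw_gradient(xs, ys, 0.5, h=0.1)
```

Plotting the estimated gradient against each covariate in turn avoids the dimensionality problems of displaying the full regression surface.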
By:  Eric Hillebrand (Aarhus University and CREATES); Marcelo C. Medeiros (PONTIFICAL CATHOLIC UNIVERSITY OF RIO DE JANEIRO) 
Abstract:  We study the simultaneous occurrence of long memory and nonlinear effects, such as parameter changes and threshold effects, in ARMA time series models and apply our modeling framework to daily realized volatility. Asymptotic theory for parameter estimation is developed and two model building procedures are proposed. The methodology is applied to stocks of the Dow Jones Industrial Average during the period 2000 to 2009. We find strong evidence of nonlinear effects. 
Keywords:  Smooth transitions, long memory, forecasting, realized volatility. 
JEL:  C22 
Date:  2012–06–12 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201230&r=ecm 
By:  Eichler Michael; Tuerk Dennis (METEOR) 
Abstract:  Recently, regime-switching models have become the standard tool for modeling electricity prices. These models capture the main properties of electricity spot prices well, but estimation of the model parameters requires computer-intensive methods. Moreover, the distribution of the price spikes must be assumed given, although the high volatility of the spikes makes it difficult to check this assumption. Consequently, there are a number of competing proposals. As an alternative, we propose the use of a semiparametric Markov regime-switching model that does not specify the distribution under the spike regime. To estimate the model we use robust estimation techniques as an alternative to commonly applied estimation approaches. The model, in combination with the estimation framework, is easier to estimate and needs less computation time and fewer distributional assumptions. To show its advantages we compare the proposed model with a well-established Markov-switching model in a simulation study. Further, we apply the model to Australian log prices. The results are in accordance with those from the simulation study, indicating that the proposed model might be advantageous whenever the distribution of the spike process is not sufficiently known. The results are thus encouraging and suggest the use of our approach when modeling electricity prices and pricing derivatives. 
Keywords:  econometrics 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:dgr:umamet:2012036&r=ecm 
By:  Tucker S. McElroy; Thomas M. Trimbur 
Abstract:  This paper advances the theory and methodology of signal extraction by introducing asymptotic and finite sample formulas for optimal estimators of signals in nonstationary multivariate time series. Previous literature has considered only univariate or stationary models. However, in current practice and research, econometricians, macroeconomists, and policymakers often combine related series (that may have stochastic trends) to attain more informed assessments of basic signals like underlying inflation and business cycle components. Here, we use a very general model structure, of widespread relevance for time series econometrics, including flexible kinds of nonstationarity and correlation patterns and specific relationships like cointegration and other common factor forms. First, we develop and prove the generalization of the well-known Wiener-Kolmogorov formula that maps signal-noise dynamics into optimal estimators for bi-infinite series. Second, this paper gives the first explicit treatment of finite-length multivariate time series, providing a new method for computing signal vectors at any time point, unrelated to Kalman filter techniques; this opens the door to systematic study of near-endpoint estimators/filters, by revealing how they jointly depend on a function of signal location and parameters. As an illustration we present econometric measures of the trend in total inflation that make optimal use of the signal content in core inflation. 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgfe:201245&r=ecm 
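The paper's setting is multivariate and nonstationary, but the flavor of finite-sample signal extraction can be sketched in the simplest univariate special case (an assumption-laden toy, not the paper's method): for the local level model y_t = s_t + noise with s_t a random walk, the finite-sample optimal smoother solves a penalized least squares problem whose first-order conditions form a tridiagonal system, with λ the noise-to-signal variance ratio.

```python
def solve_tridiagonal(sub, diag, sup, rhs):
    """Thomas algorithm for a tridiagonal linear system."""
    n = len(diag)
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = sup[0] / diag[0] if n > 1 else 0.0
    dp[0] = rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - sub[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = sup[i] / m
        dp[i] = (rhs[i] - sub[i - 1] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def local_level_smoother(y, lam):
    """Finite-sample signal extraction for a random walk observed in noise:
    minimize sum (y_t - s_t)^2 + lam * sum (s_t - s_{t-1})^2, i.e. solve
    the tridiagonal system (I + lam * D'D) s = y."""
    n = len(y)
    diag = [1.0 + lam * (2.0 if 0 < i < n - 1 else 1.0) for i in range(n)]
    off = [-lam] * (n - 1)
    return solve_tridiagonal(off, diag, off, list(y))

# A constant series is its own extracted signal, exactly, at every sample point
trend = local_level_smoother([5.0] * 10, lam=4.0)
```

Note how the endpoint rows of the system differ from the interior ones: that asymmetry is exactly the near-endpoint filter behavior the abstract highlights.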
By:  Paweł Strawiński (University of Warsaw, Faculty of Economic Sciences) 
Abstract:  A caliper mechanism is a common tool used to prevent inexact matches. The existing literature discusses asymptotic properties of matching with a caliper. In this simulation study, we investigate its properties in small and medium-sized samples. We show that the caliper causes a significant bias in the ATT estimator and raises its variance in comparison to one-to-one matching. 
Keywords:  propensity score matching, caliper, Monte Carlo experiment, finite sample properties 
JEL:  C14 C21 C52 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:war:wpaper:201213&r=ecm 
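The caliper mechanism itself is easy to state in code. Below is a hedged sketch with invented scores and outcomes, not the simulation design of the paper: nearest-neighbour matching on the propensity score, with treated units dropped when no control lies within the caliper — the discarding that drives the finite-sample bias the abstract documents.

```python
def att_caliper(treated, controls, caliper):
    """Nearest-neighbour propensity score matching with replacement and a
    caliper. `treated` and `controls` are lists of (score, outcome) pairs;
    treated units with no control within the caliper are dropped.
    Returns (ATT over matched pairs, number of matched treated units)."""
    diffs = []
    for ps_t, y_t in treated:
        ps_c, y_c = min(controls, key=lambda c: abs(c[0] - ps_t))
        if abs(ps_c - ps_t) <= caliper:
            diffs.append(y_t - y_c)
    if not diffs:
        return None, 0
    return sum(diffs) / len(diffs), len(diffs)

# Hypothetical data: the third treated unit has no control within 0.05
treated = [(0.30, 5.0), (0.50, 6.0), (0.90, 10.0)]
controls = [(0.31, 4.0), (0.52, 5.0), (0.60, 7.0)]
att, n_matched = att_caliper(treated, controls, caliper=0.05)
```

Dropping unmatched treated units changes the estimand's effective population, which is one channel for the bias relative to plain one-to-one matching.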
By:  Valentino Dardanoni (University of Palermo); Giuseppe De Luca (ISFOL); Salvatore Modica (University of Palermo); Franco Peracchi (Tor Vergata University and EIEF) 
Abstract:  This paper considers estimation of a linear regression model using data where some covariate values are missing but imputations are available to fill in the missing values. The availability of imputations generates a trade-off between bias and precision in the estimators of the regression parameters: the complete cases are often too few, so precision is lost, but filling in the missing values with imputations may lead to bias. We provide the new Stata command gmi, which handles this bias-precision trade-off using either model reduction or model averaging techniques in the context of the generalized missing-indicator approach recently proposed by Dardanoni et al. (2011). If multiple imputations are available, our gmi command can also be combined with the built-in Stata prefix mi estimate to account for the extra variability due to the imputation process. The gmi command is illustrated with an empirical application which investigates the relationship between an objective health indicator and a set of socio-demographic and economic covariates affected by substantial item nonresponse. 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:eie:wpaper:1111&r=ecm 
By:  Angelo Marsiglia Fasolo 
Abstract:  This paper compares the properties of two particle filters – the Bootstrap Filter and the Auxiliary Particle Filter – applied to the computation of the likelihood of artificial data simulated from a basic DSGE model with nominal and real rigidities. Particle filters are compared in terms of speed, quality of the approximation of the probability density function of the data, and tracking of state variables. Results show that there is a case for the use of the Auxiliary Particle Filter only when the researcher uses a large number of observable variables and the number of particles used to characterize the likelihood is relatively low. Simulations also show that the largest gains in tracking state variables in the model are found when the number of particles is between 20,000 and 30,000, suggesting an upper bound for this number. 
Date:  2012–06 
URL:  http://d.repec.org/n?u=RePEc:bcb:wpaper:281&r=ecm 
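The bootstrap filter compared above is simple enough to sketch in full. This is a toy linear-Gaussian state-space example, not the paper's DSGE application; the AR coefficient, noise scales, particle count, and seeds are all invented for illustration.

```python
import math
import random

def bootstrap_filter(ys, n_part=500, phi=0.9, sw=1.0, sv=0.5, seed=1):
    """Bootstrap particle filter for x_t = phi*x_{t-1} + N(0, sw^2),
    y_t = x_t + N(0, sv^2). Returns filtered means and the log-likelihood."""
    rng = random.Random(seed)
    parts = [rng.gauss(0.0, sw) for _ in range(n_part)]
    loglik, means = 0.0, []
    for y in ys:
        # propagate through the state equation (the "bootstrap" proposal)
        parts = [phi * p + rng.gauss(0.0, sw) for p in parts]
        # weight by the measurement density (normal kernel, constant added back below)
        ws = [math.exp(-0.5 * ((y - p) / sv) ** 2) for p in parts]
        tot = sum(ws)
        loglik += math.log(tot / (n_part * sv * math.sqrt(2 * math.pi)))
        means.append(sum(w * p for w, p in zip(ws, parts)) / tot)
        # multinomial resampling
        parts = rng.choices(parts, weights=ws, k=n_part)
    return means, loglik

# Simulate a short path from the same (hypothetical) model, then filter it
sim = random.Random(7)
true_x, ys, x = [], [], 0.0
for _ in range(50):
    x = 0.9 * x + sim.gauss(0.0, 1.0)
    true_x.append(x)
    ys.append(x + sim.gauss(0.0, 0.5))
means, loglik = bootstrap_filter(ys)
```

The auxiliary particle filter differs in that it pre-weights particles by a prediction of the next observation before propagating, which mainly pays off when the measurement is very informative relative to the number of particles — consistent with the abstract's finding.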
By:  Jennifer L. Castle (Institute for New Economic Thinking, Oxford Martin School, University of Oxford, UK); Jurgen A. Doornik (Institute for New Economic Thinking, Oxford Martin School, University of Oxford, UK); David F. Hendry (Institute for New Economic Thinking, Oxford Martin School, University of Oxford, UK); Ragnar Nymoen (Economics Department, Oslo University, Norway) 
Abstract:  Many economic models (such as the New Keynesian Phillips curve, NKPC) include expected future values, often estimated after replacing the expected value by the actual future outcome, using Instrumental Variables or the Generalized Method of Moments. Although crises, breaks and regime shifts are relatively common, the underlying theory does not allow for their occurrence. We show the consequences of breaks in data processes for such models, and propose an impulse-indicator saturation test of such specifications, applied to US and Euro-area NKPCs. 
Keywords:  Testing invariance; Structural breaks; Expectations; Impulse-indicator saturation; New Keynesian Phillips curve 
JEL:  C5 E3 
Date:  2012–07 
URL:  http://d.repec.org/n?u=RePEc:rim:rimwps:50_12&r=ecm 
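The core of impulse-indicator saturation can be conveyed in a toy location model (a deliberately simplified sketch, not the paper's test: the critical value, the split-half scheme, and the data are assumptions). Indicators saturate one half of the sample at a time, so the mean and scale are fitted from the other half; indicators whose "t-ratio" exceeds the critical value are retained, and the union over both halves flags the breaks.

```python
import statistics

def iis_flag(y, crit=2.5):
    """Split-half impulse-indicator saturation for a mean-only model.
    For each half, the mean and standard deviation come from the *other*
    half (the saturated half is dummied out); observations with a large
    standardized deviation are flagged. Returns the union of flags."""
    n = len(y)
    halves = (range(0, n // 2), range(n // 2, n))
    flagged = set()
    for h, other in ((halves[0], halves[1]), (halves[1], halves[0])):
        clean = [y[i] for i in other]
        mu = statistics.mean(clean)
        s = statistics.stdev(clean)
        for i in h:
            if abs(y[i] - mu) / s > crit:
                flagged.add(i)
    return flagged

# Hypothetical series: near zero except one large impulse at index 2
y = [0.1, -0.2, 10.0, 0.15, -0.1, 0.05, -0.05, 0.1, 0.0, -0.15]
outliers = iis_flag(y)
robust_mean = statistics.mean(v for i, v in enumerate(y) if i not in outliers)
```

In a regression context the same logic applies with indicator dummies added block by block to the full specification; the retained indicators then serve as the basis of the invariance test.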
By:  Bernardi, Mauro 
Abstract:  Finite mixtures of Skew distributions have become increasingly popular in the last few years as a flexible tool for handling data displaying several different characteristics, such as multimodality, asymmetry and fat tails. Examples of such data can be found in financial and actuarial applications as well as in biological and epidemiological analyses. In this paper we show that a convex linear combination of multivariate Skew Normal mixtures can be represented as a finite mixture of univariate Skew Normal distributions. This result can be useful in modeling portfolio returns where the evaluation of extremal events is of great interest. We provide analytical formulas for different risk measures, such as the Value-at-Risk and the Expected Shortfall probability. 
Keywords:  Finite mixtures; Skew Normal distributions; Value-at-Risk; Expected Shortfall probability 
JEL:  C16 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:39828&r=ecm 
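Where the paper derives analytical risk-measure formulas, a Monte Carlo version is easy to sketch for a univariate Skew Normal mixture (the mixture weights and component parameters below are hypothetical, and sampling uses the standard Azzalini representation, not the paper's analytical results).

```python
import math
import random

def rskewnorm(rng, xi, omega, alpha):
    """Draw from a univariate Skew Normal via the Azzalini representation:
    Z = delta*|U0| + sqrt(1 - delta^2)*U1, delta = alpha/sqrt(1 + alpha^2)."""
    delta = alpha / math.sqrt(1.0 + alpha * alpha)
    u0, u1 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
    return xi + omega * (delta * abs(u0) + math.sqrt(1.0 - delta * delta) * u1)

def mixture_var_es(weights, xis, omegas, alphas, p=0.05, n=20_000, seed=3):
    """Monte Carlo Value-at-Risk and Expected Shortfall at tail level p
    for returns drawn from a finite mixture of Skew Normals."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        k = rng.choices(range(len(weights)), weights=weights)[0]
        draws.append(rskewnorm(rng, xis[k], omegas[k], alphas[k]))
    draws.sort()
    cut = int(p * n)
    var_ = -draws[cut]             # loss at the p-quantile
    es = -sum(draws[:cut]) / cut   # mean loss beyond the VaR
    return var_, es

# Hypothetical two-component portfolio: symmetric body plus left-skewed tail
var_, es = mixture_var_es(weights=[0.8, 0.2], xis=[0.0, -1.0],
                          omegas=[1.0, 2.0], alphas=[0.0, -3.0])
```

Expected Shortfall always (weakly) exceeds VaR at the same level, since it averages the losses beyond the VaR cutoff; the simulation respects that ordering by construction.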
By:  Sean Muller (SALDRU, School of Economics, University of Cape Town) 
Abstract:  Reichenbach's 'principle of the common cause' is a foundational assumption of some important recent contributions to quantitative social science methodology, but no similar principle appears in econometrics. Reiss (2005) has argued that the principle is necessary for instrumental variables methods in econometrics, and Pearl (2009) builds a framework using it that he proposes as a means of resolving an important methodological dispute among econometricians. We aim to show, through analysis of the main problem instrumental variables methods are used to resolve, that the relationship of the principle to econometric methods is more nuanced than previous work implies, but that the principle may nevertheless make a valuable contribution to the coherence and validity of existing methods. 
Keywords:  Reichenbach's principle, econometrics, causality 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:ldr:wpaper:85&r=ecm 