on Econometrics |
By: | Dinghai Xu (Department of Economics, University of Waterloo) |
Abstract: | This paper investigates an efficient estimation method for a class of switching regressions based on the characteristic function (CF). We show that with the exponential weighting function, the CF-based estimator can be obtained by minimizing a closed-form distance measure. Due to the availability of the analytical structure of the asymptotic covariance, an iterative estimation procedure is developed involving the minimization of a precision measure of the asymptotic covariance matrix. Numerical examples are illustrated via a set of Monte Carlo experiments examining the implementability, finite-sample properties and efficiency of the proposed estimator.
Keywords: | Switching Regression model; Characteristic Function; Integrated Squared Error; Gaussian Mixtures.
JEL: | E50 E61 |
Date: | 2009–04 |
URL: | http://d.repec.org/n?u=RePEc:wat:wpaper:0903&r=ecm |
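As a rough illustration of the CF-based distance idea (a sketch only, not the paper's estimator: the two-component Gaussian mixture, the grid approximation to the weighted integral, and all function names are my own assumptions), the empirical CF can be matched to a model CF under an exponential weight:

```python
import numpy as np

def ecf(t, x):
    """Empirical characteristic function of sample x at grid points t."""
    return np.exp(1j * np.outer(t, x)).mean(axis=1)

def mixture_cf(t, p, mu1, s1, mu2, s2):
    """CF of a two-component Gaussian mixture."""
    cf1 = np.exp(1j * t * mu1 - 0.5 * (s1 * t) ** 2)
    cf2 = np.exp(1j * t * mu2 - 0.5 * (s2 * t) ** 2)
    return p * cf1 + (1 - p) * cf2

def ise_distance(theta, x, t):
    """Integrated squared error between empirical and model CF,
    with exponential weight exp(-t^2), approximated on the grid t."""
    p, mu1, s1, mu2, s2 = theta
    w = np.exp(-t ** 2)
    diff = ecf(t, x) - mixture_cf(t, p, mu1, s1, mu2, s2)
    return np.sum(w * np.abs(diff) ** 2) * (t[1] - t[0])

rng = np.random.default_rng(0)
# Sample from 0.6*N(0,1) + 0.4*N(4,1)
x = np.where(rng.random(2000) < 0.6,
             rng.normal(0.0, 1.0, 2000), rng.normal(4.0, 1.0, 2000))
t = np.linspace(-5, 5, 201)
d_true = ise_distance([0.6, 0.0, 1.0, 4.0, 1.0], x, t)  # distance at true parameters
d_bad = ise_distance([0.6, 1.5, 1.0, 2.5, 1.0], x, t)   # distance at mislocated means
```

Minimizing `ise_distance` over theta would then give a CF-based estimate; the closed-form distance the paper derives avoids the grid approximation used here.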
By: | Charalambos G. Tsangarides; Alin Mirestean; Huigang Chen |
Abstract: | Bayesian Model Averaging (BMA) provides a coherent mechanism to address the problem of model uncertainty. In this paper we extend the BMA framework to panel data models where the lagged dependent variable as well as endogenous variables appear as regressors. We propose a Limited Information Bayesian Model Averaging (LIBMA) methodology and then test it using simulated data. Simulation results suggest that asymptotically our methodology performs well both in Bayesian model selection and averaging. In particular, LIBMA recovers the data generating process very well, with high posterior inclusion probabilities for all the relevant regressors, and parameter estimates very close to the true values. These findings suggest that our methodology is well suited for inference in dynamic panel data models with short time periods in the presence of endogenous regressors under model uncertainty. |
Date: | 2009–04–17 |
URL: | http://d.repec.org/n?u=RePEc:imf:imfwpa:09/74&r=ecm |
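As a toy sketch of the general BMA mechanics (not the LIBMA methodology itself: this uses a static linear model, equal model priors, and the BIC approximation to the marginal likelihood, all my own simplifications):

```python
import numpy as np
from itertools import combinations

def ols_bic(X, y):
    """BIC of an OLS fit with intercept, used as a rough -2*log marginal likelihood."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    sigma2 = resid @ resid / n
    return n * np.log(sigma2) + Xd.shape[1] * np.log(n)

def bma_inclusion(X, y):
    """Posterior inclusion probabilities over all subsets of the columns of X,
    with equal model priors and BIC-approximated marginal likelihoods."""
    k = X.shape[1]
    models, weights = [], []
    for r in range(k + 1):
        for subset in combinations(range(k), r):
            Xs = X[:, list(subset)] if subset else np.empty((len(y), 0))
            models.append(set(subset))
            weights.append(np.exp(-0.5 * ols_bic(Xs, y)))
    weights = np.array(weights) / np.sum(weights)
    return np.array([sum(w for m, w in zip(models, weights) if j in m)
                     for j in range(k)])

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 3))
y = 2.0 * X[:, 0] + rng.normal(size=300)   # only regressor 0 is relevant
pip = bma_inclusion(X, y)                  # posterior inclusion probabilities
```

In a setting like the paper's, the relevant regressor should receive a posterior inclusion probability near one and the irrelevant ones should be penalized away.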
By: | Mutl, Jan (Department of Economics, Institute for Advanced Studies, Vienna, Austria) |
Abstract: | I consider a panel vector-autoregressive model with cross-sectional dependence of the disturbances characterized by a spatial autoregressive process. I propose a three-step estimation procedure. Its first step is an instrumental variable estimation that ignores the spatial correlation. In the second step, the estimated disturbances are used in a multivariate spatial generalized moments estimation to infer the degree of spatial correlation. The final step of the procedure uses transformed data and applies standard techniques for estimation of panel vector-autoregressive models. I compare the small-sample performance of various estimation strategies in a Monte Carlo study. |
Keywords: | Spatial PVAR, Multivariate dynamic panel data model, Spatial GM, Spatial Cochrane-Orcutt transformation, Constrained maximum likelihood estimation |
JEL: | C13 C31 C33 |
Date: | 2009–03 |
URL: | http://d.repec.org/n?u=RePEc:ihs:ihsesp:237&r=ecm |
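The spatial Cochrane-Orcutt idea underlying the final step can be sketched in a few lines (a toy illustration with a hypothetical ring-shaped weight matrix, not the paper's estimator): premultiplying by (I - rho*W) undoes a spatial AR(1) disturbance.

```python
import numpy as np

# Toy spatial weight matrix: row-normalized ring of N units.
N = 5
W = np.zeros((N, N))
for i in range(N):
    W[i, (i - 1) % N] = W[i, (i + 1) % N] = 0.5

rho = 0.4
rng = np.random.default_rng(2)
e = rng.normal(size=N)
# Spatial AR(1) disturbance: u = rho*W u + e  =>  u = (I - rho*W)^{-1} e
u = np.linalg.solve(np.eye(N) - rho * W, e)

# Spatial Cochrane-Orcutt transform recovers the uncorrelated innovations.
u_star = (np.eye(N) - rho * W) @ u
```

In practice rho is unknown and is replaced by the spatial GM estimate from the second step; here the transform is exact because the true rho is used.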
By: | Yingcun Xia; Wolfgang Härdle; Oliver Linton |
Abstract: | In semiparametric models, a common approach is to under-smooth the nonparametric functions so that estimators of the finite-dimensional parameters can achieve root-n consistency. As we show, the requirement of under-smoothing may result from inefficient estimation methods or from technical difficulties. Based on a local linear kernel smoother, we propose an estimation method for the single-index model that does not require under-smoothing. Under some conditions, our estimator of the single index is asymptotically normal and most efficient in the semiparametric sense. Moreover, we derive higher-order expansions for our estimator and use them to define an optimal bandwidth for the purposes of index estimation. As a result we obtain a practically more relevant method, and we show its superior performance in a variety of applications.
Keywords: | ADE, Asymptotics, Bandwidth, MAVE method, Semi-parametric efficiency |
JEL: | C00 C13 C14 |
Date: | 2009–05 |
URL: | http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2009-028&r=ecm |
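A minimal local linear kernel smoother of the kind the method builds on might look as follows (a sketch under my own choices of Gaussian kernel and bandwidth; the single-index machinery itself is not shown):

```python
import numpy as np

def local_linear(x0, x, y, h):
    """Local linear kernel estimate of E[y | x = x0] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)        # kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])  # local intercept and slope
    WX = X * w[:, None]
    beta = np.linalg.solve(X.T @ WX, WX.T @ y)    # weighted least squares
    return beta[0]                                # intercept = fit at x0

rng = np.random.default_rng(3)
x = rng.uniform(-2, 2, 1000)
y = np.sin(x) + 0.1 * rng.normal(size=1000)
fit = local_linear(0.5, x, y, h=0.2)              # should be close to sin(0.5)
```

The bandwidth h controls the bias-variance trade-off that the paper's higher-order expansions are used to optimize.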
By: | Ioannis Kasparis; Peter C. B. Phillips |
Abstract: | Linear cointegration is known to have the important property of invariance under temporal translation. The same property is shown not to apply for nonlinear cointegration. The requisite limit theory involves sample covariances of integrable transformations of non-stationary sequences and time translated sequences, allowing for the presence of a bandwidth parameter so as to accommodate kernel regression. The theory is an extension of Wang and Phillips (2008) and is useful for the analysis of nonparametric regression models with a misspecified lag structure and in situations where temporal aggregation issues arise. The limit properties of the Nadaraya-Watson (NW) estimator for cointegrating regression under misspecified lag structure are derived, showing the NW estimator to be inconsistent with a "pseudo-true function" limit that is a local average of the true regression function. In this respect nonlinear cointegrating regression differs importantly from conventional linear cointegration which is invariant to time translation. When centred on the pseudo-function and appropriately scaled, the NW estimator still has a mixed Gaussian limit distribution. The convergence rates are the same as those obtained under correct specification but the variance of the limit distribution is larger. Some applications of the limit theory to non-linear distributed lag cointegrating regression are given and the practical import of the results for index models, functional regression models, and temporal aggregation are discussed. |
Keywords: | Dynamic misspecification, Functional regression, Integrable function, Integrated process, Local time, Misspecification, Mixed normality, Nonlinear cointegration, Nonparametric regression |
Date: | 2009–05 |
URL: | http://d.repec.org/n?u=RePEc:ucy:cypeua:2-2009&r=ecm |
By: | Dominique Guegan (Centre d'Economie de la Sorbonne - Paris School of Economics); Zhiping Lu (Centre d'Economie de la Sorbonne et East China Normal University) |
Abstract: | Long memory processes have been extensively studied over the past decades. When dealing with financial and economic data, seasonality and time-varying long-range dependence can often be observed, and thus some kind of non-stationarity can exist inside financial data sets. To take this kind of phenomenon into account, we propose a new class of stochastic processes: the locally stationary k-factor Gegenbauer process. We describe a procedure for consistently estimating the time-varying parameters by applying the discrete wavelet packet transform (DWPT). The robustness of the algorithm is investigated through a simulation study. An application based on the error correction term of a fractional cointegration analysis of the Nikkei Stock Average 225 index is proposed.
Keywords: | Discrete wavelet packet transform, Gegenbauer process, Nikkei Stock Average 225 index, non-stationarity, ordinary least square estimation. |
JEL: | C13 C14 C15 C22 C63 G15 |
Date: | 2009–03 |
URL: | http://d.repec.org/n?u=RePEc:mse:cesdoc:09015&r=ecm |
By: | T. W. Anderson (Department of Statistics and Department of Economics, Stanford University); Naoto Kunitomo (Faculty of Economics, University of Tokyo); Yukitoshi Matsushita (JSPS and Graduate School of Economics, University of Tokyo) |
Abstract: | When an econometric structural equation includes two endogenous variables whose coefficients are normalized so that their sum of squares is 1, it is natural to express them as the sine and cosine of an angle. The Limited Information Maximum Likelihood (LIML) estimator of this angle when the error covariance matrix is known has constant variance. Of all estimators with constant variance, the LIML estimator minimizes the variance. Competing estimators, such as the Two-Stage Least Squares estimator, have much larger variance for some values of the parameter. The effect of weak instruments is studied.
Date: | 2009–05 |
URL: | http://d.repec.org/n?u=RePEc:tky:fseres:2009cf619&r=ecm |
By: | Muhammad Akram (Department of Econometrics and Business Statistics); Rob J Hyndman (Department of Econometrics and Business Statistics,Monash University); J. Keith Ord (McDonough School of Business,Georgetown University) |
Abstract: | The most common forecasting methods in business are based on exponential smoothing and the most common time series in business are inherently non-negative. Therefore it is of interest to consider the properties of the potential stochastic models underlying exponential smoothing when applied to non-negative data. We explore exponential smoothing state space models for non-negative data under various assumptions about the innovations, or error, process. We first demonstrate that prediction distributions from some commonly used state space models may have an infinite variance beyond a certain forecasting horizon. For multiplicative error models which do not have this flaw, we show that sample paths will converge almost surely to zero even when the error distribution is non-Gaussian. We propose a new model with similar properties to exponential smoothing, but which does not have these problems, and we develop some distributional properties for our new model. We then explore the implications of our results for inference, and compare the short-term forecasting performance of the various models using data on the weekly sales of over three hundred items of costume jewelry. The main findings of the research are that the Gaussian approximation is adequate for estimation and one-step-ahead forecasting. However, as the forecasting horizon increases, the approximate prediction intervals become increasingly problematic. When the model is to be used for simulation purposes, a suitably specified scheme must be employed. |
Keywords: | forecasting; time series; exponential smoothing; positive-valued processes; seasonality; state space models. |
JEL: | C1 C5 |
Date: | 2008–07 |
URL: | http://d.repec.org/n?u=RePEc:gwc:wpaper:2008-003&r=ecm |
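The level recursion behind simple exponential smoothing, the building block of these state space models, can be sketched as follows (my own minimal implementation, ignoring the innovations-distribution issues the paper studies):

```python
import numpy as np

def ses_forecast(y, alpha):
    """Simple exponential smoothing: level recursion l_t = alpha*y_t + (1-alpha)*l_{t-1};
    the final level is the point forecast for all future horizons."""
    level = y[0]
    for obs in y[1:]:
        level = alpha * obs + (1 - alpha) * level
    return level

y = np.array([10.0, 12.0, 11.0, 13.0, 12.5])  # toy non-negative series
f = ses_forecast(y, alpha=0.3)
```

The paper's point is about what stochastic model generates such data: additive-error state space forms can produce negative sample paths and infinite prediction variance at long horizons, which the point-forecast recursion above does not reveal.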
By: | Sergio Urzua (Northwestern University); James J. Heckman (University of Chicago & University College Dublin) |
Abstract: | This paper compares the economic questions addressed by instrumental variables estimators with those addressed by structural approaches. We discuss Marschak's Maxim: estimators should be selected on the basis of their ability to answer well-posed economic problems with minimal assumptions. A key identifying assumption that allows structural methods to be more informative than IV can be tested with data and does not have to be imposed. |
Keywords: | Instrumental Variables, Structural Approaches, Marschak's Maxim |
Date: | 2009–03–09 |
URL: | http://d.repec.org/n?u=RePEc:ucd:wpaper:200906&r=ecm |
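The contrast between OLS and a simple instrumental variables estimator can be sketched with simulated data (a textbook-style illustration of my own, not the paper's analysis):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 5000
z = rng.normal(size=n)                   # instrument: correlated with x, not with u
u = rng.normal(size=n)                   # unobserved confounder
x = 0.8 * z + u + rng.normal(size=n)     # endogenous regressor
y = 1.5 * x + u + rng.normal(size=n)     # structural coefficient: 1.5

beta_ols = (x @ y) / (x @ x)             # biased upward by the confounder
beta_iv = (z @ y) / (z @ x)              # simple IV (Wald) estimator
```

IV recovers the structural coefficient here, but, as the paper stresses, which estimand an IV identifies in richer models with heterogeneous effects is exactly where the structural-versus-IV comparison bites.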
By: | Lanouar Charfeddine (OEP - Université de Marne-la-Vallée); Dominique Guegan (Centre d'Economie de la Sorbonne - Paris School of Economics) |
Abstract: | Are structural break models true switching models or long memory processes? The answer to this question remains ambiguous, and many papers in recent years have dealt with this problem. For instance, Diebold and Inoue (2001) and Granger and Hyung (2004) show, under specific conditions, that switching models and long memory processes can be easily confused. In this paper, using several generating models, such as the mean-plus-noise model, the STOchastic Permanent BREAK model, the Markov switching model, the TAR model, the sign model and the Structural CHange (SCH) model, and several estimation techniques, such as the GPH technique, the Exact Local Whittle (ELW) method and the Wavelet method, we show that, while the answer is quite simple in some cases, it is more ambiguous in others. Using French and American inflation rates, we show that these series cannot be characterized by the same class of models. The main result of this study suggests that estimating the long memory parameter without taking into account the existence of breaks in the data sets may lead to misspecification and to overestimation of the true parameter.
Keywords: | Structural breaks models, Spurious long memory behavior, inflation series. |
JEL: | C13 C32 E3 |
Date: | 2009–04 |
URL: | http://d.repec.org/n?u=RePEc:mse:cesdoc:09022&r=ecm |
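The GPH log-periodogram estimator mentioned above can be sketched as follows (a bare-bones version of my own; the bandwidth choice m = sqrt(n) is one common rule of thumb, not the paper's):

```python
import numpy as np

def gph_estimate(x, m=None):
    """GPH log-periodogram regression estimate of the memory parameter d."""
    n = len(x)
    if m is None:
        m = int(n ** 0.5)
    # Periodogram at the first m Fourier frequencies.
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    fft = np.fft.fft(x - x.mean())
    I = np.abs(fft[1:m + 1]) ** 2 / (2 * np.pi * n)
    # Regress log-periodogram on -2*log(2*sin(freq/2)); the slope estimates d.
    reg = -2 * np.log(2 * np.sin(freqs / 2))
    reg_c = reg - reg.mean()
    log_I = np.log(I)
    return reg_c @ (log_I - log_I.mean()) / (reg_c @ reg_c)

rng = np.random.default_rng(4)
x = rng.normal(size=4096)       # white noise: true d = 0
d_hat = gph_estimate(x)
```

Applying this estimator to a series with unmodeled breaks is precisely the situation where, per the paper, d is biased upward.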
By: | Antonio Lijoi; Igor Pruenster; Stephen G. Walker |
Abstract: | We consider discrete nonparametric priors which induce Gibbs-type exchangeable random partitions and investigate their posterior behavior in detail. In particular, we deduce conditional distributions and the corresponding Bayesian nonparametric estimators, which can be readily exploited for predicting various features of additional samples. The results provide useful tools for genomic applications where prediction of future outcomes is required. |
Keywords: | Bayesian nonparametric inference; Exchangeable random partitions; Generalized factorial coefficients; Generalized gamma process; Poisson-Dirichlet process; Population genetics.
Date: | 2008–06 |
URL: | http://d.repec.org/n?u=RePEc:icr:wpmath:06-2008&r=ecm |
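The simplest Gibbs-type partition is the one induced by the Dirichlet process, whose predictive scheme is the Chinese restaurant process; as a sketch (my own toy code, not the paper's general results):

```python
import numpy as np

def crp_sample(n, theta, rng):
    """Chinese restaurant process: sample a random partition of n items
    (the Dirichlet-process special case of a Gibbs-type partition).
    Item i+1 joins an existing block with probability proportional to its size,
    or opens a new block with probability proportional to theta.
    Returns the list of block sizes."""
    counts = []
    for _ in range(n):
        probs = np.array(counts + [theta], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)      # new block
        else:
            counts[k] += 1        # join existing block
    return counts

rng = np.random.default_rng(7)
counts = crp_sample(200, 5.0, rng)
```

The predictive probabilities inside the loop are exactly the kind of conditional structure the paper derives for the broader Gibbs-type class.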
By: | Richard T. Baillie; Claudio Morana
Abstract: | Previous models of monthly CPI inflation time series have focused on possible regime shifts, non-linearities and the feature of long memory. This paper proposes a new time series model, named Adaptive ARFIMA, which appears well suited to describe inflation and potentially other economic time series data. The Adaptive ARFIMA model includes a time-dependent intercept term which follows a Flexible Fourier Form. The model appears capable of successfully dealing with various forms of breaks and discontinuities in the conditional mean of a time series. Simulation evidence justifies estimation by approximate MLE and model specification through robust inference based on QMLE. The Adaptive ARFIMA model, when supplemented with conditional variance models, is found to provide a good representation of the G7 monthly CPI inflation series.
Keywords: | ARFIMA; FIGARCH, long memory, structural change, inflation, G7. |
JEL: | C15 C22 |
Date: | 2009–05 |
URL: | http://d.repec.org/n?u=RePEc:icr:wpmath:06-2009&r=ecm |
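A Flexible Fourier Form intercept of the kind described can be sketched as follows (a minimal illustration with hypothetical coefficient values; the ARFIMA dynamics themselves are not shown):

```python
import numpy as np

def fff_intercept(t, n, a0, coefs):
    """Flexible Fourier Form intercept:
    a0 + sum_k [ a_k*sin(2*pi*k*t/n) + b_k*cos(2*pi*k*t/n) ],
    a smooth time-varying mean able to mimic gradual breaks."""
    out = np.full_like(t, a0, dtype=float)
    for k, (a, b) in enumerate(coefs, start=1):
        out += a * np.sin(2 * np.pi * k * t / n) + b * np.cos(2 * np.pi * k * t / n)
    return out

n = 200
t = np.arange(n)
# Hypothetical coefficients (a_k, b_k) for the first two frequencies.
mu_t = fff_intercept(t, n, a0=1.0, coefs=[(0.5, -0.3), (0.2, 0.1)])
```

In the Adaptive ARFIMA setup, a deterministic intercept path like `mu_t` replaces the constant mean, with the long-memory dynamics applied to the deviations from it.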
By: | Antonio Lijoi; Igor Pruenster; Stephen G. Walker |
Abstract: | Recently, James [15, 16] has derived important results for various models in Bayesian nonparametric inference. In particular, he defined a spatial version of neutral to the right processes and derived their posterior distribution. Moreover, he obtained the posterior distribution for an intensity or hazard rate modeled as a mixture under a general multiplicative intensity model. His proofs rely on the so-called Bayesian Poisson partition calculus. Here we provide new proofs based on an alternative technique.
Keywords: | Bayesian Nonparametrics; Completely random measure; Hazard rate; Neutral to the right prior; Multiplicative intensity model. |
Date: | 2008–06 |
URL: | http://d.repec.org/n?u=RePEc:icr:wpmath:05-2008&r=ecm |
By: | Ahlgren, Niklas (Hanken School of Economics); Juselius, Mikael (University of Helsinki) |
Abstract: | Many economic events involve initial observations that substantially deviate from long-run steady state. Initial conditions of this type have been found to impact diversely on the power of univariate unit root tests, whereas the impact on multivariate tests is largely unknown. This paper investigates the impact of the initial condition on tests for cointegration rank. We compare the local power of the widely used likelihood ratio (LR) test with the local power of a test based on the eigenvalues of the companion matrix. We find that the power of the LR test is increasing in the magnitude of the initial condition, whereas the power of the other test is decreasing. The behaviour of the tests is investigated in an application to price convergence. |
Keywords: | asymptotic local power; cointegration; companion matrix; convergence; initial condition; likelihood ratio test; unit root |
Date: | 2009–05–13 |
URL: | http://d.repec.org/n?u=RePEc:hhb:hanken:0539&r=ecm |
By: | Frisén, Marianne (Statistical Research Unit, Department of Economics, School of Business, Economics and Law, Göteborg University); Andersson, Eva (Statistical Research Unit, Department of Economics, School of Business, Economics and Law, Göteborg University); Schiöler, Linus (Statistical Research Unit, Department of Economics, School of Business, Economics and Law, Göteborg University) |
Abstract: | Multivariate surveillance is of interest in many areas such as industrial production, bioterrorism detection, spatial surveillance, and financial transaction strategies. Some of the suggested approaches to multivariate surveillance have been multivariate counterparts to the univariate Shewhart, EWMA, and CUSUM methods. Our emphasis is on the special challenges of evaluating multivariate surveillance methods. Some new measures are suggested and the properties of several measures are demonstrated by applications to various situations. It is demonstrated that zero-state and steady-state ARL, which are widely used in univariate surveillance, should be used with care in multivariate surveillance. |
Keywords: | average run length; delay; EWMA; false alarms; FDR; performance metrics; predictive value; steady state; zero state |
JEL: | C10 |
Date: | 2009–05–15 |
URL: | http://d.repec.org/n?u=RePEc:hhs:gunsru:2009_001&r=ecm |
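A univariate EWMA chart, one of the building blocks discussed, can be sketched as follows (my own minimal version with exact time-varying control limits; the multivariate extensions and the evaluation measures are not shown):

```python
import numpy as np

def ewma_alarm(x, lam=0.2, L=3.0):
    """EWMA surveillance of a stream of N(0,1) observations.
    Returns the index (1-based) of the first alarm, or None if no alarm.
    Control limits use the exact time-varying EWMA standard deviation."""
    z = 0.0
    for t, obs in enumerate(x, start=1):
        z = lam * obs + (1 - lam) * z
        sd = np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        if abs(z) > L * sd:
            return t
    return None

rng = np.random.default_rng(5)
# Process shifts from N(0,1) to N(2,1) after observation 20.
shifted = np.concatenate([rng.normal(size=20), rng.normal(2.0, 1.0, size=30)])
alarm_time = ewma_alarm(shifted)
```

The gap between the alarm time and the true change point is the delay; the paper's argument is that summarizing such delays by zero-state or steady-state ARL alone can be misleading in the multivariate case.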
By: | Jean-Pierre H. Dubé; Jeremy T. Fox; Che-Lin Su |
Abstract: | The widely-used estimator of Berry, Levinsohn and Pakes (1995) produces estimates of consumer preferences from a discrete-choice demand model with random coefficients, market-level demand shocks and endogenous prices. We derive numerical theory results characterizing the properties of the nested fixed point algorithm used to evaluate the objective function of BLP's estimator. We discuss problems with typical implementations, including cases that can lead to incorrect parameter estimates. As a solution, we recast estimation as a mathematical program with equilibrium constraints, which can be faster and which avoids the numerical issues associated with nested inner loops. The advantages are even more pronounced for forward-looking demand models where Bellman's equation must also be solved repeatedly. Several Monte Carlo and real-data experiments support our numerical concerns about the nested fixed point approach and the advantages of constrained optimization. |
JEL: | C01 C61 L0 |
Date: | 2009–05 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:14991&r=ecm |
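The nested inner loop in question is Berry's contraction mapping; for the plain logit special case it can be sketched as follows (a toy version of my own — the full BLP random-coefficients shares and the MPEC reformulation are not shown):

```python
import numpy as np

def logit_shares(delta):
    """Market shares from a plain logit with an outside good (delta_0 = 0)."""
    e = np.exp(delta)
    return e / (1 + e.sum())

def blp_contraction(s_obs, tol=1e-12, max_iter=1000):
    """Berry's contraction: iterate delta <- delta + log(s_obs) - log(s(delta))
    until predicted shares match observed shares."""
    delta = np.zeros_like(s_obs)
    for _ in range(max_iter):
        step = np.log(s_obs) - np.log(logit_shares(delta))
        delta = delta + step
        if np.max(np.abs(step)) < tol:
            break
    return delta

s_obs = np.array([0.3, 0.2, 0.1])   # observed inside-good shares (outside good: 0.4)
delta = blp_contraction(s_obs)
```

For plain logit the fixed point has the closed form delta_j = log(s_j / s_0), so the contraction can be checked directly; the paper's concern is that with random coefficients this inner loop must be solved to tight tolerance at every outer-loop trial parameter, which MPEC avoids.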
By: | Christophe Ley; Davy Paindaveine |
Abstract: | In recent years, the skew-normal models introduced by Azzalini (1985)—and their multivariate generalizations from Azzalini and Dalla Valle (1996)—have enjoyed remarkable success, although a substantial literature has reported that they exhibit, in the vicinity of symmetry, singular Fisher information matrices and stationary points in the profile log-likelihood function for skewness, with the usual unpleasant consequences for inference. It has been shown (DiCiccio and Monti 2004, 2009) that these singularities, in some specific parametric extensions of skew-normal models (such as the classes of skew-exponential or skew-t distributions), appear at skew-normal distributions only. Yet an important question remains open: in broader semiparametric models of skewed distributions (such as the general skew-symmetric and skew-elliptical ones), which symmetric kernels lead to such singularities? The present paper provides an answer to this question. In very general (possibly multivariate) skew-symmetric models, we characterize, for each possible value of the rank of the Fisher information matrix, the class of symmetric kernels achieving the corresponding rank. Our results show that, for strictly multivariate skew-symmetric models, Gaussian kernels are not the only ones yielding singular Fisher information matrices. In contrast, we prove that systematic stationary points in the profile log-likelihood functions are obtained for (multi)normal kernels only. Finally, we also discuss the implications of such singularities for inference.
Keywords: | Characterization property, profile likelihood, reparameterization, singular Fisher information matrix, skewness, skew-normal distributions |
Date: | 2009 |
URL: | http://d.repec.org/n?u=RePEc:eca:wpaper:2009_017&r=ecm |
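Azzalini's univariate skew-normal density, the starting point of this literature, is f(x) = 2*phi(x)*Phi(alpha*x); a sketch (my own minimal code, scalar location and scale omitted):

```python
import numpy as np
from math import erf

def skew_normal_pdf(x, alpha):
    """Azzalini (1985) skew-normal density: 2 * phi(x) * Phi(alpha * x).
    alpha = 0 recovers the standard normal; the sign of alpha sets the skew."""
    x = np.asarray(x, dtype=float)
    phi = np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)
    Phi = 0.5 * (1.0 + np.vectorize(erf)(alpha * x / np.sqrt(2.0)))
    return 2.0 * phi * Phi

grid = np.linspace(-10, 10, 4001)
total = np.sum(skew_normal_pdf(grid, alpha=3.0)) * (grid[1] - grid[0])  # ~ 1
```

The singularity the paper studies arises because, at alpha = 0, the score with respect to alpha is proportional to a score direction already present in the symmetric part, collapsing the Fisher information.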