
on Econometrics 
By:  Dominique Guegan (Centre d'Economie de la Sorbonne - Paris School of Economics); Patrick Rakotomarolahy (Centre d'Economie de la Sorbonne) 
Abstract:  An empirical forecast accuracy comparison of the nonparametric method known as the multivariate nearest neighbor method with parametric VAR modelling is conducted on euro area GDP. Using both methods for nowcasting and forecasting GDP, through the estimation of economic indicators plugged into bridge equations, we obtain more accurate forecasts with the nearest neighbor method. We also prove the asymptotic normality of the multivariate k-nearest neighbor regression estimator for dependent time series, providing confidence intervals for point forecasts in time series. 
Keywords:  Forecast, economic indicators, GDP, Euro area, VAR, multivariate k-nearest neighbor regression, asymptotic normality. 
JEL:  C22 C53 E32 
Date:  2010–07 
URL:  http://d.repec.org/n?u=RePEc:mse:cesdoc:10065&r=ecm 
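The nearest-neighbor forecasting idea in the abstract above is easy to sketch. The following is an illustrative univariate version, not the authors' multivariate implementation: the function name, the Euclidean metric, and the default embedding dimension are assumptions made for the example.

```python
import math

def knn_forecast(series, k=3, d=2):
    """One-step-ahead nearest-neighbour forecast (illustrative sketch).

    Embed the series into vectors of d consecutive observations, find the
    k historical vectors closest (in Euclidean distance) to the latest one,
    and average the observations that followed each neighbour.
    """
    target = series[-d:]                       # most recent pattern
    candidates = []
    for t in range(d, len(series)):
        vec = series[t - d:t]                  # a historical pattern
        candidates.append((math.dist(vec, target), series[t]))
    candidates.sort(key=lambda pair: pair[0])  # nearest patterns first
    return sum(succ for _, succ in candidates[:k]) / k
```

On a series with a repeating pattern, the method finds the past occurrences of the current pattern and predicts their common successor.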
By:  Florens, JeanPierre; Johannes, Jan; Van Bellegem, Sébastien 
Abstract:  We consider the semiparametric regression X'β + φ(Z), where β and φ(·) are an unknown slope coefficient vector and function, and where the variables (X,Z) are endogenous. We propose necessary and sufficient conditions for the identification of the parameters in the presence of instrumental variables. We also focus on the estimation of β. An incorrect parameterization of φ may generally lead to an inconsistent estimator of β, whereas even consistent nonparametric estimators for φ imply a slow rate of convergence of the estimator of β. An additional complication is that the solution of the equation necessitates the inversion of a compact operator that has to be estimated nonparametrically. In general this inversion is not stable, so the estimation of β is ill-posed. In this paper, a √n-consistent estimator for β is derived under mild assumptions. One of these assumptions is the so-called source condition, which is explicitly interpreted in the paper. Finally we show that the estimator achieves the semiparametric efficiency bound, even if the model is heteroscedastic. Monte Carlo simulations demonstrate the reasonable performance of the estimation procedure in finite samples. 
JEL:  C14 C30 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:ide:wpaper:22798&r=ecm 
By:  Paulo M.M. Rodrigues; Antonio Rubia 
Abstract:  This paper discusses the asymptotic and finite-sample properties of CUSUM-based tests for detecting structural breaks in volatility in the presence of stochastic contamination, such as additive outliers or measurement errors. This analysis is particularly relevant for financial data, on which these tests are commonly used to detect variance breaks. In particular, we focus on the tests of Inclán and Tiao [IT] (1994) and Kokoszka and Leipus [KL] (1998, 2000), which have been used intensively in the applied literature. Our results extend to related procedures. We show that the asymptotic distribution of the IT test can be heavily affected by sample contamination, whereas the distribution of the KL test remains invariant. Furthermore, the break-point estimator of the KL test yields consistent estimates. In spite of the good large-sample properties of this test, large additive outliers tend to generate power distortions or wrong break-date estimates in small samples. 
JEL:  C12 C15 C52 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:ptu:wpaper:w201011&r=ecm 
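The IT statistic discussed above is a CUSUM of squared returns and takes only a few lines to compute. This sketch follows the standard textbook form, D_k = C_k/C_T − k/T with C_k the cumulative sum of squares; the scaling and the asymptotic 5% critical value (1.358, from the supremum of a Brownian bridge) are the usual ones, not values taken from the paper.

```python
import math

def inclan_tiao(returns):
    """Inclan-Tiao CUSUM-of-squares statistic and break-date estimate.

    D_k = C_k / C_T - k / T, where C_k is the cumulative sum of squared
    returns; the test statistic is sqrt(T/2) * max_k |D_k|.
    """
    T = len(returns)
    c, csum = 0.0, []
    for r in returns:
        c += r * r
        csum.append(c)
    d = [csum[k] / csum[-1] - (k + 1) / T for k in range(T)]
    k_star = max(range(T), key=lambda k: abs(d[k]))   # argmax |D_k|
    return math.sqrt(T / 2) * abs(d[k_star]), k_star + 1
```

Applied to a series whose volatility jumps mid-sample, the statistic exceeds the asymptotic critical value and the estimated break date falls at the true change point.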
By:  Florens, JeanPierre 
Abstract:  This paper gives a survey of econometric models characterized by a relation between observable and unobservable random elements, where the unobservable terms are assumed to be independent of another set of observable variables called instrumental variables. This kind of specification is useful for addressing questions of endogeneity or selection bias, for example. These models are treated nonparametrically, and in all the examples we consider, the functional parameter of interest is defined as the solution of a linear or nonlinear integral equation. The estimation procedure then requires solving a (generally ill-posed) inverse problem. We illustrate the main questions (construction of the equation, identification, numerical solution, asymptotic properties, selection of the regularization parameter) through the different models we present. 
Date:  2010–06 
URL:  http://d.repec.org/n?u=RePEc:ide:wpaper:22806&r=ecm 
By:  António Alberto Santos (GEMF/Faculdade de Economia, Universidade de Coimbra, Portugal) 
Abstract:  In this article we deal with the identification problem within the Dynamic Linear Models family and show that Bayesian estimation procedures handle these problems better than the traditional Maximum Likelihood approach. Using a Bayesian approach supported by Markov chain Monte Carlo techniques, we obtain the same results as the Maximum Likelihood approach for identifiable models, while for non-identifiable models we are able to estimate the parameters that are identifiable, as well as to pinpoint the troublesome parameters. Within the Bayesian approach, we also discuss the computational aspects, namely the ongoing debate between single-move and multi-move samplers. Our aim is to give a clear example of the benefits of adopting a Bayesian approach to the estimation of high-dimensional statistical models. 
Keywords:  Bayesian Statistics, DLM Models, Markov chain Monte Carlo, Maximum Likelihood, Model Identification. 
Date:  2010–06 
URL:  http://d.repec.org/n?u=RePEc:gmf:wpaper:201012&r=ecm 
By:  Nikolay Gospodinov; Raymond Kan; Cesare Robotti 
Abstract:  In this paper, we extend the results in Hansen (1982) regarding the asymptotic distribution of generalized method of moments (GMM) sample moment conditions. In particular, we show that the part of the scaled sample moment conditions that gives rise to degeneracy in the asymptotic normal distribution is T-consistent and has a nonstandard limiting distribution. We derive the asymptotic distribution for a given linear combination of the sample moment conditions and show how to conduct statistical inference. We demonstrate the finite-sample properties of the proposed asymptotic approximation using simulation. 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:fip:fedawp:201011&r=ecm 
By:  David Vincent (Hewlett Packard (UK)) 
Abstract:  Consistency of the maximum likelihood estimators for the parameters in the standard Tobit model relies heavily on the assumption of a normally distributed error term. The Box-Cox transformation is an obvious attempt to preserve normality when the data make this assumption questionable. This paper sets out an OPG version of an LM test for the null hypothesis of the standard Tobit model against the alternative of a more general nonlinear specification, as determined by the parameter of the Box-Cox transformation. Monte Carlo estimates of the rejection probabilities using first-order asymptotic and parametric bootstrap critical values are obtained, for sample sizes comparable to those used in practice. The results show that the LM test using bootstrap critical values has practically no size distortion, whereas with asymptotic critical values the empirical rejection probabilities are significantly larger than the nominal levels. A simple program which carries out this test using bootstrap critical values has also been written and can be run after the official Stata Tobit estimation command. 
Date:  2010–07–20 
URL:  http://d.repec.org/n?u=RePEc:boc:bost10:9&r=ecm 
By:  Florens, JeanPierre; Simoni, Anna 
Abstract:  We consider statistical linear inverse problems in Hilbert spaces of the type Ŷ = Kx + U, where we want to estimate the function x from indirect noisy functional observations Ŷ. In several applications the operator K has an inverse that is not continuous on the whole space of reference; this phenomenon is known as ill-posedness of the inverse problem. We use a Bayesian approach and a conjugate-Gaussian model. For a very general specification of the probability model, the posterior distribution of x is known to be inconsistent in a frequentist sense. Our first contribution consists in constructing a class of Gaussian prior distributions on x that shrink with the measurement error U; we show that, under mild conditions, the corresponding posterior distribution is consistent in a frequentist sense and converges at the optimal rate of contraction. Then, a class of posterior mean estimators for x is given. We propose an empirical Bayes procedure for selecting the estimator in this class that mimics the posterior mean with the smallest risk on the true x. 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:ide:wpaper:22814&r=ecm 
By:  Dominique Guegan (Centre d'Economie de la Sorbonne - Paris School of Economics); Zhiping Lu (School of Finance and Statistics - East China Normal University) 
Abstract:  Foreign exchange rates play an important role in international finance. This paper examines unit roots and the long-range dependence of 23 foreign exchange rates using Robinson's (1994) test, which is one of the most efficient tests for fractional orders of seasonal/cyclical long memory processes. Monte Carlo simulations are carried out to explore the accuracy of the test before implementing the empirical applications. 
Keywords:  Foreign exchange rate, long memory processes, Monte Carlo simulation, nonstationarity, test. 
JEL:  C12 C15 C22 
Date:  2010–06 
URL:  http://d.repec.org/n?u=RePEc:mse:cesdoc:10059&r=ecm 
By:  Nikolaus Hautsch; Mark Podolskij 
Abstract:  This paper provides theory as well as empirical results for pre-averaging estimators of the daily quadratic variation of asset prices. We derive jump-robust inference for pre-averaging estimators, corresponding feasible central limit theorems and an explicit test on serial dependence in microstructure noise. Using transaction data of different stocks traded at the NYSE, we analyze the estimators' sensitivity to the choice of the pre-averaging bandwidth and suggest an optimal interval length. Moreover, we investigate the dependence of pre-averaging based inference on the sampling scheme, the sampling frequency, microstructure noise properties as well as the occurrence of jumps. As a result of a detailed empirical study we provide guidance for optimal implementation of pre-averaging estimators and discuss potential pitfalls in practice. 
Keywords:  Quadratic Variation, Market Microstructure Noise, Pre-averaging, Sampling Schemes, Jumps 
JEL:  C14 C22 G10 
Date:  2010–07 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2010038&r=ecm 
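The core of pre-averaging is simple: replace each raw return by a locally weighted sum of K − 1 consecutive returns, which averages out microstructure noise, then sum the squares and subtract a noise-bias term. The sketch below uses the common weight g(x) = min(x, 1 − x); it is a simplified textbook version, not the estimator or bandwidth choice from the paper, and the simulated price process is invented for the demonstration.

```python
import random

def preaveraged_variance(prices, K):
    """Simplified pre-averaging estimator of integrated variance,
    with weight g(x) = min(x, 1 - x) and the standard noise-bias correction."""
    n = len(prices) - 1
    dY = [prices[i + 1] - prices[i] for i in range(n)]
    g = lambda x: min(x, 1 - x)
    psi2 = sum(g(j / K) ** 2 for j in range(1, K)) / K          # ~ integral of g^2
    psi1 = K * sum((g(j / K) - g((j - 1) / K)) ** 2
                   for j in range(1, K + 1))                    # ~ integral of g'^2
    # pre-averaged returns: local weighted sums of K - 1 raw returns
    bars = [sum(g(j / K) * dY[i + j - 1] for j in range(1, K))
            for i in range(n - K + 2)]
    main = sum(b * b for b in bars) / (K * psi2)
    bias = psi1 / (2 * psi2 * K * K) * sum(d * d for d in dY)   # remove noise part
    return main - bias

# efficient price (random walk) observed with heavy microstructure noise
rng = random.Random(42)
eff = [0.0]
for _ in range(1000):
    eff.append(eff[-1] + rng.gauss(0, 0.01))
obs = [p + rng.gauss(0, 0.05) for p in eff]

rv_naive = sum((obs[i + 1] - obs[i]) ** 2 for i in range(1000))
rv_pre = preaveraged_variance(obs, K=32)
```

Here the naive realized variance is dominated by noise (roughly 2nω², an order of magnitude above the true integrated variance of 0.1), while the pre-averaged estimate stays near the true value.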
By:  Miguel de Carvalho; K. Feridum Turkman; António Rua 
Abstract:  Considerable attention has been devoted to the statistical analysis of extreme events. Classical peaks-over-threshold methods are a popular modelling strategy for extreme value statistics of stationary data. For nonstationary series, a variant of the peaks-over-threshold analysis is routinely applied using covariates as a means to overcome the lack of stationarity in the series of interest. In this paper we concern ourselves with extremes of possibly nonstationary processes. Given that our approach is, in some way, linked to the celebrated Box-Jenkins method, we refer to the procedure proposed and applied herein as Box-Jenkins-Pareto. Our procedure is particularly appropriate for settings where the parameter covariate model is nontrivial or when well-qualified covariates are simply unavailable. We apply the Box-Jenkins-Pareto approach to the weekly number of unemployment insurance claims in the US and exploit the connection between threshold exceedances and the US business cycle. 
JEL:  C16 C50 E32 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:ptu:wpaper:w201003&r=ecm 
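The peaks-over-threshold building block referenced above amounts to collecting exceedances over a high threshold and fitting a generalized Pareto distribution (GPD) to them. The sketch below fits the GPD by the method of moments for brevity; this is only an illustration of the classical stationary POT step, not the paper's Box-Jenkins-Pareto procedure, and in practice maximum likelihood or probability-weighted moments would be preferred.

```python
import random
import statistics

def gpd_fit_moments(data, threshold):
    """Peaks-over-threshold sketch: fit a GPD to threshold exceedances
    by the method of moments (xi = shape, sigma = scale)."""
    exc = [x - threshold for x in data if x > threshold]
    m = statistics.fmean(exc)
    v = statistics.variance(exc)
    # for a GPD, mean^2 / var = 1 - 2*xi and mean = sigma / (1 - xi)
    shape = 0.5 * (1.0 - m * m / v)
    scale = m * (1.0 - shape)
    return shape, scale, len(exc)

# unit-exponential data: its exceedances are again unit exponential,
# i.e. GPD with shape 0 and scale 1, so the fit should recover those values
rng = random.Random(1)
sample = [rng.expovariate(1.0) for _ in range(20000)]
shape, scale, n_exc = gpd_fit_moments(sample, threshold=1.0)
```

The exponential case is a convenient check because the memoryless property pins down the true GPD parameters exactly.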
By:  Daouia, Abdelaati; Florens, JeanPierre; Simar, Léopold 
Abstract:  In production theory and efficiency analysis, we are interested in estimating the production frontier, which is the locus of the maximal attainable level of an output (the production), given a set of inputs (the production factors). In other setups, we rather wish to estimate an input (or cost) frontier, defined as the minimal level of the input (cost) attainable for a given set of outputs (goods or services produced). In both cases the problem can be viewed as estimating a surface under shape constraints (monotonicity, etc.). In this paper we derive the theory of a frontier estimator having an asymptotically normal distribution. The basic tool is the order-m partial frontier, where we let the order m converge to infinity as n → ∞, but at a slow rate. The final estimator is then corrected for its inherent bias. We can thus view our estimator as a regularized frontier estimator which, in addition, is more robust to extreme values and outliers than the usual nonparametric frontier estimators, such as FDH. The performance of our estimators is evaluated in finite samples through Monte Carlo experiments. We also illustrate how to provide, in an easy way, confidence intervals for the frontier function, both with a simulated data set and a real data set. 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:ide:wpaper:22808&r=ecm 
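The order-m partial frontier at input level x0 is the expected maximum output among m units drawn from those using no more input than x0; unlike the FDH envelope, it does not chase isolated extreme points. A minimal Monte Carlo sketch (function name, default m and number of draws are assumptions for the example, not the paper's choices):

```python
import random

def order_m_frontier(inputs, outputs, x0, m=25, B=5000, seed=0):
    """Monte Carlo order-m output frontier at input level x0: the average,
    over B replications, of the maximum output among m units drawn with
    replacement from observations with input <= x0."""
    rng = random.Random(seed)
    pool = [y for x, y in zip(inputs, outputs) if x <= x0]
    total = 0.0
    for _ in range(B):
        total += max(rng.choice(pool) for _ in range(m))
    return total / B

# ten units with outputs 0..9; the FDH frontier at x0 = 9 is 9, and the
# order-m frontier sits just below it
xs = [float(i) for i in range(10)]
ys = [float(i) for i in range(10)]
phi_m = order_m_frontier(xs, ys, x0=9.0, m=25)
```

As m grows the order-m frontier converges to the FDH frontier; keeping m finite (and letting it grow slowly with n, as in the abstract) is what delivers robustness to outliers.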
By:  Sojourner, Aaron J. (University of Minnesota) 
Abstract:  Economists often analyze cross-sectional data to estimate the value people implicitly place on attributes of goods using hedonic methods. Usually, assumptions on the functional form of utility are made strong enough to point-identify individuals' willingness-to-pay (WTP) for changes in attribute levels. Instead, this paper develops a new way to partially identify WTP under a weak set of conditions on the shape of individual indifference curves. In particular, indifference curves are assumed to be increasing and convex in an attribute-cost space that is finitely bounded above. These shape restrictions provide informative partial identification without relying on functional form restrictions for utility. Identification is analyzed for general, potentially discrete, as well as smooth price functions. To illustrate this method, we contribute to the literature on the value of a statistical life (VSL) by analyzing labor market data to study people's WTP for reductions in levels of fatal risk. The paper contrasts VSL estimates from conventional analysis with the bounds obtained under this new approach using a common data set. The data are shown to be consistent with a wide range of WTP values even given equilibrium and credible shape restrictions. This suggests that conventional estimates may be driven by the functional form restrictions imposed on utility rather than by the data or properties of equilibrium. 
Keywords:  hedonic, partial identification, value of a statistical life, shape restrictions 
JEL:  C14 I10 J28 
Date:  2010–07 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp5066&r=ecm 
By:  Antonis Demos (www.aueb.gr/users/demos); Stelios Arvanitis 
Abstract:  In this paper we define a set of indirect estimators based on moment approximations of the auxiliary estimators. We provide results that describe the higher-order asymptotic properties of these estimators. Their introduction is motivated by reasons of analytical and computational facilitation. We extend this set to a class of multistep indirect estimators that have potentially useful higher-order bias properties. Furthermore, the widely employed "feasibly bias-corrected estimator" is a one-optimization-step approximation of the suggested one. 
Keywords:  Indirect Estimator, Asymptotic Approximation, Moment Approximation, Higher Order Bias Structure, Binding Function, Local Canonical Representation, Convex Variational Distance. 
Date:  2010–07–22 
URL:  http://d.repec.org/n?u=RePEc:aue:wpaper:1023&r=ecm 
By:  Maximiano Pinheiro 
Abstract:  Marginal probability density and cumulative distribution functions are presented for multidimensional variables defined by nonsingular affine transformations of vectors of independent two-piece normal variables, the most important subclass of Ferreira and Steel's general multivariate skewed distributions. The marginal functions are obtained by first expressing the joint density as a mixture of Arellano-Valle and Azzalini's unified skew-normal densities and then using the property of closure under marginalization of the latter class. 
JEL:  C16 C46 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:ptu:wpaper:w201013&r=ecm 
By:  António Rua 
Abstract:  It has been acknowledged that wavelets can constitute a useful tool for forecasting in economics. Through a wavelet multiresolution analysis, a time series can be decomposed into different time-scale components, and a model can be fitted to each component to improve the forecast accuracy of the series as a whole. Up to now, the literature on forecasting with wavelets has mainly focused on univariate modelling. On the other hand, in a context of growing data availability, a line of research has emerged on forecasting with large datasets. In particular, the use of factor-augmented models has become quite widespread in the literature and among practitioners. The aim of this paper is to bridge the two strands of the literature. A wavelet approach for factor-augmented forecasting is proposed and put to the test for forecasting GDP growth for the major euro area countries. The results show that the forecasting performance is enhanced when wavelets and factor-augmented models are used together. 
JEL:  C22 C40 C53 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:ptu:wpaper:w201007&r=ecm 
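The multiresolution idea in the abstract above, decomposing a series into time-scale components that sum back to the original, can be illustrated with a one-level Haar decomposition (the simplest wavelet; the paper's actual filter choice may differ). Each component could then be modelled and forecast separately, with the component forecasts summed.

```python
def haar_level1(series):
    """One-level Haar multiresolution decomposition of an even-length
    series into a smooth (approximation) component and a detail component
    that sum back to the original series."""
    smooth, detail = [], []
    for i in range(0, len(series), 2):
        avg = (series[i] + series[i + 1]) / 2      # pairwise average
        smooth += [avg, avg]
        detail += [series[i] - avg, series[i + 1] - avg]
    return smooth, detail
```

The smooth component captures the lower-frequency movement, the detail the high-frequency fluctuations; exact additivity is what lets one model each scale separately.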
By:  Austin Nichols (Urban Institute) 
Abstract:  Several options for estimation and prediction in regressions using nonnegative skewed dependent variables are compared. Often, Poisson regression outperforms competitors, even when its assumptions are violated and the correct model is one that justifies a competitor. 
Date:  2010–07–20 
URL:  http://d.repec.org/n?u=RePEc:boc:bost10:2&r=ecm 
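The robustness of Poisson regression noted above comes from its pseudo-maximum-likelihood property: the estimator is consistent whenever the conditional mean E[y|x] = exp(a + bx) is correct, regardless of the distribution of y. A minimal Newton-Raphson sketch with one regressor (the function name and starting values are choices made for this example):

```python
import math
from statistics import fmean

def poisson_pml(x, y, iters=25):
    """Poisson pseudo-maximum-likelihood fit of E[y|x] = exp(a + b*x)
    by Newton-Raphson on the score equations."""
    a, b = math.log(fmean(y)), 0.0                # start at the sample mean
    for _ in range(iters):
        mu = [math.exp(a + b * xi) for xi in x]
        g0 = sum(yi - mi for yi, mi in zip(y, mu))               # score
        g1 = sum((yi - mi) * xi for yi, mi, xi in zip(y, mu, x))
        h00 = sum(mu)                                            # information
        h01 = sum(mi * xi for mi, xi in zip(mu, x))
        h11 = sum(mi * xi * xi for mi, xi in zip(mu, x))
        det = h00 * h11 - h01 * h01
        a += (h11 * g0 - h01 * g1) / det          # 2x2 Newton update
        b += (h00 * g1 - h01 * g0) / det
    return a, b

# data whose conditional mean is exactly exp(0.2 + 0.7x); y is not integer,
# which is fine for the pseudo-likelihood interpretation
x = [i / 4 for i in range(9)]
y = [math.exp(0.2 + 0.7 * xi) for xi in x]
a_hat, b_hat = poisson_pml(x, y)
```

Because the score only requires y − exp(a + bx) to be orthogonal to the regressors, non-count, skewed outcomes pose no problem, which is the point the abstract makes.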
By:  Wojciech Maliszewski 
Abstract:  The paper constructs a new output gap measure for Vietnam by applying Bayesian methods to a two-equation AS-AD model, while treating the output gap as an unobservable series to be estimated together with other parameters. Model coefficients are easily interpretable, and the output gap series is consistent with a broader analysis of economic developments. Output gaps obtained from HP detrending are subject to larger revisions than series obtained from a suitably adjusted model, and may be misleading compared to the model-based measure. 
Date:  2010–06–25 
URL:  http://d.repec.org/n?u=RePEc:imf:imfwpa:10/149&r=ecm 
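The HP detrending benchmark mentioned above solves min over τ of Σ(y − τ)² + λΣ(Δ²τ)², equivalently the linear system (I + λD'D)τ = y with D the second-difference matrix. A dense pure-Python sketch (fine for short series; production code would exploit the pentadiagonal structure):

```python
def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott trend via the normal equations (I + lam*D'D) tau = y,
    where D takes second differences; solved by Gaussian elimination."""
    T = len(y)
    A = [[float(i == j) for j in range(T)] for i in range(T)]
    for r in range(T - 2):            # row r of D has coefficients 1, -2, 1
        d = [0.0] * T
        d[r], d[r + 1], d[r + 2] = 1.0, -2.0, 1.0
        for i in range(r, r + 3):
            for j in range(r, r + 3):
                A[i][j] += lam * d[i] * d[j]
    b = list(map(float, y))
    for c in range(T):                # forward elimination, partial pivoting
        p = max(range(c, T), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        b[c], b[p] = b[p], b[c]
        for r in range(c + 1, T):
            f = A[r][c] / A[c][c]
            b[r] -= f * b[c]
            for j in range(c, T):
                A[r][j] -= f * A[c][j]
    tau = [0.0] * T                   # back substitution
    for r in range(T - 1, -1, -1):
        tau[r] = (b[r] - sum(A[r][j] * tau[j] for j in range(r + 1, T))) / A[r][r]
    return tau
```

A useful correctness check: the second differences of a linear series vanish, so the HP trend of a linear series is the series itself and the implied gap is zero.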
By:  António Rua 
Abstract:  The measurement of comovement among variables has a long traditionion in the economic and financial literature. Traditionally, comovement is assessed in the time domain through the well-known correlation coefficient, while its evolving properties are investigated either through a rolling window or by considering non-overlapping periods. More recently, Croux, Forni and Reichlin [Review of Economics and Statistics 83 (2001)] have proposed a measure of comovement in the frequency domain. While it allows one to quantify comovement at the frequency level, such a measure disregards the fact that the strength of comovement may vary over time. Herein, a new measure of comovement resorting to wavelet analysis is proposed. This wavelet-based measure allows one to assess simultaneously the comovement at the frequency level and over time. In this way, it is possible to capture the time- and frequency-varying features of comovement within a unified framework, which constitutes a refinement of previous approaches. 
JEL:  C40 E32 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:ptu:wpaper:w201001&r=ecm 
By:  Miguel de Carvalho; Paulo C. Rodrigues; António Rua 
Abstract:  The monitoring of economic developments is an exercise of considerable importance for policy makers, namely central banks and fiscal authorities, as well as for other economic agents such as financial intermediaries, firms and households. However, the assessment of the business cycle is not an easy endeavor, as the cyclical component is not an observable variable. In this paper we resort to singular spectrum analysis in order to disentangle US GDP into several underlying components of interest. The business cycle indicator yielded through this method is shown to bear a resemblance to band-pass filtered output. As end-of-sample behavior is typically a thorny issue in business cycle assessment, a real-time estimation exercise is conducted here to assess the reliability of the several filters. The results suggest that the business cycle indicator proposed herein possesses better revision performance than other filters commonly applied in the literature. 
JEL:  C50 E32 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:ptu:wpaper:w201009&r=ecm 
By:  Florens, JeanPierre; Simon, Guillaume 
Abstract:  The objective of the paper is to develop the theory of endogeneity in dynamic models in discrete and continuous time, in particular for diffusions and counting processes. We first provide an extension of the separable setup to a separable dynamic framework given in terms of a semimartingale decomposition. Then we define our function of interest as a stopping time for an additional noise process, whose role is played by a Brownian motion for diffusions and by a Poisson process for counting processes. 
JEL:  C14 C32 C51 
Date:  2010–04 
URL:  http://d.repec.org/n?u=RePEc:ide:wpaper:22803&r=ecm 
By:  Cazals, Catherine; Dudley, Paul; Florens, JeanPierre; Jones, Michael 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:ide:wpaper:22802&r=ecm 
By:  Miguel de Carvalho; António Rua 
Abstract:  The statistical modelling of extreme values has recently received substantial attention in a broad spectrum of sciences. Given that in a wide variety of scenarios one is more concerned with explaining tail events (say, an economic recession) than central ones, the need arises to rely on statistical methods well qualified for modelling extremes. Unfortunately, several classical tools regularly applied in the analysis of central events are simply inappropriate for the analysis of extreme values. In particular, Pearson correlation is not a proper measure for assessing the level of agreement of two variables when one is concerned with tail events. This paper explores the comovement of the economic activity of several OECD countries during periods of large positive and negative growth (right and left tails, respectively). Extremal measures are applied here as a means to assess the degree of cross-country tail dependence of output growth rates. Our main empirical findings are: (i) the comovement is much stronger in left tails than in right tails; (ii) asymptotic independence is claimed by the data; (iii) the dependence in the tails is considerably stronger than that arising from a Gaussian dependence model. In addition, our results suggest that, among the typical determinants for explaining international output growth synchronization, only economic specialization similarity seems to play a role at extreme events. 
JEL:  C40 C50 E32 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:ptu:wpaper:w201008&r=ecm 
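A simple empirical version of the tail-dependence measures discussed above asks: among the worst u-fraction of observations of one series, what share are also in the worst u-fraction of the other? This rank-based sketch is a generic illustration of the idea, not the extremal measures used in the paper.

```python
def lower_tail_dependence(x, y, u=0.1):
    """Empirical lower-tail dependence: among the worst u-fraction of x
    observations, the share that also fall in the worst u-fraction of y.
    Values near 1 signal strong tail comovement; values near u suggest
    tail independence."""
    n = len(x)

    def ranks(v):
        order = sorted(range(n), key=v.__getitem__)
        r = [0] * n
        for pos, idx in enumerate(order):
            r[idx] = pos + 1
        return r

    rx, ry = ranks(x), ranks(y)
    cut = u * n
    in_x = [i for i in range(n) if rx[i] <= cut]
    joint = sum(1 for i in in_x if ry[i] <= cut)
    return joint / len(in_x)
```

Because the measure uses ranks rather than levels, it is invariant to monotone transformations of each series, unlike Pearson correlation, which is exactly the deficiency the abstract points out.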