
on Econometrics 
By:  Gach, Florian; Pötscher, Benedikt M. 
Abstract:  Indirect inference estimators (i.e., simulation-based minimum distance estimators) in a parametric model that are based on auxiliary nonparametric maximum likelihood density estimators are shown to be asymptotically normal. If the parametric model is correctly specified, it is furthermore shown that the asymptotic variance-covariance matrix equals the Cramér-Rao bound. These results are based on uniform-in-parameters convergence rates and a uniform-in-parameters Donsker-type theorem for nonparametric maximum likelihood density estimators. 
Keywords:  Indirect inference; simulation-based minimum distance estimation; nonparametric maximum likelihood; density estimation; efficiency 
JEL:  C13 C14 C15 
Date:  2010–12–16 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:27512&r=ecm 
By:  Shuowen Hu; D.S. Poskitt; Xibin Zhang 
Abstract:  Kernel density estimation is an important technique for understanding the distributional properties of data. Some investigations have found that the estimation of a global bandwidth can be heavily affected by observations in the tail. We propose to categorize data into low- and high-density regions, to which we assign two different bandwidths called the low-density adaptive bandwidths. We derive the posterior of the bandwidth parameters through the Kullback-Leibler information. A Bayesian sampling algorithm is presented to estimate the bandwidths. Monte Carlo simulations are conducted to examine the performance of the proposed Bayesian sampling algorithm in comparison with the performance of the normal reference rule and a Bayesian sampling algorithm for estimating a global bandwidth. According to Kullback-Leibler information, the kernel density estimator with low-density adaptive bandwidths estimated through the proposed Bayesian sampling algorithm outperforms the density estimators with bandwidths estimated through the two competitors. We apply the low-density adaptive kernel density estimator to the estimation of the bivariate density of daily stock-index returns observed from the U.S. and Australian stock markets. The derived conditional distribution of the Australian stock-index return for a given daily return in the U.S. market enables market analysts to understand how the former market is associated with the latter. 
Keywords:  conditional density; global bandwidth; Kullback-Leibler information; marginal likelihood; Markov chain Monte Carlo; S&P500 index 
JEL:  C11 C14 C15 
Date:  2010–12 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:201021&r=ecm 
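The two-bandwidth idea above can be sketched as follows. This is an illustrative numpy implementation only: the pilot-density classification rule, the quantile cutoff, and the bandwidth values are assumptions made for the sketch, not the paper's Bayesian sampling procedure.

```python
import numpy as np

def low_density_adaptive_kde(data, h_low, h_high, grid, tail_quantile=0.2):
    """Gaussian KDE with two bandwidths: a larger one (h_low) for
    observations classified into the low-density (tail) region and a
    smaller one (h_high) for the high-density region.

    The low/high split here uses a pilot KDE and a quantile cutoff --
    an illustrative device standing in for the paper's Bayesian
    classification of observations.
    """
    data = np.asarray(data, dtype=float)
    n = len(data)
    # Pilot density at each observation (normal reference rule bandwidth).
    h0 = 1.06 * data.std() * n ** (-0.2)
    pilot = np.exp(-0.5 * ((data[:, None] - data[None, :]) / h0) ** 2).sum(axis=1)
    pilot /= n * h0 * np.sqrt(2 * np.pi)
    cutoff = np.quantile(pilot, tail_quantile)
    # Per-observation bandwidth: wide kernels in the tails, narrow elsewhere.
    h = np.where(pilot <= cutoff, h_low, h_high)
    # Evaluate the adaptive estimator on the grid.
    z = (grid[:, None] - data[None, :]) / h[None, :]
    dens = (np.exp(-0.5 * z ** 2) / (h[None, :] * np.sqrt(2 * np.pi))).mean(axis=1)
    return dens
```

Because each kernel is a proper density, the estimate still integrates to one regardless of how the two bandwidths are assigned.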
By:  Mikko Packalen (Department of Economics, University of Waterloo); Tony Wirjanto (School of Accounting & Finance and Department of Statistics and Actuarial Science, University of Waterloo) 
Abstract:  Selecting an estimator for the variance-covariance matrix is an important step in hypothesis testing. From less robust to more robust, the available choices include: Eicker/White heteroskedasticity-robust standard errors, Newey and West heteroskedasticity-and-autocorrelation-robust standard errors, and cluster-robust standard errors. The rationale for using a less robust covariance matrix estimator is that tests based on it can have better power properties. This motivates tests that examine the appropriate level of robustness in covariance matrix estimation. We propose a new robustness testing strategy and show that it can dramatically improve inference about the proper level of robustness in covariance matrix estimation. Our main focus is on inference about clustering, although the proposed robustness testing strategy can also improve inference about parametric assumptions in covariance matrix estimation, which we demonstrate for the case of testing for heteroskedasticity. We also show why the existing clustering test and other applications of the White (1980) robustness testing approach perform poorly, which to our knowledge has not been well understood. The insight into why this existing testing approach performs poorly is also the basis for the proposed robustness testing strategy. 
JEL:  C10 C12 C13 C52 
Date:  2010–11 
URL:  http://d.repec.org/n?u=RePEc:wat:wpaper:1012&r=ecm 
By:  Boubacar Mainassara, Yacouba; Carbon, Michel; Francq, Christian 
Abstract:  Numerous time series admit "weak" autoregressive-moving average (ARMA) representations, in which the errors are uncorrelated but not necessarily independent or martingale differences. The statistical inference of this general class of models requires the estimation of generalized Fisher information matrices. We give analytic expressions for, and propose consistent estimators of, these matrices at any point of the parameter space. Our results are illustrated by means of Monte Carlo experiments and by analyzing the dynamics of daily returns and squared daily returns of financial series. 
Keywords:  Asymptotic relative efficiency (ARE); Bahadur's slope; Information matrices; Lagrange Multiplier test; Nonlinear processes; Wald test; Weak ARMA models 
JEL:  C13 C12 C22 C01 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:27685&r=ecm 
By:  Erniel B. Barrios; Rouselle F. Lavado (Philippine Institute for Development Studies) 
Abstract:  The stochastic frontier model with heterogeneous technical efficiency explained by exogenous variables is augmented with a sparse spatial autoregressive component for cross-section data, and a spatial-temporal component for panel data. An estimation procedure that takes advantage of the additivity of the model is proposed, and computational advantages over simultaneous maximum likelihood estimation of all parameters are exhibited. The technical efficiency estimates are comparable to those from existing models and estimation procedures based on maximum likelihood methods. A spatial or spatial-temporal component can improve estimates of technical efficiency in a production frontier, which are usually biased downwards. 
Keywords:  stochastic frontier models, technical efficiency, spatial externalities, spatial-temporal model, backfitting 
JEL:  C01 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:eab:microe:2434&r=ecm 
By:  Jian Huang; Masahito Kobayashi; Michael McAleer (University of Canterbury) 
Abstract:  This paper analyses the constant elasticity of volatility (CEV) model suggested by Chan et al. (1992). The CEV model without mean reversion is shown to be asymptotically the inverse Box-Cox transformation of integrated processes. It is demonstrated that the maximum likelihood estimator of the power parameter has a nonstandard asymptotic distribution, which is expressed as an integral of Brownian motions, when the data generating process is not mean reverting. However, it is shown that the t-ratio follows a standard normal distribution asymptotically, so that the use of the conventional t-test in analyzing the power parameter of the CEV model is justified even if there is no mean reversion, as is often the case in empirical research. The model may be applied to ultra-high-frequency data. 
Keywords:  Box-Cox transformation; Brownian Motion; Constant Elasticity of Volatility; Mean Reversion; Nonstandard distribution 
Date:  2010–12–01 
URL:  http://d.repec.org/n?u=RePEc:cbt:econwp:10/77&r=ecm 
By:  PengHsuan Ke (Institute of Economics, Academia Sinica, Taipei, Taiwan); WenJen Tsay (Institute of Economics, Academia Sinica, Taipei, Taiwan) 
Abstract:  This paper derives an analytic formula for the likelihood function of the true random effects stochastic frontier model of Greene (2005) with a time span T = 2. Neither numerical integration nor simulation-based procedures are required for the closed-form approach. Combining the analytic formula with a pairwise likelihood estimator (PLE), we can easily estimate random effects stochastic frontier models with T > 2. Simulations confirm the promising performance of the analytic methodology under the various configurations of data-generating processes considered in this paper. The proposed method is applied to the World Health Organization's (WHO) panel data on national health care systems. 
Keywords:  Random effects, panel stochastic frontier model 
Date:  2010–12 
URL:  http://d.repec.org/n?u=RePEc:sin:wpaper:10a007&r=ecm 
By:  Dominique Guegan (Centre d'Economie de la Sorbonne and Paris School of Economics); Philippe de Peretti (Centre d'Economie de la Sorbonne) 
Abstract:  In this paper, we present a procedure that tests the null of time-homogeneity of the first two moments of a time series. Whereas the literature on structural break testing procedures often focuses on one kind of alternative, i.e. discrete shifts or smooth transitions, our procedure is designed to deal with a broader set of alternatives including i) discrete shifts, ii) smooth transitions, iii) time-varying moments, iv) probability-driven breaks, and v) GARCH or stochastic volatility models for the variance. Our test uses the recently introduced maximum entropy bootstrap, designed to capture both time-dependency and time-heterogeneity. In simulations, our procedure appears to be quite powerful. To some extent, our paper is an extension of Heracleous, Koutris and Spanos (2008). 
Keywords:  Test, time-homogeneity, maximum entropy bootstrap. 
JEL:  C01 C12 C15 
Date:  2010–12 
URL:  http://d.repec.org/n?u=RePEc:mse:cesdoc:10098&r=ecm 
By:  Tetsuya Takaishi 
Abstract:  A Bayesian estimation of a GARCH model is performed for the US Dollar/Japanese Yen exchange rate by the Metropolis-Hastings algorithm with a proposal density given by the adaptive construction scheme. In the adaptive construction scheme the proposal density is assumed to take the form of a multivariate Student's t-distribution, and its parameters are evaluated using the sampled data and updated adaptively during Markov chain Monte Carlo simulations. We find that the autocorrelation times of the data sampled by the adaptive construction scheme are considerably reduced. We conclude that the adaptive construction scheme works efficiently for the Bayesian inference of the GARCH model. 
Date:  2010–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1012.5986&r=ecm 
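The adaptive proposal-construction idea can be sketched in one dimension. The Student-t proposal whose location and scale are re-estimated from the sampled data mirrors the scheme described above, but the target density, degrees of freedom, and update interval here are illustrative assumptions (the paper works with the multivariate GARCH posterior).

```python
import numpy as np
from scipy import stats

def adaptive_t_mh(logpost, x0, n_iter=5000, df=5, adapt_every=500, rng=None):
    """Independence Metropolis-Hastings with a Student-t proposal whose
    location and scale are periodically re-fit from the chain so far.
    A one-dimensional sketch of the adaptive construction idea."""
    rng = np.random.default_rng(rng)
    loc, scale = float(x0), 1.0
    x, lp = float(x0), logpost(x0)
    draws, accepted = [], 0
    for i in range(1, n_iter + 1):
        y = stats.t.rvs(df, loc=loc, scale=scale, random_state=rng)
        lp_y = logpost(y)
        # Independence-sampler ratio includes the proposal density at x and y.
        log_alpha = (lp_y - lp
                     + stats.t.logpdf(x, df, loc=loc, scale=scale)
                     - stats.t.logpdf(y, df, loc=loc, scale=scale))
        if np.log(rng.uniform()) < log_alpha:
            x, lp = y, lp_y
            accepted += 1
        draws.append(x)
        if i % adapt_every == 0:  # update proposal from the sampled data
            loc = float(np.mean(draws))
            scale = max(float(np.std(draws)), 1e-3)
    return np.array(draws), accepted / n_iter
```

A well-matched proposal gives a high acceptance rate and short autocorrelation times, which is the gain the abstract reports.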
By:  Sermin Gungor; Richard Luger 
Abstract:  We develop a finite-sample procedure to test the beta-pricing representation of linear factor pricing models that is applicable even if the number of test assets is greater than the length of the time series. Our distribution-free framework leaves open the possibility of unknown forms of non-normalities, heteroskedasticity, time-varying correlations, and even outliers in the asset returns. The power of the proposed test procedure increases as the time series lengthens and/or the cross-section becomes larger. This stands in sharp contrast to the usual tests, which lose power or may not even be computable if the cross-section is too large. Finally, we revisit the CAPM and the Fama-French three-factor model. Our results strongly support the mean-variance efficiency of the market portfolio. 
Keywords:  Econometric and statistical methods; Financial markets 
JEL:  C12 C14 C33 G11 G12 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:bca:bocawp:1036&r=ecm 
By:  Carbon, Michel; Francq, Christian 
Abstract:  The asymptotic distribution of a vector of autocorrelations of squared residuals is derived for a wide class of asymmetric GARCH models. Portmanteau adequacy tests are deduced. These results are obtained under moment assumptions on the iid process, but fat tails are allowed for the observed process, which is particularly relevant for series of financial returns. A Monte Carlo experiment and an illustration with financial series are also presented. 
Keywords:  ARCH models; Leverage effect; Portmanteau test; Goodness-of-fit test; Diagnostic checking 
JEL:  C12 C22 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:27686&r=ecm 
By:  Michele Costa (Università di Bologna); Luca De Angelis (Università di Bologna) 
Abstract:  A review of model selection procedures in hidden Markov models reveals contrasting evidence about the reliability and precision of the most commonly used methods. In order to evaluate and compare existing proposals, we develop a Monte Carlo experiment that provides powerful insight into the behaviour of the most widespread model selection methods. We find that the number of observations, the conditional state-dependent probabilities, and the latent transition matrix are the main factors influencing information criteria and likelihood ratio test results. We also find evidence that, for shorter univariate time series, AIC strongly outperforms BIC. 
Keywords:  Model selection procedure, Hidden Markov model, Monte Carlo experiment, information criteria, likelihood ratio test 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:bot:quadip:104&r=ecm 
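For reference, the information criteria compared in such Monte Carlo experiments take a simple form. The sketch below, including the parameter count for a hidden Markov model with univariate Gaussian emissions, is a generic illustration, not the authors' code.

```python
import numpy as np

def aic(loglik, k):
    """Akaike information criterion: AIC = 2k - 2 log L (smaller is better)."""
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    """Bayesian information criterion: BIC = k log(n) - 2 log L."""
    return k * np.log(n) - 2 * loglik

def n_params_gaussian_hmm(n_states):
    """Free parameters of a K-state HMM with univariate Gaussian emissions:
    K(K-1) transition + (K-1) initial-state + 2K emission (mean, variance)."""
    K = n_states
    return K * (K - 1) + (K - 1) + 2 * K
```

Because the BIC penalty k log(n) exceeds the AIC penalty 2k whenever n > e^2 ≈ 7.4, BIC tends to choose fewer hidden states, which is where the two criteria disagree in short series.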
By:  Felix Chan; Michael McAleer (University of Canterbury); Marcelo C. Medeiros 
Abstract:  Nonlinear time series models, especially those with regime-switching and conditionally heteroskedastic errors, have become increasingly popular in the economics and finance literature. However, much of the research has concentrated on the empirical applications of various models, with little theoretical or statistical analysis associated with the structure of the processes or the associated asymptotic theory. In this paper, we first derive necessary conditions for strict stationarity and ergodicity of three different specifications of first-order smooth transition autoregressions with heteroskedastic errors. This is important, among other reasons, to establish the conditions under which the traditional LM linearity tests based on Taylor expansions are valid. Second, we provide sufficient conditions for consistency and asymptotic normality of the quasi-maximum likelihood estimator for a general nonlinear conditional mean model with first-order GARCH errors. 
Keywords:  Nonlinear time series; regime-switching; smooth transition; STAR; GARCH; log-moment; moment conditions; asymptotic theory 
JEL:  E43 Q11 Q13 
Date:  2010–12–01 
URL:  http://d.repec.org/n?u=RePEc:cbt:econwp:10/79&r=ecm 
By:  Mikko Packalen (Department of Economics, University of Waterloo) 
Abstract:  This paper presents a new method for estimating social interaction effects. The proposed approach is based on using variation in equilibrium influence induced by the network interaction structure to construct conditionally balanced interaction structures. As equilibrium influence is determined by the known interaction structure and the unknown endogenous social interaction parameter, interaction structures are constructed for different imputed values of the unknown parameter. Each constructed interaction structure is conditionally balanced in the sense that, when it is combined with observations on the outcome variable to construct a new variable, the constructed variable is a valid instrumental variable for the endogenous social interaction regressor if the true and imputed parameter values are the same. Comparison of each imputed value with the associated instrumental variable estimate thus yields a confidence set estimate for the endogenous social interaction parameter as well as for other model parameters. We provide conditions for point identification and partial identification. The contrast between the proposed and existing approaches is stark: in the existing approach instruments are constructed from observations on exogenous variables, whereas in the proposed approach instruments are constructed from observations on the outcome variable. Both approaches have advantages, and the two complement one another. We demonstrate the feasibility of the proposed approach with analyses of the determinants of subjective college completion and income expectations among adolescents in the Add Health data and with Monte Carlo simulations of Erdős-Rényi and small-world networks. 
JEL:  C31 
Date:  2010–12 
URL:  http://d.repec.org/n?u=RePEc:wat:wpaper:1013&r=ecm 
By:  Clements, Michael P. (University of Warwick); Galvão, Ana Beatriz (Queen Mary University of London) 
Abstract:  We show how to improve the accuracy of real-time forecasts from models that include autoregressive terms by estimating the models on 'lightly-revised' data instead of using data from the latest-available vintage. Forecast accuracy is improved by reorganizing the data vintages employed in the estimation of the model in such a way that the vintages used in estimation are of a similar maturity to the data in the forecast loss function. The size of the expected reductions in mean squared error depends on the characteristics of the data revision process. Empirically, we find RMSFE gains of 2-4% when forecasting output growth and inflation with AR models, and gains of the order of 8% with ADL models. 
Keywords:  real-time data; news and noise revisions; optimal forecasts; multi-vintage models 
JEL:  C53 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:wrk:warwec:953&r=ecm 
By:  Calhoun, Gray 
Abstract:  This paper derives the asymptotic distribution of the F-test for the significance of linear regression coefficients as both the number of regressors, k, and the number of observations, n, increase together so that their ratio remains positive in the limit. The conventional critical values for this test statistic are too small, and the standard version of the F-test is invalid under this asymptotic theory. This paper provides a correction to the F statistic that gives correctly-sized tests under both this paper's limit theory and also under conventional asymptotic theory that keeps k finite. This paper also presents simulations that indicate the new statistic can perform better in small samples than the conventional test. The statistic is then used to reexamine Olivei and Tenreyro's results from "The Timing of Monetary Policy Shocks" (2007, AER) and Sala-i-Martin's results from "I Just Ran Two Million Regressions" (1997, AER). 
Keywords:  Dimension Asymptotics; F-Test; Ordinary Least Squares 
JEL:  C12 C20 
Date:  2010–12–20 
URL:  http://d.repec.org/n?u=RePEc:isu:genres:32216&r=ecm 
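The conventional F statistic at issue can be computed as follows. This sketch implements only the classic joint-significance test, whose critical values the paper shows to be too small when k/n stays positive in the limit; the paper's corrected statistic is not reproduced here.

```python
import numpy as np
from scipy import stats

def ols_f_test(y, X, q):
    """Conventional F-test that the last q regression coefficients are zero.
    Computed from the restricted/unrestricted sums of squared residuals.
    Note: with k/n bounded away from zero, the F(q, n-k) critical values
    used here are too small -- the point of the paper above."""
    n, k = X.shape
    beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    ssr_full = ((y - X @ beta) ** 2).sum()
    Xr = X[:, :k - q]                     # restricted model drops last q columns
    br, _, _, _ = np.linalg.lstsq(Xr, y, rcond=None)
    ssr_restr = ((y - Xr @ br) ** 2).sum()
    F = ((ssr_restr - ssr_full) / q) / (ssr_full / (n - k))
    pval = stats.f.sf(F, q, n - k)
    return F, pval
```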
By:  Antipov, Evgeny; Pokryshevskaya, Elena 
Abstract:  In this paper an approach is introduced for automatically detecting segments where a regression model significantly underperforms and segments with systematically under- or overestimated predictions. This segmentational approach is applicable to various expert systems including, but not limited to, those used for mass appraisal. The proposed approach may be useful for various regression analysis applications, especially those with strong heteroscedasticity. It helps to reveal segments for which separate models or appraiser assistance are desirable. The segmentational approach has been applied to a mass appraisal model based on the Random Forest algorithm. 
Keywords:  CART; model diagnostics; mass appraisal; real estate; Random forest; heteroscedasticity 
JEL:  C45 L85 C4 
Date:  2010–12–01 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:27646&r=ecm 
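A minimal stand-in for such segment diagnostics is to flag segments whose mean prediction error differs significantly from zero. The per-segment t-test used here is an illustrative simplification, not the paper's CART-based segment detection procedure.

```python
import numpy as np
from scipy import stats

def flag_biased_segments(residuals, segments, alpha=0.05):
    """For each segment label, test whether the mean residual
    (actual minus predicted) differs from zero, i.e. whether the model
    systematically under- or overestimates within that segment."""
    residuals = np.asarray(residuals, dtype=float)
    segments = np.asarray(segments)
    out = {}
    for s in np.unique(segments):
        r = residuals[segments == s]
        if len(r) < 2:
            continue  # not enough observations to test
        t, p = stats.ttest_1samp(r, 0.0)
        out[s] = {"mean_error": r.mean(),
                  "p_value": p,
                  "biased": bool(p < alpha)}
    return out
```

Segments flagged as biased are candidates for a separate model or appraiser review, in the spirit of the abstract above.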
By:  Worapree Maneesoonthorn; Gael M. Martin; Catherine S. Forbes; Simone Grose 
Abstract:  The object of this paper is to produce distributional forecasts of physical volatility and its associated risk premia using a non-Gaussian, nonlinear state space approach. Option and spot market information on the unobserved variance process is captured by using dual 'model-free' variance measures to define a bivariate observation equation in the state space model. The premium for diffusive variance risk is defined as linear in the latent variance (in the usual fashion) whilst the premium for jump variance risk is specified as a conditionally deterministic dynamic process, driven by a function of past measurements. The inferential approach adopted is Bayesian, implemented via a Markov chain Monte Carlo algorithm that caters for the multiple sources of nonlinearity in the model and the bivariate measure. The method is applied to empirical spot and option price data for the S&P500 index over the 1999 to 2008 period, with conclusions drawn about investors' required compensation for variance risk during the recent financial turmoil. The accuracy of the probabilistic forecasts of the observable variance measures is demonstrated, and compared with that of forecasts yielded by more standard time series models. To illustrate the benefits of the approach, the posterior distribution is augmented by information on daily returns to produce Value at Risk predictions, as well as being used to yield forecasts of the prices of derivatives on volatility itself. Linking the variance risk premia to the risk aversion parameter in a representative agent model, probabilistic forecasts of relative risk aversion are also produced. 
Keywords:  Volatility Forecasting; Nonlinear State Space Models; Nonparametric Variance Measures; Bayesian Markov Chain Monte Carlo; VIX Futures; Risk Aversion. 
JEL:  C11 C53 
Date:  2010–12–20 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:201022&r=ecm 
By:  Simar, Léopold; Vanhems, Anne 
Abstract:  In productivity analysis, the performance of production units is measured through the distance of the individual decision making units (DMU) to the technology, which is defined as the frontier of the production set. Most of the existing methods, Farrell-Debreu and Shephard radial measures (input- or output-oriented) and hyperbolic distance functions, rely on multiplicative measures of the distance and so require strictly positive inputs and outputs. This can be critical when the data contain zero or negative values, as in financial databases for the measurement of fund performance. The directional distance function is an alternative that can be viewed as an additive measure of efficiency. We show in this paper that, using a probabilistic formulation of the production process, the directional distance can be expressed as a simple radial or hyperbolic distance, up to a simple transformation of the input/output space. This allows us to propose simple estimation methods and also to transfer easily most of the known properties shared by the radial and hyperbolic distance estimators. In addition, the formulation allows us to define robust directional distances along the lines of alpha-quantile or order-m partial frontiers. Finally, we can also define conditional directional distance functions, conditional on environmental factors. To illustrate the methodology, we show how it can be implemented using a mutual funds database. 
Keywords:  Directional distance function; partial frontier; conditional measures of efficiency 
JEL:  C13 C14 
Date:  2010–09–30 
URL:  http://d.repec.org/n?u=RePEc:tse:wpaper:23435&r=ecm 
By:  Antipov, Evgeny; Pokryshevskaya, Elena 
Abstract:  To the best of the authors' knowledge, this is the first time the use of Random Forest as a technique for residential real estate mass appraisal has been attempted. In an empirical study using data on residential apartments, the method performed better than such techniques as CHAID, CART, KNN, multiple regression analysis, artificial neural networks (MLP and RBF), and boosted trees. An approach is introduced for automatically detecting segments where a model significantly underperforms and segments with systematically under- or overestimated predictions. This segmentational approach is applicable to various expert systems including, but not limited to, those used for mass appraisal. 
Keywords:  Random forest; mass appraisal; CART; model diagnostics; real estate; automatic valuation model 
JEL:  C14 C45 L85 
Date:  2010–07–29 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:27645&r=ecm 
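A sketch of a Random Forest mass-appraisal model in scikit-learn. The apartment features and price-generating process below are synthetic assumptions made for illustration, not the data set used in the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic apartment data (hypothetical features and coefficients):
# floor area, distance to centre, floor number, building age.
rng = np.random.default_rng(42)
n = 2000
X = np.column_stack([
    rng.uniform(30, 120, n),   # area, m^2
    rng.uniform(0, 20, n),     # distance to centre, km
    rng.integers(1, 25, n),    # floor number
    rng.uniform(0, 50, n),     # building age, years
])
# Assumed hedonic price: larger, more central, newer flats cost more.
price = 1500 * X[:, 0] - 2000 * X[:, 1] - 500 * X[:, 3] \
        + rng.normal(0, 5000, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, price, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
r2 = model.score(X_te, y_te)  # out-of-sample R^2 of the appraisal model
```

On held-out apartments the forest recovers the hedonic relationship without any functional-form assumption, which is the appeal of the method for mass appraisal.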
By:  Hyeongwoo Kim; YoungKyu Moh 
Abstract:  This paper revisits the empirical evidence of purchasing power parity under the current float by recursive mean adjustment (RMA) proposed by So and Shin (1999). We first report superior power of the RMA-based unit root test in finite samples relative to the conventional augmented Dickey-Fuller (ADF) test via Monte Carlo experiments for 16 linear and nonlinear autoregressive data generating processes. We find that the more powerful RMA-based unit root test rejects the null hypothesis of a unit root for 16 out of 20 current float real exchange rates relative to the US dollar, while the ADF test rejects only 5 at the 10% significance level. We also find that the computationally simple RMA-based asymptotic confidence interval can provide useful information regarding the half-life of the real exchange rate. 
Keywords:  Recursive Mean Adjustment, Finite Sample Performance, Purchasing Power Parity, Half-Life 
JEL:  C12 C22 F31 
Date:  2010–12 
URL:  http://d.repec.org/n?u=RePEc:abn:wpaper:auwp201008&r=ecm 
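The recursive mean adjustment idea can be sketched as a Dickey-Fuller-type regression on recursively demeaned data. This minimal version omits augmentation lags and is only an illustration of the So and Shin (1999) adjustment, not the authors' test code; critical values for the resulting t-ratio are nonstandard and are not computed here.

```python
import numpy as np

def rma_df_tstat(y):
    """Dickey-Fuller-type t-ratio with recursive mean adjustment:
    both y_t and y_{t-1} are demeaned by the recursive mean of
    observations up to t-1 before estimating the AR(1) coefficient.
    Returns the t-ratio for H0: rho = 1 (unit root)."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    # Recursive mean of y_1..y_{t-1}, for t = 2..T.
    rec_mean = np.cumsum(y)[:-1] / np.arange(1, T)
    ylag = y[:-1] - rec_mean   # adjusted regressor
    ycur = y[1:] - rec_mean    # adjusted regressand
    rho = (ylag @ ycur) / (ylag @ ylag)
    resid = ycur - rho * ylag
    s2 = (resid @ resid) / (len(resid) - 1)
    se = np.sqrt(s2 / (ylag @ ylag))
    return (rho - 1.0) / se
```

Replacing the full-sample mean with the recursive mean reduces the correlation between the demeaned regressor and the error, which is the source of the power gain reported above.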
By:  Christian Kleiber; Achim Zeileis (University of Basel) 
JEL:  C C C 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:bsl:wpaper:11/10&r=ecm 
By:  Adriana Di Liberto; Stefano Usai 
Abstract:  This paper proposes a fixed-effect panel methodology that enables us to simultaneously take into account both TFP and traditional neoclassical convergence. We analyse a sample of 199 regions in EU15 (plus Norway and Switzerland) between 1985 and 2006 and find the absence of an overall process of TFP convergence, as we observe that TFP dispersion is virtually constant across the two sub-periods. This result proves robust to the use of different estimation procedures such as simple LSDV, spatially corrected LSDV, Kiviet-corrected LSDV, and GMM à la Arellano and Bond. However, we also show that this absence of a strong process of global TFP convergence hides interesting dynamic patterns across regions. These patterns are revealed by the use of recent exploratory spatial data techniques that enable us to obtain a complete picture of the complex EU cross-region dynamics. We find that, between 1985 and 2006, there have been numerous regional miracles and disasters in terms of TFP performance and that polarization patterns have changed significantly over time. Overall, the results seem to suggest that a few TFP leaders are emerging and distancing themselves from the rest, while the cluster of low-TFP regions is growing. 
Keywords:  TFP; technology catching up; panel data; exploratory spatial data analysis 
JEL:  C23 O33 O47 R11 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:cns:cnscwp:201030&r=ecm 
By:  Manuel Cano-Rodríguez; Manuel Núñez-Nickel 
Abstract:  In this paper, we propose an econometric model that presents three advantages in relation to the Basu model: (1) it is robust to the aggregation problem; that is, we prove that the Basu model produces inconsistent estimations of conditional conservatism and that this problem is solved with our proposal; (2) it can produce firm-specific measures of conservatism by using time series; and (3) it completes the understanding of the intercept in the Basu model by breaking it down between unconditional conservatism and the reversion of the differences between market and book values of equity. In other words, we can provide firm-specific measures of both conditional and unconditional conservatism with the same model. We demonstrate all these theoretical assertions using simulated data. 
Keywords:  Accounting conservatism, Conditional conservatism, Unconditional conservatism, The Basu model, Aggregation effect 
Date:  2010–11 
URL:  http://d.repec.org/n?u=RePEc:cte:idrepe:id1007&r=ecm 
By:  Hachicha, Wafik; Ammeri, Ahmed; Masmoudi, Faouzi; Chachoub, Habib 
Abstract:  Simulation Optimization (SO) provides a structured approach to system design and configuration when analytical expressions for input/output relationships are unavailable. Several excellent surveys have been written on this topic, but each concentrates on only a few classification criteria. This paper presents a literature survey covering all classification criteria for SO techniques, according to problem characteristics such as the shape of the response surface (global as compared to local optimization), the objective function (single or multiple objectives), and the parameter space (discrete or continuous parameters). The survey focuses specifically on SO problems that involve a single performance measure. 
Keywords:  Simulation Optimization; classification methods; literature survey 
JEL:  C44 C61 C15 Z11 
Date:  2010–05–24 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:27652&r=ecm 