nep-ecm New Economics Papers
on Econometrics
Issue of 2011‒01‒03
25 papers chosen by
Sune Karlsson
Orebro University

  1. Non-Parametric Maximum Likelihood Density Estimation and Simulation-Based Minimum Distance Estimators By Gach, Florian; Pötscher, Benedikt M.
  2. Bayesian Adaptive Bandwidth Kernel Density Estimation of Irregular Multivariate Distributions By Shuowen Hu; D.S. Poskitt; Xibin Zhang
  3. Inference about Clustering and Parametric Assumptions in Covariance Matrix Estimation By Mikko Packalen; Tony Wirjanto
  4. Computing and Estimating Information Matrices of Weak ARMA Models By Boubacar Mainassara, Yacouba; Carbon, Michel; Francq, Christian
  5. Spatial Stochastic Frontier Models By Erniel B. Barrios; Rouselle F. Lavado
  6. Testing the Box-Cox Parameter for an Integrated Process By Jian Huang; Masahito Kobayashi; Michael McAleer
  7. A Simple Analytic Procedure for Estimating the True Random Effects Stochastic Frontier Model By Peng-Hsuan Ke; Wen-Jen Tsay
  8. An omnibus test to detect time-heterogeneity in time series. By Dominique Guegan; Philippe de Peretti
  9. Bayesian estimation of GARCH model with an adaptive proposal density By Tetsuya Takaishi
  10. Testing Linear Factor Pricing Models with Large Cross-Sections: A Distribution-Free Approach By Sermin Gungor; Richard Luger
  11. Portmanteau goodness-of-fit test for asymmetric power GARCH models By Carbon, Michel; Francq, Christian
  12. Model Selection in Hidden Markov Models: A Simulation Study By Michele Costa; Luca De Angelis
  13. Structure and Asymptotic Theory for Nonlinear Models with GARCH Errors By Felix Chan; Michael McAleer; Marcelo C. Medeiros
  14. Identification and Estimation of Social Interactions through Variation in Equilibrium Influence By Mikko Packalen
  15. Real-time Forecasting of Inflation and Output Growth in the Presence of Data Revisions By Clements, Michael P.; Galvão, Ana Beatriz
  16. Hypothesis Testing in Linear Regression when K/N is Large By Calhoun, Gray
  17. Applying a CART-based approach for the diagnostics of mass appraisal models By Antipov, Evgeny; Pokryshevskaya, Elena
  18. Probabilistic Forecasts of Volatility and its Risk Premia By Worapree Maneesoonthorn; Gael M. Martin; Catherine S. Forbes; Simone Grose
  19. Probabilistic Characterization of Directional Distances and their Robust Versions By Simar, Léopold; Vanhems, Anne
  20. Mass appraisal of residential apartments: An application of Random forest for valuation and a CART-based approach for model diagnostics By Antipov, Evgeny; Pokryshevskaya, Elena
  21. Examining the Evidence of Purchasing Power Parity by Recursive Mean Adjustment By Hyeongwoo Kim; Young-Kyu Moh
  22. Reproducible Econometric Simulations By Christian Kleiber; Achim Zeileis
  23. TFP convergence across European regions: a comparative spatial dynamics analysis By Adriana Di Liberto; Stefano Usai
  24. Consistent estimation of conditional conservatism By Manuel Cano-Rodríguez; Manuel Núñez-Nickel
  25. A comprehensive literature classification of simulation optimisation methods By Hachicha, Wafik; Ammeri, ahmed; Masmoudi, Faouzi; Chachoub, Habib

  1. By: Gach, Florian; Pötscher, Benedikt M.
    Abstract: Indirect inference estimators (i.e., simulation-based minimum distance estimators) in a parametric model that are based on auxiliary non-parametric maximum likelihood density estimators are shown to be asymptotically normal. If the parametric model is correctly specified, it is furthermore shown that the asymptotic variance-covariance matrix equals the Cramér-Rao bound. These results are based on uniform-in-parameters convergence rates and a uniform-in-parameters Donsker-type theorem for non-parametric maximum likelihood density estimators.
    Keywords: Indirect inference; simulation-based minimum distance estimation; non-parametric maximum likelihood; density estimation; efficiency
    JEL: C13 C14 C15
    Date: 2010–12–16
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:27512&r=ecm
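    Illustration: a minimal, hypothetical sketch of the simulation-based minimum distance (indirect inference) idea discussed above, matching a kernel density estimate of the data against the same statistic computed on model simulations. The gamma model, the evaluation grid and all tuning values are assumptions made for illustration only; this is not the authors' construction, and nothing here reproduces their efficiency results.

      # Illustrative simulation-based minimum distance (indirect inference) estimator.
      # Auxiliary statistic: a fixed-grid kernel density estimate of the data,
      # compared with the same estimate computed on data simulated from the model.
      import numpy as np
      from scipy.optimize import minimize_scalar
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(0)
      y = rng.gamma(shape=2.0, scale=1.5, size=500)   # observed data (true scale = 1.5)
      grid = np.linspace(0.1, 10.0, 40)               # evaluation grid for the auxiliary density
      aux_obs = gaussian_kde(y)(grid)                 # auxiliary statistic from the data

      def smd_objective(scale, n_sim=20):
          """Distance between observed and simulated auxiliary density estimates."""
          dist = 0.0
          sim_rng = np.random.default_rng(1)          # fixed seed: common random numbers
          for _ in range(n_sim):
              y_sim = sim_rng.gamma(shape=2.0, scale=scale, size=y.size)
              dist += np.sum((gaussian_kde(y_sim)(grid) - aux_obs) ** 2)
          return dist / n_sim

      res = minimize_scalar(smd_objective, bounds=(0.5, 5.0), method="bounded")
      print("SMD estimate of the scale parameter:", res.x)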
  2. By: Shuowen Hu; D.S. Poskitt; Xibin Zhang
    Abstract: Kernel density estimation is an important technique for understanding the distributional properties of data. Some investigations have found that the estimation of a global bandwidth can be heavily affected by observations in the tail. We propose to categorize data into low- and high-density regions, to which we assign two different bandwidths called the low-density adaptive bandwidths. We derive the posterior of the bandwidth parameters through the Kullback-Leibler information. A Bayesian sampling algorithm is presented to estimate the bandwidths. Monte Carlo simulations are conducted to examine the performance of the proposed Bayesian sampling algorithm in comparison with the performance of the normal reference rule and a Bayesian sampling algorithm for estimating a global bandwidth. According to Kullback-Leibler information, the kernel density estimator with low-density adaptive bandwidths estimated through the proposed Bayesian sampling algorithm outperforms the density estimators with bandwidth estimated through the two competitors. We apply the low-density adaptive kernel density estimator to the estimation of the bivariate density of daily stock-index returns observed from the U.S. and Australian stock markets. The derived conditional distribution of the Australian stock-index return for a given daily return in the U.S. market enables market analysts to understand how the former market is associated with the latter.
    Keywords: conditional density; global bandwidth; Kullback-Leibler information; marginal likelihood; Markov chain Monte Carlo; S&P500 index
    JEL: C11 C14 C15
    Date: 2010–12
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2010-21&r=ecm
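    Illustration: a stylized sketch of the low-/high-density two-bandwidth idea in the abstract above. The pilot split (lowest 20% of a pilot density) and the two bandwidth values are arbitrary assumptions; in the paper the bandwidths are estimated by a Bayesian sampling (MCMC) algorithm, which is not reproduced here.

      # Kernel density estimator with two region-specific bandwidths: a pilot density
      # splits the sample into low- and high-density observations, each group
      # contributing Gaussian kernels with its own bandwidth.
      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(0)
      x = rng.standard_t(df=4, size=(2, 1000))          # heavy-tailed bivariate sample

      pilot = gaussian_kde(x)                            # pilot estimate with a global bandwidth
      low = pilot(x) < np.quantile(pilot(x), 0.2)        # flag the 20% lowest-density points

      h_low, h_high = 0.9, 0.3                           # assumed bandwidths (the paper estimates these)

      def adaptive_kde(points):
          """Evaluate the two-bandwidth density estimate at given 2 x m points."""
          dens = np.zeros(points.shape[1])
          for xi, h in [(x[:, low], h_low), (x[:, ~low], h_high)]:
              for j in range(xi.shape[1]):
                  d2 = np.sum((points - xi[:, [j]]) ** 2, axis=0)
                  dens += np.exp(-0.5 * d2 / h**2) / (2 * np.pi * h**2)
          return dens / x.shape[1]

      print(adaptive_kde(np.zeros((2, 1))))              # density estimate at the origin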
  3. By: Mikko Packalen (Department of Economics, University of Waterloo); Tony Wirjanto (School of Accounting & Finance and Department of Statistics and Actuarial Science, University of Waterloo)
    Abstract: Selecting an estimator for the variance-covariance matrix is an important step in hypothesis testing. From less robust to more robust, the available choices include: Eicker/White heteroskedasticity-robust standard errors, Newey and West heteroskedasticity-and-autocorrelation-robust standard errors, and cluster-robust standard errors. The rationale for using a less robust covariance matrix estimator is that tests conducted with it can have better power properties. This motivates tests that examine the appropriate level of robustness in covariance matrix estimation. We propose a new robustness testing strategy, and show that it can dramatically improve inference about the proper level of robustness in covariance matrix estimation. Our main focus is on inference about clustering, although the proposed robustness testing strategy can also improve inference about parametric assumptions in covariance matrix estimation, which we demonstrate for the case of testing for heteroskedasticity. We also show why the existing clustering test and other applications of the White (1980) robustness testing approach perform poorly, which to our knowledge has not been well understood. The insight into why this existing testing approach performs poorly is also the basis for the proposed robustness testing strategy.
    JEL: C10 C12 C13 C52
    Date: 2010–11
    URL: http://d.repec.org/n?u=RePEc:wat:wpaper:1012&r=ecm
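    Illustration: the abstract above contrasts levels of robustness in covariance matrix estimation. The sketch below computes two of the standard choices it mentions, Eicker/White heteroskedasticity-robust and cluster-robust OLS standard errors (without finite-sample corrections), on invented clustered data; it does not implement the robustness test proposed in the paper.

      # OLS with Eicker/White (HC0) and cluster-robust sandwich standard errors.
      import numpy as np

      rng = np.random.default_rng(0)
      n_clusters, m = 50, 20
      cluster = np.repeat(np.arange(n_clusters), m)
      x = rng.normal(size=cluster.size)
      u = rng.normal(size=n_clusters)[cluster] + rng.normal(size=cluster.size)  # clustered errors
      y = 1.0 + 0.5 * x + u

      X = np.column_stack([np.ones_like(x), x])
      XtX_inv = np.linalg.inv(X.T @ X)
      beta = XtX_inv @ X.T @ y
      e = y - X @ beta

      # Eicker/White (HC0): middle term is the sum of x_i x_i' e_i^2
      meat_hc = (X * e[:, None]).T @ (X * e[:, None])
      se_hc = np.sqrt(np.diag(XtX_inv @ meat_hc @ XtX_inv))

      # Cluster-robust: middle term is the sum over clusters of (X_g' e_g)(X_g' e_g)'
      meat_cl = np.zeros((2, 2))
      for g in range(n_clusters):
          s = X[cluster == g].T @ e[cluster == g]
          meat_cl += np.outer(s, s)
      se_cl = np.sqrt(np.diag(XtX_inv @ meat_cl @ XtX_inv))

      print("White SE:", se_hc, "  cluster SE:", se_cl)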
  4. By: Boubacar Mainassara, Yacouba; Carbon, Michel; Francq, Christian
    Abstract: Numerous time series admit "weak" autoregressive-moving average (ARMA) representations, in which the errors are uncorrelated but not necessarily independent nor martingale differences. The statistical inference of this general class of models requires the estimation of generalized Fisher information matrices. We give analytic expressions and propose consistent estimators of these matrices, at any point of the parameter space. Our results are illustrated by means of Monte Carlo experiments and by analyzing the dynamics of daily returns and squared daily returns of financial series.
    Keywords: Asymptotic relative efficiency (ARE); Bahadur's slope; Information matrices; Lagrange Multiplier test; Nonlinear processes; Wald test; Weak ARMA models
    JEL: C13 C12 C22 C01
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:27685&r=ecm
  5. By: Erniel B. Barrios; Rouselle F. Lavado (Philippine Institute for Development Studies)
    Abstract: The stochastic frontier model with heterogeneous technical efficiency explained by exogenous variables is augmented with a sparse spatial autoregressive component for cross-section data and a spatial-temporal component for panel data. An estimation procedure that takes advantage of the additivity of the model is proposed, and its computational advantages over simultaneous maximum likelihood estimation of all parameters are exhibited. The technical efficiency estimates are comparable to those from existing models and estimation procedures based on maximum likelihood methods. A spatial or spatial-temporal component can improve estimates of technical efficiency in a production frontier that is usually biased downwards.
    Keywords: stochastic frontier models, technical efficiency, spatial externalities, spatial-temporal model, backfitting
    JEL: C01
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:eab:microe:2434&r=ecm
  6. By: Jian Huang; Masahito Kobayashi; Michael McAleer (University of Canterbury)
    Abstract: This paper analyses the constant elasticity of volatility (CEV) model suggested by Chan et al. (1992). The CEV model without mean reversion is shown to be the inverse Box-Cox transformation of integrated processes asymptotically. It is demonstrated that the maximum likelihood estimator of the power parameter has a nonstandard asymptotic distribution, which is expressed as an integral of Brownian motions, when the data generating process is not mean reverting. However, it is shown that the t-ratio follows a standard normal distribution asymptotically, so that the use of the conventional t-test in analyzing the power parameter of the CEV model is justified even if there is no mean reversion, as is often the case in empirical research. The model may be applied to ultra-high-frequency data.
    Keywords: Box-Cox transformation; Brownian Motion; Constant Elasticity of Volatility; Mean Reversion; Nonstandard distribution
    Date: 2010–12–01
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:10/77&r=ecm
  7. By: Peng-Hsuan Ke (Institute of Economics, Academia Sinica, Taipei, Taiwan); Wen-Jen Tsay (Institute of Economics, Academia Sinica, Taipei, Taiwan)
    Abstract: This paper derives an analytic formula for the likelihood function of the true random effects stochastic frontier model of Greene (2005) with a time span of T = 2. Neither numerical integration nor simulation-based procedures are required for this closed-form approach. Combining the analytic formula with a pairwise likelihood estimator (PLE), we can easily estimate random effects stochastic frontier models with T > 2. Simulations confirm the promising performance of the analytic methodology under the various configurations of data-generating processes considered in this paper. The proposed method is applied to the World Health Organization’s (WHO) panel data on national health care systems.
    Keywords: Random effects, panel stochastic frontier model
    Date: 2010–12
    URL: http://d.repec.org/n?u=RePEc:sin:wpaper:10-a007&r=ecm
  8. By: Dominique Guegan (Centre d'Economie de la Sorbonne - Paris School of Economics); Philippe de Peretti (Centre d'Economie de la Sorbonne)
    Abstract: In this paper, we present a procedure that tests the null of time-homogeneity of the first two moments of a time series. Whereas the structural-break testing literature often focuses on a single kind of alternative, i.e. discrete shifts or smooth transition, our procedure is designed to deal with a broader alternative including i) discrete shifts, ii) smooth transitions, iii) time-varying moments, iv) probability-driven breaks, and v) GARCH or stochastic volatility models for the variance. Our test uses the recently introduced maximum entropy bootstrap, designed to capture both time-dependency and time-heterogeneity. In simulations, our procedure appears to be quite powerful. To some extent, our paper is an extension of Heracleous, Koutris and Spanos (2008).
    Keywords: Test, time-homogeneity, maximum entropy bootstrap.
    JEL: C01 C12 C15
    Date: 2010–12
    URL: http://d.repec.org/n?u=RePEc:mse:cesdoc:10098&r=ecm
  9. By: Tetsuya Takaishi
    Abstract: A Bayesian estimation of a GARCH model is performed for the US Dollar/Japanese Yen exchange rate using the Metropolis-Hastings algorithm with a proposal density given by an adaptive construction scheme. In the adaptive construction scheme the proposal density is assumed to take the form of a multivariate Student's t-distribution, and its parameters are evaluated from the sampled data and updated adaptively during the Markov chain Monte Carlo simulations. We find that the autocorrelation times of the data sampled by the adaptive construction scheme are considerably reduced. We conclude that the adaptive construction scheme works efficiently for Bayesian inference of the GARCH model.
    Date: 2010–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1012.5986&r=ecm
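    Illustration: a bare-bones random-walk Metropolis-Hastings sampler for a Gaussian GARCH(1,1) with flat priors on the admissible region, run on simulated returns. The paper's adaptive scheme, in which a multivariate Student's t proposal is refitted to past draws, is not implemented; the fixed Gaussian proposal here only shows the surrounding machinery.

      import numpy as np

      rng = np.random.default_rng(0)

      def garch_loglik(theta, r):
          """Gaussian GARCH(1,1) log-likelihood; -inf outside the admissible region."""
          omega, alpha, beta = theta
          if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
              return -np.inf
          h = np.empty_like(r)
          h[0] = np.var(r)
          for t in range(1, r.size):
              h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
          return -0.5 * np.sum(np.log(2 * np.pi * h) + r**2 / h)

      # simulate returns from a GARCH(1,1) with parameters (omega, alpha, beta)
      true = (0.05, 0.10, 0.85)
      r = np.zeros(2000)
      h = true[0] / (1 - true[1] - true[2])              # start at the unconditional variance
      for t in range(r.size):
          if t > 0:
              h = true[0] + true[1] * r[t - 1] ** 2 + true[2] * h
          r[t] = np.sqrt(h) * rng.standard_normal()

      # random-walk Metropolis-Hastings with flat priors on the admissible region
      theta = np.array([0.10, 0.20, 0.60])
      ll = garch_loglik(theta, r)
      draws = []
      for _ in range(5000):
          proposal = theta + 0.01 * rng.standard_normal(3)
          ll_prop = garch_loglik(proposal, r)
          if np.log(rng.uniform()) < ll_prop - ll:       # accept with probability min(1, ratio)
              theta, ll = proposal, ll_prop
          draws.append(theta.copy())
      print("posterior means (omega, alpha, beta):", np.mean(draws[1000:], axis=0))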
  10. By: Sermin Gungor; Richard Luger
    Abstract: We develop a finite-sample procedure to test the beta-pricing representation of linear factor pricing models that is applicable even if the number of test assets is greater than the length of the time series. Our distribution-free framework leaves open the possibility of unknown forms of non-normalities, heteroskedasticity, time-varying correlations, and even outliers in the asset returns. The power of the proposed test procedure increases as the time series lengthens and/or the cross-section becomes larger. This stands in sharp contrast to the usual tests, which lose power or may not even be computable if the cross-section is too large. Finally, we revisit the CAPM and the Fama-French three-factor model. Our results strongly support the mean-variance efficiency of the market portfolio.
    Keywords: Econometric and statistical methods; Financial markets
    JEL: C12 C14 C33 G11 G12
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:10-36&r=ecm
  11. By: Carbon, Michel; Francq, Christian
    Abstract: The asymptotic distribution of a vector of autocorrelations of squared residuals is derived for a wide class of asymmetric GARCH models. Portmanteau adequacy tests are deduced. These results are obtained under moment assumptions on the iid process, but fat tails are allowed for the observed process, which is particularly relevant for series of financial returns. A Monte Carlo experiment and an illustration to financial series are also presented.
    Keywords: ARCH models; Leverage effect; Portmanteau test; Goodness-of-fit test; Diagnostic checking
    JEL: C12 C22
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:27686&r=ecm
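    Illustration: a portmanteau statistic built from autocorrelations of squared residuals in Ljung-Box form. The paper's contribution is the correct asymptotic distribution of such statistics after fitting an asymmetric power GARCH model; the naive chi-square reference used below is only a benchmark that is valid for iid input.

      import numpy as np
      from scipy.stats import chi2

      def portmanteau_squared(resid, m=10):
          """Ljung-Box-type statistic on autocorrelations of squared residuals."""
          z = resid**2 - np.mean(resid**2)
          n = z.size
          rho = np.array([np.sum(z[k:] * z[:-k]) / np.sum(z**2) for k in range(1, m + 1)])
          stat = n * (n + 2) * np.sum(rho**2 / (n - np.arange(1, m + 1)))
          return stat, chi2.sf(stat, df=m)          # naive chi-square p-value

      rng = np.random.default_rng(0)
      print(portmanteau_squared(rng.standard_normal(1000)))   # iid input: large p-value expected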
  12. By: Michele Costa (Università di Bologna); Luca De Angelis (Università di Bologna)
    Abstract: A review of model selection procedures in hidden Markov models reveals contrasting evidence about the reliability and precision of the most commonly used methods. In order to evaluate and compare existing proposals, we develop a Monte Carlo experiment which provides powerful insight into the behaviour of the most widespread model selection methods. We find that the number of observations, the conditional state-dependent probabilities, and the latent transition matrix are the main factors influencing information criteria and likelihood ratio test results. We also find evidence that, for shorter univariate time series, AIC strongly outperforms BIC.
    Keywords: Model selection procedure, Hidden Markov model, Monte Carlo experiment, information criteria, likelihood ratio test
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:bot:quadip:104&r=ecm
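    Illustration: the information criteria compared in the study, computed from hypothetical maximized log-likelihoods for candidate numbers of hidden states. The parameter count assumes a univariate Gaussian hidden Markov model; fitting the models themselves (for example by Baum-Welch/EM) is not shown, and the numbers below are invented for illustration.

      import numpy as np

      def hmm_n_params(K):
          # transition matrix (K*(K-1) free), initial distribution (K-1),
          # plus one mean and one variance per state
          return K * (K - 1) + (K - 1) + 2 * K

      def aic_bic(loglik, K, n_obs):
          p = hmm_n_params(K)
          return 2 * p - 2 * loglik, p * np.log(n_obs) - 2 * loglik

      # hypothetical maximized log-likelihoods for K = 2, 3, 4 states on 300 observations
      for K, ll in [(2, -412.3), (3, -401.8), (4, -399.5)]:
          aic, bic = aic_bic(ll, K, 300)
          print(f"K={K}: AIC={aic:.1f}  BIC={bic:.1f}")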
  13. By: Felix Chan; Michael McAleer (University of Canterbury); Marcelo C. Medeiros
    Abstract: Nonlinear time series models, especially those with regime-switching and conditionally heteroskedastic errors, have become increasingly popular in the economics and finance literature. However, much of the research has concentrated on the empirical applications of various models, with little theoretical or statistical analysis associated with the structure of the processes or the associated asymptotic theory. In this paper, we first derive necessary conditions for strict stationarity and ergodicity of three different specifications of the first-order smooth transition autoregressions with heteroskedastic errors. This is important, among other reasons, to establish the conditions under which the traditional LM linearity tests based on Taylor expansions are valid. Second, we provide sufficient conditions for consistency and asymptotic normality of the Quasi-Maximum Likelihood Estimator for a general nonlinear conditional mean model with first-order GARCH errors.
    Keywords: Nonlinear time series; regime-switching; smooth transition; STAR; GARCH; log-moment; moment conditions; asymptotic theory
    JEL: E43 Q11 Q13
    Date: 2010–12–01
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:10/79&r=ecm
  14. By: Mikko Packalen (Department of Economics, University of Waterloo)
    Abstract: This paper presents a new method for estimating social interaction effects. The proposed approach is based on using network interaction structure induced variation in equilibrium influence to construct conditionally balanced interaction structures. As equilibrium influence is determined by the known interaction structure and the unknown endogenous social interaction parameter, interaction structures are constructed for different imputed values of the unknown parameter. Each constructed interaction structure is conditionally balanced in the sense that when it is combined with observations on the outcome variable to construct a new variable, the constructed variable is a valid instrumental variable for the endogenous social interaction regressor if the true and imputed parameter values are the same. Comparison of each imputed value with the associated instrumental variable estimate thus yields a confidence set estimate for the endogenous social interaction parameter as well as for other model parameters. We provide conditions for point identification and partial identification. The contrast between the proposed and existing approaches is stark. In the existing approach instruments are constructed from observations on exogenous variables, whereas in the proposed approach instruments are constructed from observations on the outcome variable. Both approaches have advantages, and the two approaches complement one another. We demonstrate the feasibility of the proposed approach with analyses of the determinants of subjective college completion and income expectations among adolescents in the Add Health data and with Monte Carlo simulations of Erdös-Rényi and small-world networks.
    JEL: C31
    Date: 2010–12
    URL: http://d.repec.org/n?u=RePEc:wat:wpaper:1013&r=ecm
  15. By: Clements, Michael P. (University of Warwick); Galvão, Ana Beatriz (Queen Mary University of London)
    Abstract: We show how to improve the accuracy of real-time forecasts from models that include autoregressive terms by estimating the models on ‘lightly-revised’ data instead of using data from the latest-available vintage. Forecast accuracy is improved by reorganizing the data vintages employed in the estimation of the model in such a way that the vintages used in estimation are of a similar maturity to the data in the forecast loss function. The size of the expected reductions in mean squared error depends on the characteristics of the data revision process. Empirically, we find RMSFE gains of 2-4% when forecasting output growth and inflation with AR models, and gains of the order of 8% with ADL models.
    Keywords: real-time data; news and noise revisions; optimal forecasts; multi-vintage models
    JEL: C53
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:wrk:warwec:953&r=ecm
  16. By: Calhoun, Gray
    Abstract: This paper derives the asymptotic distribution of the F-test for the significance of linear regression coefficients as both the number of regressors, k, and the number of observations, n, increase together so that their ratio remains positive in the limit. The conventional critical values for this test statistic are too small, and the standard version of the F-test is invalid under this asymptotic theory. This paper provides a correction to the F statistic that gives correctly-sized tests under both this paper's limit theory and also under conventional asymptotic theory that keeps k finite. This paper also presents simulations that indicate the new statistic can perform better in small samples than the conventional test. The statistic is then used to reexamine Olivei and Tenreyro's results from "The Timing of Monetary Policy Shocks" (2007, AER) and Sala-i-Martin's results from "I Just Ran Two Million Regressions" (1997, AER).
    Keywords: Dimension Asymptotics; F-Test; Ordinary Least Squares
    JEL: C12 C20
    Date: 2010–12–20
    URL: http://d.repec.org/n?u=RePEc:isu:genres:32216&r=ecm
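    Illustration: a small Monte Carlo of the conventional F-test in the regime the paper studies, with k/n = 0.4 and non-normal errors. The design is an arbitrary assumption and the paper's corrected statistic is not implemented; the sketch only shows the mechanics of the test whose critical values are being questioned.

      import numpy as np
      from scipy.stats import f as f_dist

      rng = np.random.default_rng(0)
      n, k, reps, rejections = 200, 80, 500, 0
      crit = f_dist.ppf(0.95, dfn=k, dfd=n - k - 1)      # conventional 5% critical value

      for _ in range(reps):
          X = rng.standard_normal((n, k))
          y = 1.0 + rng.chisquare(df=3, size=n) - 3.0    # null holds: all slope coefficients are zero
          Z = np.column_stack([np.ones(n), X])
          beta = np.linalg.lstsq(Z, y, rcond=None)[0]
          rss = np.sum((y - Z @ beta) ** 2)
          tss = np.sum((y - y.mean()) ** 2)              # restricted model: intercept only
          F = ((tss - rss) / k) / (rss / (n - k - 1))
          rejections += F > crit

      print("empirical size at nominal 5%:", rejections / reps)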
  17. By: Antipov, Evgeny; Pokryshevskaya, Elena
    Abstract: This paper introduces an approach for automatically detecting segments in which a regression model significantly underperforms and segments with systematically under- or overestimated predictions. This segmentational approach is applicable to various expert systems including, but not limited to, those used for mass appraisal. The proposed approach may be useful for various regression analysis applications, especially those with strong heteroscedasticity. It helps to reveal segments for which separate models or appraiser assistance are desirable. The segmentational approach has been applied to a mass appraisal model based on the Random Forest algorithm.
    Keywords: CART; model diagnostics; mass appraisal; real estate; Random forest; heteroscedasticity
    JEL: C45 L85 C4
    Date: 2010–12–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:27646&r=ecm
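    Illustration: one way to implement the segment-diagnostics idea, growing a shallow regression tree on the signed prediction errors of a primary model so that leaves with large mean errors flag systematically mis-predicted segments. The synthetic data, the choice of a random forest as the primary model and the tree settings are assumptions; in practice out-of-sample errors would be preferable to the in-sample errors used here.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.tree import DecisionTreeRegressor, export_text

      rng = np.random.default_rng(0)
      n = 2000
      area = rng.uniform(30, 150, n)
      district = rng.integers(0, 5, n)
      price = 1000 * area * (1 + 0.1 * district) + rng.normal(0, 20000, n)

      X = np.column_stack([area, district])
      model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, price)
      errors = price - model.predict(X)                  # signed prediction errors

      # shallow tree on the errors: each leaf is a candidate problem segment
      diag_tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=100).fit(X, errors)
      print(export_text(diag_tree, feature_names=["area", "district"]))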
  18. By: Worapree Maneesoonthorn; Gael M. Martin; Catherine S. Forbes; Simone Grose
    Abstract: The object of this paper is to produce distributional forecasts of physical volatility and its associated risk premia using a non-Gaussian, non-linear state space approach. Option and spot market information on the unobserved variance process is captured by using dual 'model-free' variance measures to define a bivariate observation equation in the state space model. The premium for diffusive variance risk is defined as linear in the latent variance (in the usual fashion) whilst the premium for jump variance risk is specified as a conditionally deterministic dynamic process, driven by a function of past measurements. The inferential approach adopted is Bayesian, implemented via a Markov chain Monte Carlo algorithm that caters for the multiple sources of non-linearity in the model and the bivariate measure. The method is applied to empirical spot and option price data for the S&P500 index over the 1999 to 2008 period, with conclusions drawn about investors' required compensation for variance risk during the recent financial turmoil. The accuracy of the probabilistic forecasts of the observable variance measures is demonstrated, and compared with that of forecasts yielded by more standard time series models. To illustrate the benefits of the approach, the posterior distribution is augmented by information on daily returns to produce Value at Risk predictions, as well as being used to yield forecasts of the prices of derivatives on volatility itself. Linking the variance risk premia to the risk aversion parameter in a representative agent model, probabilistic forecasts of relative risk aversion are also produced.
    Keywords: Volatility Forecasting; Non-linear State Space Models; Non-parametric Variance Measures; Bayesian Markov Chain Monte Carlo; VIX Futures; Risk Aversion.
    JEL: C11 C53
    Date: 2010–12–20
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2010-22&r=ecm
  19. By: Simar, Léopold; Vanhems, Anne
    Abstract: In productivity analysis, the performance of production units is measured through the distance of individual decision making units (DMUs) to the technology, which is defined as the frontier of the production set. Most of the existing methods, Farrell-Debreu and Shephard radial measures (input- or output-oriented) and hyperbolic distance functions, rely on multiplicative measures of the distance and so require strictly positive inputs and outputs. This can be critical when the data contain zero or negative values, as in financial databases used to measure fund performance. The directional distance function is an alternative that can be viewed as an additive measure of efficiency. We show in this paper that, using a probabilistic formulation of the production process, the directional distance can be expressed as a simple radial or hyperbolic distance up to a simple transformation of the input/output space. This allows us to propose simple estimation methods and also to transfer easily most of the known properties shared by the radial and hyperbolic distance estimators. In addition, the formulation allows us to define robust directional distances along the lines of alpha-quantile or order-m partial frontiers. Finally, we can also define conditional directional distance functions, conditional on environmental factors. To illustrate the methodology, we show how it can be implemented using a mutual funds database.
    Keywords: Directional distance function; partial frontier; conditional measures of efficiency
    JEL: C13 C14
    Date: 2010–09–30
    URL: http://d.repec.org/n?u=RePEc:tse:wpaper:23435&r=ecm
  20. By: Antipov, Evgeny; Pokryshevskaya, Elena
    Abstract: To the best of the authors' knowledge, this is the first attempt to use Random Forest as a technique for residential real estate mass appraisal. In an empirical study using data on residential apartments, the method performed better than techniques such as CHAID, CART, KNN, multiple regression analysis, Artificial Neural Networks (MLP and RBF) and Boosted Trees. An approach for automatically detecting segments in which a model significantly underperforms, and segments with systematically under- or overestimated predictions, is also introduced. This segmentational approach is applicable to various expert systems including, but not limited to, those used for mass appraisal.
    Keywords: Random forest; mass appraisal; CART; model diagnostics; real estate; automatic valuation model
    JEL: C14 C45 L85
    Date: 2010–07–29
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:27645&r=ecm
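    Illustration: a hold-out comparison of a random forest valuation model against a multiple-regression baseline, in the spirit of the comparison reported above. The synthetic apartment data and all settings are invented for illustration and do not reproduce the paper's data, competitor set or results.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import mean_absolute_percentage_error

      rng = np.random.default_rng(1)
      n = 3000
      area = rng.uniform(30, 150, n)
      floor = rng.integers(1, 20, n)
      district = rng.integers(0, 8, n)
      price = (900 * area * (1 + 0.08 * district)
               + 5000 * np.minimum(floor, 10) + rng.normal(0, 15000, n))

      X = np.column_stack([area, floor, district])
      X_tr, X_te, y_tr, y_te = train_test_split(X, price, test_size=0.3, random_state=0)

      for name, est in [("random forest", RandomForestRegressor(n_estimators=200, random_state=0)),
                        ("linear regression", LinearRegression())]:
          est.fit(X_tr, y_tr)
          print(name, "hold-out MAPE:", mean_absolute_percentage_error(y_te, est.predict(X_te)))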
  21. By: Hyeongwoo Kim; Young-Kyu Moh
    Abstract: This paper revisits the empirical evidence of purchasing power parity under the current float by recursive mean adjustment (RMA) proposed by So and Shin (1999). We first report superior power of the RMA-based unit root test in finite samples relative to the conventional augmented Dickey-Fuller (ADF) test via Monte Carlo experiments for 16 linear and nonlinear autoregressive data generating processes. We find that the more powerful RMA-based unit root test rejects the null hypothesis of a unit root for 16 out of 20 current float real exchange rates relative to the US dollar, while the ADF test rejects only 5 at the 10% significance level. We also find that the computationally simple RMA-based asymptotic confidence interval can provide useful information regarding the half-life of the real exchange rate.
    Keywords: Recursive Mean Adjustment, Finite Sample Performance, Purchasing Power Parity, Half-Life
    JEL: C12 C22 F31
    Date: 2010–12
    URL: http://d.repec.org/n?u=RePEc:abn:wpaper:auwp2010-08&r=ecm
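    Illustration: a stylized version of the recursive mean adjustment idea, in which each observation is demeaned by the average of the preceding observations before a Dickey-Fuller-type regression is run on the adjusted series. Lag augmentation, the exact So and Shin (1999) formulation, and the appropriate critical values (which differ from the standard ADF ones) are all omitted.

      import numpy as np

      def rma_df_stat(y):
          """t-ratio for the unit-root null after recursive mean adjustment."""
          rec_mean = np.cumsum(y)[:-1] / np.arange(1, y.size)   # mean of y_1..y_{t-1}
          dep = y[1:] - rec_mean                                 # y_t     minus recursive mean
          lag = y[:-1] - rec_mean                                # y_{t-1} minus recursive mean
          rho = np.sum(lag * dep) / np.sum(lag**2)
          resid = dep - rho * lag
          se = np.sqrt(np.sum(resid**2) / (dep.size - 1) / np.sum(lag**2))
          return (rho - 1) / se

      rng = np.random.default_rng(0)
      random_walk = np.cumsum(rng.standard_normal(300))
      ar_stationary = np.zeros(300)
      for t in range(1, 300):
          ar_stationary[t] = 0.9 * ar_stationary[t - 1] + rng.standard_normal()
      print("t-stat, random walk:", rma_df_stat(random_walk))
      print("t-stat, stationary AR(1):", rma_df_stat(ar_stationary))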
  22. By: Christian Kleiber; Achim Zeileis (University of Basel)
    JEL: C C C
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:bsl:wpaper:11/10&r=ecm
  23. By: Adriana Di Liberto; Stefano Usai
    Abstract: This paper proposes a fixed-effects panel methodology that enables us to simultaneously take into account both TFP and traditional neoclassical convergence. We analyse a sample of 199 regions in the EU15 (plus Norway and Switzerland) between 1985 and 2006 and find no overall process of TFP convergence, as TFP dispersion is virtually constant across the two sub-periods. This result proves robust to the use of different estimation procedures such as simple LSDV, spatially corrected LSDV, Kiviet-corrected LSDV, and GMM à la Arellano and Bond. However, we also show that this absence of a strong process of global TFP convergence hides interesting dynamic patterns across regions. These patterns are revealed by the use of recent exploratory spatial data techniques that enable us to obtain a complete picture of the complex EU cross-region dynamics. We find that, between 1985 and 2006, there have been numerous regional miracles and disasters in terms of TFP performance and that polarization patterns have changed significantly over time. Overall, the results suggest that a few TFP leaders are emerging and distancing themselves from the rest, while the cluster of low-TFP regions is growing.
    Keywords: TFP; technology catching up; panel data; exploratory spatial data analysis
    JEL: C23 O33 O47 R11
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:cns:cnscwp:201030&r=ecm
  24. By: Manuel Cano-Rodríguez; Manuel Núñez-Nickel
    Abstract: In this paper, we propose an econometric model that presents three advantages relative to the Basu model: (1) it is robust to the aggregation problem; that is, we prove that the Basu model produces inconsistent estimates of conditional conservatism and that this problem is solved by our proposal; (2) it can produce firm-specific measures of conservatism using time series; and (3) it completes the understanding of the intercept in the Basu model by breaking it down into unconditional conservatism and the reversion of the differences between market and book values of equity. In other words, we can provide firm-specific measures of both conditional and unconditional conservatism with the same model. We demonstrate all these theoretical assertions using simulated data.
    Keywords: Accounting conservatism, Conditional conservatism, Unconditional conservatism, The Basu model, Aggregation effect
    Date: 2010–11
    URL: http://d.repec.org/n?u=RePEc:cte:idrepe:id-10-07&r=ecm
  25. By: Hachicha, Wafik; Ammeri, ahmed; Masmoudi, Faouzi; Chachoub, Habib
    Abstract: Simulation Optimization (SO) provides a structured approach to system design and configuration when analytical expressions for input/output relationships are unavailable. Several excellent surveys have been written on this topic, each concentrating on only a few classification criteria. This paper presents a literature survey covering all classification criteria for SO techniques, according to problem characteristics such as the shape of the response surface (global versus local optimization), the objective function (single or multiple objectives) and the parameter space (discrete or continuous parameters). The survey focuses specifically on SO problems that involve a single performance measure.
    Keywords: Simulation Optimization; classification methods; literature survey
    JEL: C44 C61 C15 Z11
    Date: 2010–05–24
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:27652&r=ecm

This nep-ecm issue is ©2011 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.