nep-ecm New Economics Papers
on Econometrics
Issue of 2009‒05‒16
34 papers chosen by
Sune Karlsson
Orebro University

  1. Estimating structural VARMA models with uncorrelated but non-independent error terms By Boubacar Mainassara, Yacouba; Francq, Christian
  2. Alternative estimating and testing empirical strategies for fractional regression models By Esmeralda A. Ramalho; Joaquim J.S. Ramalho; José M.R. Murteira
  3. Inconsistency of the QMLE and asymptotic normality of the weighted LSE for a class of conditionally heteroscedastic models. By Francq, Christian; Zakoian, Jean-Michel
  4. A GENERALIZED ASYMMETRIC STUDENT-T DISTRIBUTION WITH APPLICATION TO FINANCIAL ECONOMETRICS By John Galbraith; Dongming Zhu
  5. Concepts and tools for nonlinear time series modelling By Amendola, Alessandra; Christian, Francq
  6. Nonparametric vs parametric binary choice models: An empirical investigation By Bontemps, Christophe; Racine, Jeffrey S.; Simioni, Michel
  7. Merits and drawbacks of variance targeting in GARCH models By Francq, Christian; Horvath, Lajos; Zakoian, Jean-Michel
  8. Exact goodness-of-fit tests for censored data By Aurea Grane
  9. OLS Estimator for a Mixed Regressive, Spatial Autoregressive Model: Extended Version By Mynbaev, Kairat
  10. Information Theoretic Estimators of the First-Order Spatial Autoregressive Model By Perevodchikov, Evgeniy V.
  11. LINEAR DISCRIMINANT RULES FOR HIGH-DIMENSIONAL CORRELATED DATA: ASYMPTOTIC AND FINITE SAMPLE RESULTS By Pedro Duarte Silva
  12. Estimating Mixed Logit Recreation Demand Models With Large Choice Sets By Domanski, Adam
  13. The Structural Estimation of Principal-Agent Models by Least Squares: Evidence from Land Tenancy in Madagascar By Brown, Zachary S.; Bellemare, Marc F.
  14. Hyper-spherical and Elliptical Stochastic Cycles By Luati, Alessandra; Proietti, Tommaso
  15. A note on maximum likelihood estimation of a Pareto mixture By Marco Bee; Roberto Benedetti; Giuseppe Espa
  16. Performance of combined double seasonal univariate time series models for forecasting water demand By Jorge Caiado
  17. A Test for the Presence of Central Bank Intervention in the Foreign Exchange Market With an Application to the Bank of Canada By Douglas James Hodgson
  18. Nonlinearity and Temporal Dependence By Xiaohong Chen; Lars P. Hansen; Marine Carrasco
  19. Identifying common dynamic features in stock returns By Jorge Caiado; Nuno Crato
  20. Support vector machines with two support vectors By Marcin Owczarczuk
  21. Forecasting with Universal Approximators and a Learning Algorithm By Anders Bredahl Kock
  22. A State Dependent Regime Switching Model of Dynamic Correlations By Tejeda, Hernan A.; Goodwin, Barry K.; Pelletier, Denis
  23. Statistical vs. Economic Significance in Economics and Econometrics: Further comments on McCloskey & Ziliak By Tom Engsted
  24. Finite State Markov-Chain Approximations to Highly Persistent Processes By Kopecky, Karen A.; Suen, Richard M. H.
  25. An Analysis of Rank Ordered Data By Paudel, Krishna P.; Poudel, Biswo N.; Dunn, Michael A.; Pandit, Mahesh
  26. THE CALIBRATION OF PROBABILISTIC ECONOMIC FORECASTS By John Galbraith; Simon van Norden
  27. Land Use Change: A Spatial Multinomial Choice Analysis By Carrion-Flores, Carmen E.; Flores-Lagunes, Alfonso; Guci, Ledia
  28. Bayesian Option Pricing Using Mixed Normal Heteroskedasticity Models By Jeroen Rombouts; Lars Peter Stentoft
  29. Detecting Mean Reversion in Real Exchange Rates from a Multiple Regime STAR Model By Frédérique Bec; Mélika Ben Salem; Marine Carrasco
  30. Sufficient Statistics for Measuring the Value of Changes in Local Public Goods: Does Chetty's Framework Inform Lind? By Klaiber, H. Allen; Smith, V. Kerry
  31. The Almost Ideal and Translog Demand Systems By Holt, Matthew T.; Goodwin, Barry K.
  32. Bayesian Estimation of The Impacts of Food Safety Information on Household Demand for Meat and Poultry By Taylor, Mykel R.; Phaneuf, Daniel
  33. Food Safety and Spinach Demand: A Shock Correction Model By Arnade, Carlos; Calvin, Linda; Kuchler, Fred
  34. The Econometric Specification of Input Demand Systems Implied by Cost Function Representations By Keith R. McLaren; Xueyan Zhao

  1. By: Boubacar Mainassara, Yacouba; Francq, Christian
    Abstract: The asymptotic properties of the quasi-maximum likelihood estimator (QMLE) of vector autoregressive moving-average (VARMA) models are derived under the assumption that the errors are uncorrelated but not necessarily independent. Relaxing the independence assumption considerably extends the range of application of VARMA models and makes it possible to cover linear representations of general nonlinear processes. Conditions are given for the consistency and asymptotic normality of the QMLE. Particular attention is given to the estimation of the asymptotic variance matrix, which may be very different from that obtained in the standard framework. Modified versions of the Wald, Lagrange Multiplier and Likelihood Ratio tests are proposed for testing linear restrictions on the parameters.
    Keywords: Echelon form; Lagrange Multiplier test; Likelihood Ratio test; Nonlinear processes; QMLE; Structural representation; VARMA models; Wald test.
    JEL: C32
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:15141&r=ecm
  2. By: Esmeralda A. Ramalho (Universidade de Evora, Departamento de Economia, CEFAGE-UE); Joaquim J.S. Ramalho (Universidade de Evora, Departamento de Economia, CEFAGE-UE); José M.R. Murteira (Faculdade de Economia, Universidade de Coimbra, CEMAPRE)
    Abstract: In many economic settings, the variable of interest is often a fraction or a proportion, being defined only on the unit interval. The bounded nature of such variables and, in some cases, the possibility of nontrivial probability mass accumulating at one or both boundaries raise some interesting estimation and inference issues. In this paper we: (i) provide a comprehensive survey of the main alternative models and estimation methods suitable to deal with fractional response variables; (ii) propose a full testing methodology to assess the validity of the assumptions required by each alternative estimator; and (iii) examine the finite sample properties of most of the estimators and tests discussed through an extensive Monte Carlo study. An application concerning corporate capital structure choices is also provided.
    Keywords: Fractional regression models; Conditional mean tests; Non-nested hypotheses; Zero outcomes; Two-part models.
    JEL: C12 C13 C25 G32
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:cfe:wpcefa:2009_08&r=ecm
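    Sketch: One estimator typically covered by surveys of this kind is Bernoulli quasi-maximum likelihood with a logit conditional mean (in the style of Papke and Wooldridge). A minimal Python sketch on synthetic data follows; the data-generating process, variable names and starting values are illustrative assumptions, not the authors' code.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        n = 500
        X = np.column_stack([np.ones(n), rng.normal(size=n)])        # intercept + one regressor
        beta_true = np.array([-0.5, 1.0])
        mu = 1.0 / (1.0 + np.exp(-X @ beta_true))
        y = np.clip(rng.beta(5 * mu, 5 * (1 - mu)), 1e-6, 1 - 1e-6)  # fractions in (0, 1)

        def neg_qll(b):
            g = 1.0 / (1.0 + np.exp(-X @ b))                         # logit conditional mean
            return -np.sum(y * np.log(g) + (1 - y) * np.log(1 - g))

        res = minimize(neg_qll, np.zeros(2), method="BFGS")
        print("fractional logit QML estimates:", res.x)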
  3. By: Francq, Christian; Zakoian, Jean-Michel
    Abstract: This paper considers a class of finite-order autoregressive linear ARCH models. The model captures the leverage effect, allows the volatility to be zero and to reach its minimum for non-zero innovations, and is appropriate for long-memory modeling when infinite orders are allowed. It is shown that the quasi-maximum likelihood estimator is, in general, inconsistent. To solve this problem, we propose a self-weighted least-squares estimator and show that this estimator is asymptotically normal. Furthermore, a score test for conditional homoscedasticity and diagnostic portmanteau tests are developed. The latter have an asymptotic distribution which is far from the standard chi-square. Simulation experiments are carried out to assess the performance of the proposed estimator.
    Keywords: Conditional homoscedasticity testing; Inconsistent estimator; Leverage effect; Linear ARCH; Quasi-maximum likelihood; Weighted least-squares.
    JEL: C13 C22
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:15147&r=ecm
  4. By: John Galbraith; Dongming Zhu
    Abstract: This paper proposes a new class of asymmetric Student-t (AST) distributions, investigates its properties, gives procedures for estimation, and indicates applications in financial econometrics. We derive analytical expressions for the cdf, quantile function, moments, and quantities useful in financial econometric applications, such as the expected shortfall. A stochastic representation of the distribution is also given. Although the AST density does not satisfy the usual regularity conditions for maximum likelihood estimation, we establish consistency, asymptotic normality and efficiency of the ML estimators and derive an explicit analytical expression for the asymptotic covariance matrix. A Monte Carlo study indicates generally good finite-sample conformity with these asymptotic properties.
    JEL: C13 C16
    Date: 2009–04
    URL: http://d.repec.org/n?u=RePEc:mcl:mclwop:2009-02&r=ecm
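    Sketch: The paper's AST family has its own parametrization; as a simplified illustration of the skewed-t idea only, the Python snippet below builds a two-piece Student-t density with different scales on each side of the mode. It integrates to one by construction, but it is not the paper's AST density.
        import numpy as np
        from scipy.stats import t

        def two_piece_t_pdf(y, mu=0.0, sig_l=1.0, sig_r=2.0, nu=5.0):
            """Student-t body with scale sig_l left of mu and sig_r right of mu."""
            scale = np.where(y < mu, sig_l, sig_r)
            return 2.0 / (sig_l + sig_r) * t.pdf((y - mu) / scale, df=nu)

        y = np.linspace(-6.0, 10.0, 9)
        print(two_piece_t_pdf(y))   # right tail decays more slowly than the left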
  5. By: Amendola, Alessandra; Christian, Francq
    Abstract: Tools and approaches are provided for nonlinear time series modelling in econometrics. A wide range of topics is covered, including probabilistic properties, statistical inference and computational methods. The focus is on applications, but the main ideas behind the mathematical arguments are also provided. Techniques and concepts are illustrated by various examples, Monte Carlo experiments and a real application.
    Keywords: Consistency and asymptotic normality; MCMC algorithms; Mixing; Nonlinear modelling; Stationarity; Time-series forecasting.
    JEL: C22
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:15140&r=ecm
  6. By: Bontemps, Christophe; Racine, Jeffrey S.; Simioni, Michel
    Abstract: The estimation of conditional probability distribution functions (PDFs) in a kernel nonparametric framework has recently received attention. As emphasized by Hall, Racine and Li (2004), these conditional PDFs are extremely useful for a range of tasks including modelling and predicting consumer choice. The aim of this paper is threefold. First, we implement nonparametric kernel estimation of the conditional PDF of a binary choice variable with both continuous and discrete explanatory variables. Second, we address the performance of this nonparametric estimator when compared to a classic off-the-shelf parametric estimator, namely a probit. We propose to evaluate these estimators in terms of their predictive performance, in line with the recent "revealed performance" test proposed by Racine and Parmeter (2009). Third, we provide a detailed discussion of the results, focusing on the environmental insights provided by the two estimators and revealing some patterns that can only be detected using the nonparametric estimator.
    Keywords: Binary choice models, Nonparametric estimation, specification test, tap water demand, Research Methods/ Statistical Methods,
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:ags:aaea09:49286&r=ecm
  7. By: Francq, Christian; Horvath, Lajos; Zakoian, Jean-Michel
    Abstract: Variance targeting estimation is a technique used to alleviate the numerical difficulties encountered in the quasi-maximum likelihood (QML) estimation of GARCH models. It relies on a reparameterization of the model and a first-step estimation of the unconditional variance. The remaining parameters are estimated by QML in a second step. This paper establishes the asymptotic distribution of the estimators obtained by this method in univariate GARCH models. Comparisons with the standard QML are provided and the merits of the variance targeting method are discussed. In particular, it is shown that when the model is misspecified, the VTE can be superior to the QMLE for long-term prediction or Value-at-Risk calculation. An empirical application based on stock market indices is proposed.
    Keywords: Consistency and Asymptotic Normality; GARCH; Heteroskedastic Time Series; Quasi Maximum Likelihood Estimation; Value-at-Risk; Variance Targeting Estimator.
    JEL: C13 C22
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:15143&r=ecm
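    Sketch: A minimal Python illustration of the two-step variance-targeting idea for a GARCH(1,1): step one fixes the unconditional variance at the sample variance, step two maximizes the Gaussian quasi-likelihood over (alpha, beta) only. The recursion initialization, optimizer and data below are placeholder assumptions, not the authors' implementation.
        import numpy as np
        from scipy.optimize import minimize

        def vte_garch11(eps):
            s2_bar = np.var(eps)                  # step 1: target the unconditional variance
            def neg_qll(theta):
                a, b = theta
                if a < 0 or b < 0 or a + b >= 1:  # keep the reparameterized model valid
                    return np.inf
                omega = s2_bar * (1 - a - b)      # variance-targeting reparameterization
                h = np.empty_like(eps)
                h[0] = s2_bar
                for t in range(1, len(eps)):
                    h[t] = omega + a * eps[t-1]**2 + b * h[t-1]
                return 0.5 * np.sum(np.log(h) + eps**2 / h)
            a, b = minimize(neg_qll, x0=[0.05, 0.90], method="Nelder-Mead").x
            return s2_bar * (1 - a - b), a, b     # (omega, alpha, beta)

        rng = np.random.default_rng(1)
        eps = 0.1 * rng.normal(size=2000)         # placeholder demeaned return series
        print(vte_garch11(eps))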
  8. By: Aurea Grane
    Abstract: The statistic introduced in Fortiana and Grané (2003) is modified so that it can be used to test the goodness-of-fit of a censored sample, when the distribution function is fully specified. Exact and asymptotic distributions of three modified versions of this statistic are obtained and exact critical values are given for different sample sizes. Empirical power studies show the good performance of these statistics in detecting symmetrical alternatives.
    Keywords: Goodness-of-fit, Censored Samples, Maximum Correlation, Exact Distribution, L-statistics
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws092910&r=ecm
  9. By: Mynbaev, Kairat
    Abstract: We find the asymptotic distribution of the OLS estimator of the parameters $\beta$ and $\rho$ in the mixed spatial model with exogenous regressors $Y_n = X_n\beta + \rho W_n Y_n + V_n$. The exogenous regressors may be bounded or growing, like polynomial trends. The assumption about the spatial matrix $W_n$ is appropriate for the situation when each economic agent is influenced by many others. The error term is a short-memory linear process. The key finding is that in general the asymptotic distribution contains both linear and quadratic forms in standard normal variables and is not normal.
    Keywords: $L_p$-approximability; mixed spatial model; OLS asymptotics
    JEL: C02 C31 C01
    Date: 2009–05–10
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:15153&r=ecm
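    Sketch: A Python simulation of the model Y_n = X_n*beta + rho*W_n*Y_n + V_n and its OLS estimator, the regression of Y on [X, W*Y]. The particular W used here (each agent influenced by 20 randomly drawn others with equal weight) is one illustrative choice consistent with the "influenced by many others" assumption, not the paper's exact setting.
        import numpy as np

        rng = np.random.default_rng(2)
        n, k = 400, 20
        W = np.zeros((n, n))
        for i in range(n):                            # each agent influenced by k random others
            nbrs = rng.choice(np.delete(np.arange(n), i), size=k, replace=False)
            W[i, nbrs] = 1.0 / k                      # row-normalized weights
        X = np.column_stack([np.ones(n), rng.normal(size=n)])
        beta, rho = np.array([1.0, 2.0]), 0.4
        Y = np.linalg.solve(np.eye(n) - rho * W, X @ beta + rng.normal(size=n))

        Z = np.column_stack([X, W @ Y])               # regressors of the OLS estimator
        print(np.linalg.lstsq(Z, Y, rcond=None)[0])   # OLS point estimates of (beta, rho)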
  10. By: Perevodchikov, Evgeniy V.
    Abstract: Information theoretic estimators for the first-order spatial autoregressive model are considered. Extensive Monte Carlo experiments are used to compare the finite sample performance of traditional estimators with that of three information theoretic estimators: maximum empirical likelihood, maximum empirical exponential likelihood, and maximum log-Euclidean likelihood. It is found that the information theoretic estimators are robust to the specification of spatial autocorrelation and dominate the traditional estimators in finite samples. Finally, the proposed estimators are applied to an illustrative example of hedonic housing pricing.
    Keywords: information theoretic estimators, the first-order spatial autoregressive model, Research Methods/ Statistical Methods,
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:ags:aaea09:49491&r=ecm
  11. By: Pedro Duarte Silva (Faculdade de Economia e Gestão, Universidade Católica Portuguesa (Porto))
    Abstract: A new class of linear discrimination rules, designed for problems with many correlated variables, is proposed. This proposal tries to incorporate the most important patterns revealed by the empirical correlations and accurately approximate the optimal Bayes rule as the number of variables increases. In order to achieve this goal, the new rules rely on covariance matrix estimates derived from Gaussian factor models with small intrinsic dimensionality. Asymptotic results, based on an analysis that allows the number of variables to grow faster than the number of observations, show that the worst possible expected error rate of the proposed rules converges to the error of the optimal Bayes rule when the postulated model is true, and to a slightly larger constant when this model is a close approximation to the data generating process. Simulation results suggest that, in the data conditions they were designed for, the new rules can clearly outperform both Fisher's and naive linear discriminant rules.
    Keywords: Discriminant Analysis, High Dimensionality, Expected Misclassification Rate, Min-Max Regret
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:cap:mpaper:092009&r=ecm
  12. By: Domanski, Adam
    Abstract: Discrete choice models are widely used in studies of recreation demand. They have proven valuable when modeling situations where decision makers face large choice sets and site substitution is important. However, when the choice set faced by the individual becomes very large (on the order of hundreds or thousands of alternatives), computational limitations make estimation with the full choice set intractable. Sampling of alternatives in a conditional logit framework is an effective method to limit computational burdens while still producing consistent estimates. This method is made possible by the independence of irrelevant alternatives (IIA) property. More advanced mixed logit models account for unobserved preference heterogeneity and overcome the behavioral limitations of the IIA assumption, but in doing so they preclude sampling of alternatives. A method is developed in which a latent class (finite mixture) model is estimated via the expectation-maximization algorithm, thereby allowing consistent sampling of alternatives in a mixed logit model. The method is tested and applied to a Wisconsin recreational fishing survey.
    Keywords: Sampling of alternatives, discrete choice, mixed logit, conditional logit, recreational demand, Wisconsin, fishing, microeconometrics, Environmental Economics and Policy, Research Methods/ Statistical Methods,
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:ags:aaea09:49413&r=ecm
  13. By: Brown, Zachary S.; Bellemare, Marc F.
    Abstract: We develop a method to structurally estimate principal-agent models by ordinary least squares (OLS). We set up a general principal-agent model which explicitly incorporates the wealth levels of each party and the opportunity cost to the agent of entering the contract. This yields an optimal contract that is linearized by way of an Nth order Taylor approximation. This in turn imposes N(3N-1)/2 restrictions on the parameters and yields an empirical test of the canonical principal-agent model. In the application, we consider the case where N = 2 and apply our method to a sample of land tenancy contracts in rural Madagascar. Empirical tests lead to consistent failure to reject the hypotheses derived from our structural model, which lends support to our structural method as well as to the canonical principal-agent model.
    Keywords: Principal-Agent Models, Contract Theory, Structural Estimations, Risk and Uncertainty, C12, C13, D86, O12, Q12,
    Date: 2009–04–30
    URL: http://d.repec.org/n?u=RePEc:ags:aaea09:49368&r=ecm
  14. By: Luati, Alessandra; Proietti, Tommaso
    Abstract: A univariate first order stochastic cycle can be represented as an element of a bivariate first order vector autoregressive process, or VAR(1), where the transition matrix is associated with a Givens rotation. From the geometrical viewpoint, the kernel of the cyclical dynamics is described by a clockwise rotation along a circle in the plane. The reduced form of the cycle is either ARMA(2,1), with complex roots, or AR(1), when the rotation angle equals 2k\pi or (2k+1)\pi, k = 0, 1, ... This paper generalizes this representation in two directions. In the first, the cyclical dynamics originate from the motion of a point along an ellipse. The reduced form is also ARMA(2,1), but the model can account for certain types of asymmetries. The second deals with the multivariate case: the cyclical dynamics result from the projection along one of the coordinate axes of a point moving in R^n along a hypersphere. This is described by a VAR(1) process whose transition matrix is obtained by a sequence of n-dimensional Givens rotations. The reduced form of an element of the system is shown to be ARMA(n, n-1). The properties of the resulting models are analyzed in the frequency domain, and we show that this generalization can account for a multimodal spectral density. The illustrations show that the proposed generalizations can be fitted successfully to some well-known case studies of the econometric and time series literature. For instance, the elliptical model provides a parsimonious but effective representation of the mink-muskrat interaction. The hyperspherical model provides an interesting reinterpretation of the cycle in US Gross Domestic Product quarterly growth and the cycle in the Fortaleza rainfall series.
    Keywords: State space models; Predator-Prey Interaction; Givens Rotations.
    JEL: C32 E32 C22
    Date: 2009–05–11
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:15169&r=ecm
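    Sketch: The circular baseline case that the paper generalizes can be simulated as a bivariate VAR(1) whose transition matrix is a damped Givens rotation, as in the Python snippet below; the damping, frequency and noise scale are arbitrary illustrative values.
        import numpy as np

        rng = np.random.default_rng(3)
        rho, lam, T = 0.95, 2 * np.pi / 20, 300            # damping and frequency (period 20)
        R = rho * np.array([[np.cos(lam), np.sin(lam)],
                            [-np.sin(lam), np.cos(lam)]])  # damped Givens rotation
        x = np.zeros((T, 2))
        for t in range(1, T):
            x[t] = R @ x[t-1] + rng.normal(scale=0.5, size=2)
        cycle = x[:, 0]                                    # the observed cyclical component
        print(cycle[:5])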
  15. By: Marco Bee; Roberto Benedetti; Giuseppe Espa
    Abstract: In this paper we study maximum likelihood estimation of the parameters of a Pareto mixture. Application of standard techniques to a mixture of Paretos is problematic, so we develop two alternative algorithms: the first is based on simulated annealing and the second on cross-entropy minimization. The Pareto distribution is a commonly used model for heavy-tailed data. It is a two-parameter distribution whose shape parameter determines the degree of heaviness of the tail, so that it can be adapted to data with different features. This work is motivated by an application in the operational risk measurement field: we fit a Pareto mixture to operational losses recorded by a bank in two different business lines. Losses below an unknown threshold are discarded, so that the observed data are truncated, and the thresholds used in the two business lines are unknown. Thus, under the assumption that each population follows a Pareto distribution, the appropriate model is a mixture of Paretos in which all the parameters have to be estimated.
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:trn:utwpde:0903&r=ecm
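    Sketch: A simplified Python rendering of the fitting problem: the negative log-likelihood of a two-component Pareto mixture minimized by simulated annealing (scipy's dual_annealing as a stand-in for the authors' algorithm). Unlike the paper, a common known lower threshold is assumed here for brevity.
        import numpy as np
        from scipy.optimize import dual_annealing

        rng = np.random.default_rng(4)
        xm = 1.0                                             # common lower threshold (known here)
        x = np.concatenate([xm * (1 + rng.pareto(1.5, 700)),   # heavy-tailed component
                            xm * (1 + rng.pareto(4.0, 300))])  # lighter-tailed component

        def nll(theta):
            p, a1, a2 = theta
            f1 = a1 * xm**a1 / x**(a1 + 1)                   # Pareto densities on [xm, inf)
            f2 = a2 * xm**a2 / x**(a2 + 1)
            return -np.sum(np.log(p * f1 + (1 - p) * f2))

        res = dual_annealing(nll, bounds=[(0.01, 0.99), (0.1, 10.0), (0.1, 10.0)], seed=5)
        print(res.x)                                         # (mixing weight, shape 1, shape 2)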
  16. By: Jorge Caiado (CEMAPRE, School of Economics and Management (ISEG), Technical University of Lisbon)
    Abstract: In this article, we examine the daily water demand forecasting performance of double seasonal univariate time series models (Holt-Winters, ARIMA and GARCH) based on multi-step ahead forecast mean squared errors. A within-week seasonal cycle and a within-year seasonal cycle are accommodated in the various model specifications to capture both seasonalities. We investigate whether combining forecasts from different methods for different origins and horizons could improve forecast accuracy. The analysis is made with daily data for water consumption in Granada, Spain.
    Keywords: ARIMA, Combined forecasts, Double seasonality, Exponential Smoothing, Forecasting, GARCH, Water demand
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:cma:wpaper:0903&r=ecm
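    Sketch: One simple way to combine forecasts from different methods is with inverse-MSE weights computed from past errors at a given horizon; the Python snippet below is a generic illustration of the combination idea, not necessarily the paper's scheme, and the numbers are placeholders.
        import numpy as np

        def inverse_mse_weights(errors):
            """errors: (n_obs, n_models) past forecast errors at one horizon."""
            inv_mse = 1.0 / np.mean(errors**2, axis=0)
            return inv_mse / inv_mse.sum()

        rng = np.random.default_rng(6)
        past = rng.normal(scale=[1.0, 1.5, 2.0], size=(100, 3))  # e.g. HW, ARIMA, GARCH errors
        w = inverse_mse_weights(past)
        forecasts = np.array([102.0, 98.0, 105.0])               # hypothetical one-step forecasts
        print(w, w @ forecasts)                                  # weights and combined forecast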
  17. By: Douglas James Hodgson
    Abstract: We propose a general non-linear simultaneous equations framework for the econometric analysis of models of intervention in foreign exchange markets by central banks in response to deviations of exchange rates from target levels. We consider the instrumental variables estimation of possibly non-linear response functions and tests of intervention when the functional form may be non-linear, asymmetric, and may contain unknown shape parameters. The methodology applies techniques developed for testing in the presence of nuisance parameters unidentified under a null hypothesis to a nonlinear simultaneous equations model. We report the results of an empirical analysis of activity of the Bank of Canada, for the period from 1953-2006, with regard to the Canada-U.S. exchange rate, with changes in foreign reserves proxying for intervention activity.
    Keywords: unidentified nuisance parameter, nonlinear simultaneous equations, foreign exchange reserves, policy reaction functions
    Date: 2009–04–01
    URL: http://d.repec.org/n?u=RePEc:cir:cirwor:2009s-14&r=ecm
  18. By: Xiaohong Chen; Lars P. Hansen; Marine Carrasco
    Abstract: Nonlinearities in the drift and diffusion coefficients influence temporal dependence in scalar diffusion models. We study this link using two notions of temporal dependence: β-mixing and ρ-mixing. We show that β-mixing and ρ-mixing with exponential decay are essentially equivalent concepts for scalar diffusions. For stationary diffusions that fail to be ρ-mixing, we show that they are still β-mixing except that the decay rates are slower than exponential. For such processes we find transformations of the Markov states that have finite variances but infinite spectral densities at frequency zero. Some have spectral densities that diverge at frequency zero in a manner similar to that of stochastic processes with long memory. Finally we show how nonlinear, state-dependent, Poisson sampling alters the unconditional distribution as well as the temporal dependence.
    Keywords: Mixing, Diffusion, Strong dependence, Long memory, Poisson sampling
    JEL: C12 C13 C22 C50
    Date: 2009–05–01
    URL: http://d.repec.org/n?u=RePEc:cir:cirwor:2009s-17&r=ecm
  19. By: Jorge Caiado (CEMAPRE, School of Economics and Management (ISEG), Technical University of Lisbon); Nuno Crato (CEMAPRE, School of Economics and Management (ISEG), Technical University of Lisbon)
    Abstract: This paper proposes volatility- and spectral-based methods for cluster analysis of stock returns. Using information about both the estimated parameters in the threshold GARCH (or TGARCH) equation and the periodogram of the squared returns, we compute a distance matrix for the stock returns. Clusters are formed by looking at the hierarchical structure tree (or dendrogram) and the computed principal coordinates. We employ these techniques to investigate the similarities and dissimilarities between the "blue-chip" stocks used to compute the Dow Jones Industrial Average (DJIA) index.
    Keywords: Asymmetric effects, Cluster analysis, DJIA stock returns, Periodogram, Threshold GARCH model, Volatility
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:cma:wpaper:0902&r=ecm
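    Sketch: The spectral half of the procedure can be illustrated in Python: compute the log-periodogram of each squared-return series, take Euclidean distances between them, and build the hierarchical structure tree. Synthetic series stand in for the DJIA stocks, and the TGARCH-parameter part of the distance is omitted.
        import numpy as np
        from scipy.cluster.hierarchy import linkage
        from scipy.spatial.distance import pdist

        def log_periodogram(r):
            s = r**2 - np.mean(r**2)                   # demeaned squared returns
            I = np.abs(np.fft.rfft(s))**2 / len(s)     # periodogram
            return np.log(I[1:])                       # drop frequency zero

        rng = np.random.default_rng(7)
        returns = rng.normal(size=(5, 512))            # 5 placeholder return series
        features = np.array([log_periodogram(r) for r in returns])
        tree = linkage(pdist(features), method="complete")  # hierarchical structure tree
        print(tree)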
  20. By: Marcin Owczarczuk (Department of Applied Econometrics, Warsaw School of Economics)
    Abstract: In this article we present a new class of support vector machines for the binary classification task. Our support vector machines are constructed using only two support vectors and have very low Vapnik-Chervonenkis dimension, so they generalize well. Geometrically, our approach is based on searching for a suitable pair of observations from different classes of the explained variable. Once this pair is found, the discriminant hyperplane becomes orthogonal to the line connecting these observations. This method deals well with data sets with a large number of features and a small number of observations, like gene expression data. We illustrate the performance of our classification method using gene expression data and show that it is superior to other classifiers, especially to diagonal linear discriminant analysis and k-nearest neighbors, which achieved the lowest error rate in previous studies of tumor classification.
    Keywords: support vector machines, Vapnik-Chervonenkis dimension, microarray experiment, tumor classification.
    Date: 2009–05–01
    URL: http://d.repec.org/n?u=RePEc:wse:wpaper:35&r=ecm
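    Sketch: A brute-force Python rendering of the geometric idea: for each pair of opposite-class observations, take the hyperplane orthogonal to the line joining them, passing through its midpoint, and keep the pair with the lowest training error. The pair-selection rule here is an illustrative assumption; the paper's search procedure may differ.
        import numpy as np

        def two_vector_classifier(X, y):
            """Search opposite-class pairs; keep the midpoint hyperplane with lowest error."""
            best, rule = np.inf, None
            for a in X[y == 1]:
                for b in X[y == -1]:
                    w = a - b                          # normal to the candidate hyperplane
                    c = w @ (a + b) / 2.0              # threshold at the pair's midpoint
                    err = np.mean(np.sign(X @ w - c) != y)
                    if err < best:
                        best, rule = err, (w, c)
            return rule, best

        rng = np.random.default_rng(8)
        X = np.vstack([rng.normal(1, 1, (30, 50)), rng.normal(-1, 1, (30, 50))])
        y = np.r_[np.ones(30), -np.ones(30)]           # many features, few observations
        (w, c), err = two_vector_classifier(X, y)
        print("training error:", err)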
  21. By: Anders Bredahl Kock (Aarhus University and CREATES)
    Abstract: This paper applies three universal approximators for forecasting: Artificial Neural Networks, Kolmogorov-Gabor polynomials, and Elliptic Basis Function Networks. Even though forecast combination has a long history in econometrics, the focus has not been on proving loss bounds for the combination rules applied. We apply the Weighted Average Algorithm (WAA) of Kivinen and Warmuth (1999), for which such loss bounds exist. Specifically, one can bound the worst-case performance of the WAA relative to the performance of the best single model in the set of models combined. The use of universal approximators along with a combination scheme for which explicit loss bounds exist should give a solid theoretical foundation to the way the forecasts are produced. The practical performance is investigated by considering various monthly postwar macroeconomic data sets for the G7 as well as the Scandinavian countries.
    Keywords: Forecasting, Universal Approximators, Elliptic Basis Function Network, Forecast Combination, Weighted Average Algorithm
    JEL: C22 C45 C53
    Date: 2009–05–11
    URL: http://d.repec.org/n?u=RePEc:aah:create:2009-18&r=ecm
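    Sketch: The exponential-update form of the Weighted Average Algorithm: predict with a weighted mean of the experts' forecasts, then multiply each expert's weight by the exponential of its negative scaled loss. The learning rate, loss function and data below are placeholders; the exact tuning that yields the loss bounds is in Kivinen and Warmuth (1999).
        import numpy as np

        def waa(experts, outcomes, eta=0.5):
            """experts: (T, K) forecasts; returns the combined one-step forecasts."""
            T, K = experts.shape
            w, preds = np.ones(K) / K, np.empty(T)
            for t in range(T):
                preds[t] = w @ experts[t]              # weighted-average forecast
                w *= np.exp(-eta * (experts[t] - outcomes[t])**2)
                w /= w.sum()                           # exponential loss update
            return preds

        rng = np.random.default_rng(9)
        y = rng.normal(size=200)
        experts = y[:, None] + rng.normal(scale=[0.2, 1.0, 3.0], size=(200, 3))
        print(np.mean((waa(experts, y) - y)**2))       # close to the best expert's MSE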
  22. By: Tejeda, Hernan A.; Goodwin, Barry K.; Pelletier, Denis
    Abstract: We extend the Regime Switching for Dynamic Correlations (RSDC) model of Pelletier (Journal of Econometrics, 2006) to determine the effect of underlying fundamental variables on the evolution of the dynamic correlations between multiple time series. By introducing state-dependent transition probabilities, governed by a Markov chain, into the switching process between regimes, we are able to identify potential thresholds and spillover effects in the dynamic process. In addition, asymmetric correlations between the series are determined. We simulate data for multiple series and find an initially better fit for state-dependent transition probabilities than for constant transition probabilities in the regime-switching model. Capturing the dynamic interrelationships between multiple series or markets more precisely conveys many benefits, including potential efficiency gains from related operations, a means of determining the effects of shocks from related variables, and improvements in hedging operations.
    Keywords: dynamic correlations, regime switching, state dependent probabilities, thresholds, spillovers, Research Methods/ Statistical Methods,
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:ags:aaea09:49370&r=ecm
  23. By: Tom Engsted (School of Economics and Management, University of Aarhus and CREATES)
    Abstract: I comment on the controversy between McCloskey & Ziliak and Hoover & Siegler on statistical versus economic significance, in the March 2008 issue of the Journal of Economic Methodology. I argue that while McCloskey & Ziliak are right in emphasizing ’real error’, i.e. non-sampling error that cannot be eliminated through specification testing, they fail to acknowledge those areas in economics, e.g. rational expectations macroeconomics and asset pricing, where researchers clearly distinguish between statistical and economic significance and where statistical testing plays a relatively minor role in model evaluation. In these areas models are treated as inherently misspecified and, consequently, are evaluated empirically by other methods than statistical tests. I also criticise McCloskey & Ziliak for their strong focus on the size of parameter estimates while neglecting the important question of how to obtain reliable estimates, and I argue that significance tests are useful tools in those cases where a statistical model serves as input in the quantification of an economic model. Finally, I provide a specific example from economics - asset return predictability - where the distinction between statistical and economic significance is well appreciated, but which also shows how statistical tests have contributed to our substantive economic understanding.
    Keywords: Statistical and economic significance, statistical hypothesis testing, model evaluation, misspecified models
    JEL: B41 C10 C12
    Date: 2009–05–04
    URL: http://d.repec.org/n?u=RePEc:aah:create:2009-17&r=ecm
  24. By: Kopecky, Karen A.; Suen, Richard M. H.
    Abstract: This paper re-examines the Rouwenhorst method of approximating first-order autoregressive processes. This method is appealing because it can match the conditional and unconditional mean, the conditional and unconditional variance and the first-order autocorrelation of any AR(1) process. This paper provides the first formal proof of this and other results. When compared with five other methods, the Rouwenhorst method has the best performance in approximating the business cycle moments generated by the stochastic growth model. It is shown that, equipped with the Rouwenhorst method, an alternative approach to generating these moments has a higher degree of accuracy than the simulation method.
    Keywords: Numerical Methods; Finite State Approximations; Optimal Growth Model
    JEL: C63
    Date: 2009–05–08
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:15122&r=ecm
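    Sketch: The standard Rouwenhorst construction for an AR(1) with persistence rho and innovation standard deviation sigma: transition probabilities p = q = (1+rho)/2, a recursive build-up of the transition matrix, and an evenly spaced grid matching the process standard deviation. A textbook sketch, not the authors' code; the example parameter values are arbitrary.
        import numpy as np

        def rouwenhorst(N, rho, sigma):
            p = (1 + rho) / 2                          # symmetric case: p = q
            P = np.array([[p, 1 - p], [1 - p, p]])
            for n in range(3, N + 1):                  # grow the chain one state at a time
                Q = np.zeros((n, n))
                Q[:-1, :-1] += p * P
                Q[:-1, 1:] += (1 - p) * P
                Q[1:, :-1] += (1 - p) * P
                Q[1:, 1:] += p * P
                Q[1:-1, :] /= 2.0                      # interior rows were added twice
                P = Q
            psi = np.sqrt(N - 1) * sigma / np.sqrt(1 - rho**2)
            return np.linspace(-psi, psi, N), P        # grid matches the AR(1) variance

        grid, P = rouwenhorst(7, rho=0.98, sigma=0.01) # a highly persistent process
        print(grid)
        print(P.sum(axis=1))                           # each row sums to one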
  25. By: Paudel, Krishna P.; Poudel, Biswo N.; Dunn, Michael A.; Pandit, Mahesh
    Abstract: Many methods are available to analyze rank-ordered data. We used a spectral density method to analyze Formosan subterranean termite control options ranked by Louisiana homeowners. Respondents were asked to rank termite control options from the most preferred to the least preferred. Spectral analysis results indicated that the most preferred termite control choice is a relatively cheap liquid treatment option ($0.13 per square foot).
    Keywords: FST, rank ordered data, spectral analysis, Research Methods/ Statistical Methods,
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:ags:aaea09:49518&r=ecm
  26. By: John Galbraith; Simon van Norden
    Abstract: A probabilistic forecast is the estimated probability with which a future event will satisfy a particular criterion. One interesting feature of such forecasts is their calibration, or the match between predicted probabilities and actual outcome probabilities. Calibration has been evaluated in the past by grouping probability forecasts into discrete categories. Here we show that we can do so without discrete groupings; the kernel estimators that we use produce efficiency gains and smooth estimated curves relating predicted and actual probabilities. We use such estimates to evaluate the empirical evidence on calibration error in a number of economic applications, including recession and inflation prediction, using both forecasts made and stored in real time and pseudo-forecasts made using the data vintage available at the forecast date. We evaluate outcomes using both first-release outcome measures and later, thoroughly revised data. We find strong evidence of incorrect calibration in professional forecasts of recessions and inflation. We also present evidence of asymmetries in the performance of inflation forecasts based on real-time output gaps.
    URL: http://d.repec.org/n?u=RePEc:mcl:mclwop:2008-05&r=ecm
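    Sketch: A calibration curve without discrete groupings can be estimated with a Nadaraya-Watson kernel regression of realized outcomes on predicted probabilities, as below; the bandwidth and the synthetic miscalibrated forecasts are illustrative assumptions, not the authors' choices.
        import numpy as np

        def calibration_curve(p_hat, outcome, grid, h=0.05):
            """Gaussian-kernel regression of outcomes on forecast probabilities."""
            K = np.exp(-0.5 * ((grid[:, None] - p_hat[None, :]) / h)**2)
            return K @ outcome / K.sum(axis=1)         # smoothed realized frequency

        rng = np.random.default_rng(10)
        p_hat = rng.uniform(0, 1, 2000)                # forecast probabilities
        outcome = (rng.uniform(size=2000) < p_hat**1.3).astype(float)  # miscalibrated DGP
        grid = np.linspace(0.05, 0.95, 10)
        print(np.c_[grid, calibration_curve(p_hat, outcome, grid)])  # 45-degree line if calibrated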
  27. By: Carrion-Flores, Carmen E.; Flores-Lagunes, Alfonso; Guci, Ledia
    Abstract: Urban decentralization and dispersion trends have led to increased conversion of rural lands in many urban peripheries and exurban regions of the U.S. The growth of the exurban areas has outpaced growth in urban and suburban areas, resulting in growth pressures at the urban-rural fringe. A thorough analysis of land use change patterns and the ability to predict these changes are necessary for the effective design of regional environmental, growth, and development policies. We estimate a multinomial discrete choice model with spatial dependence using parcel-level data from Medina County, Ohio. Accounting for spatial dependence should result in improved statistical inference about land use changes. Our spatial model extends the binary choice "linearized logit" model of Klier and McMillen (2008) to a multinomial setting. A small Monte Carlo simulation indicates that this estimator performs reasonably well. Preliminary results suggest that the location of new urban development is guided by a preference for lower-density areas that are nevertheless in proximity to current urban development. In addition, we find significant evidence of spatial dependence in land use decisions.
    Keywords: Land Use Change, Multinomial Logit, Spatial Dependence, Community/Rural/Urban Development, Land Economics/Use, Research Methods/ Statistical Methods, R14, C21, C25,
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:ags:aaea09:49403&r=ecm
  28. By: Jeroen Rombouts; Lars Peter Stentoft
    Abstract: While stochastic volatility models improve on the option pricing error when compared to the Black-Scholes-Merton model, mispricings remain. This paper uses mixed normal heteroskedasticity models to price options. Our model allows for significant negative skewness and time varying higher order moments of the risk neutral distribution. Parameter inference using Gibbs sampling is explained and we detail how to compute risk neutral predictive densities taking into account parameter uncertainty. When forecasting out-of-sample options on the S&P 500 index, substantial improvements are found compared to a benchmark model in terms of dollar losses and the ability to explain the smirk in implied volatilities.
    Keywords: Bayesian inference, option pricing, finite mixture models, out-of-sample prediction, GARCH models
    JEL: C11 C15 C22 G13
    Date: 2009–05–01
    URL: http://d.repec.org/n?u=RePEc:cir:cirwor:2009s-19&r=ecm
  29. By: Frédérique Bec; Mélika Ben Salem; Marine Carrasco
    Abstract: Recent studies on general equilibrium models with transaction costs show that the dynamics of the real exchange rate are necessarily nonlinear. Our contribution to the literature on nonlinear price adjustment mechanisms is threefold. First, we model the real exchange rate by a Multi-Regime Logistic Smooth Transition AutoRegression (MR-LSTAR), allowing for both ESTAR-type and SETAR-type dynamics. This choice is motivated by the fact that even the theoretical models which predict a smooth behavior for the real exchange rate do not rule out the possibility of a discontinuous adjustment as a limit case. Second, we propose two classes of unit-root tests against this MR-LSTAR alternative, based respectively on the likelihood and on an auxiliary model. Their asymptotic distributions are derived analytically. Third, when applied to 28 bilateral real exchange rates, our tests reject the null hypothesis of a unit root for eleven series, bringing evidence in favor of purchasing power parity.
    Keywords: Half-life, purchasing power parity, mixing conditions, smooth transition autoregressive model, unit-root test, real exchange rate
    JEL: C12 C22 F31
    Date: 2009–05–01
    URL: http://d.repec.org/n?u=RePEc:cir:cirwor:2009s-18&r=ecm
  30. By: Klaiber, H. Allen; Smith, V. Kerry
    Abstract: The performance of quasi-experimental methods applied to changes in non-market goods depends on the ability of reduced form models to accurately measure willingness to pay. When exogenous changes are non-marginal, the accuracy of the reduced form approximations is not well understood. Further complicating the performance of reduced form models is that the true representation of the non-market good in household utility functions may differ from the perceptions of that good as captured in the reduced form model. This paper evaluates a series of before/after quasi-experiments where the true model is known and examines the performance of these methods under a variety of conditions. We find that performance is impacted by the scale of the change and that differences in perceptions of the amenity between the reduced form model and the underlying utility function play an important role in determining the performance of quasi-experimental applications. For researchers interested in non-market goods where the true representations of changes in relation to the underlying utility function are unknown, the notion of perceived measures of the non-market good in reduced form models should receive considerable attention.
    Keywords: Welfare Measurement, Quasi-Experiment, Assignment Model, Perceptions, Non-Marginal Change, Open Space, Resource /Energy Economics and Policy,
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:ags:aaea09:49596&r=ecm
  31. By: Holt, Matthew T.; Goodwin, Barry K.
    Abstract: This chapter reviews the specification and application of the Deaton and Muellbauer (1980) Almost Ideal Demand System (AIDS) and the Christensen, Jorgenson, and Lau (1975) translog (TL) demand system. In so doing we examine various refinements to these models, including ways of incorporating demographic effects, methods by which curvature conditions can be imposed, and issues associated with incorporating structural change and seasonal effects. We also review methods for adjusting for autocorrelation in the model's residuals. A set of empirical examples for the AIDS and the log TL version of the translog, based on historical meat price and consumption data for the United States, is also presented.
    Keywords: Almost ideal demand system; Autocorrelation; Curvature; Meat Demand; Translog
    JEL: C32 D12 Q11
    Date: 2009–03–14
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:15092&r=ecm
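    Sketch: The linear approximate (Stone-index) AIDS share equations, w_i = alpha_i + sum_j gamma_ij ln p_j + beta_i ln(x/P*) with ln P* = sum_k w_k ln p_k, can be estimated share-by-share with OLS as in the Python sketch below. The synthetic data and the omission of the symmetry and homogeneity restrictions are simplifying assumptions for illustration.
        import numpy as np

        rng = np.random.default_rng(11)
        T, n_goods = 120, 3
        lnp = rng.normal(size=(T, n_goods))            # log prices
        lnx = rng.normal(loc=5.0, size=T)              # log total expenditure
        w = rng.dirichlet(np.ones(n_goods), size=T)    # placeholder observed budget shares

        stone = np.sum(w * lnp, axis=1)                # ln P*: the Stone price index
        Z = np.column_stack([np.ones(T), lnp, lnx - stone])
        for i in range(n_goods):                       # OLS equation by equation
            coef = np.linalg.lstsq(Z, w[:, i], rcond=None)[0]
            print(f"good {i}: alpha, gammas, beta =", np.round(coef, 3))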
  32. By: Taylor, Mykel R.; Phaneuf, Daniel
    Abstract: Consumer reaction to changes in the amount of food safety information on beef, pork, and poultry available in the media is the focus of this study. Specifically, any differences in consumer reactions due to heterogeneous household characteristics are investigated. The data used in this study are monthly data from the Nielsen Homescan panel and cover the time period January 1998 to December 2005. These panel data contain information on household purchases of fresh meat and poultry as well as demographic characteristics of the participating households. The data used to describe food safety information were obtained from searches of newspapers using the Lexis-Nexis academic search engine. Consumer reactions are modeled in this study using a demand system that allows for both discrete and continuous choice situations. A seemingly unrelated regression (SUR) tobit model is estimated using a Gibbs sampler with data augmentation. A component error structure (random effects model) is incorporated into the SUR tobit model to account for unobserved heterogeneity of households making repeated purchases over time. Estimates of food safety elasticities calculated from the random effects SUR tobit model suggest that food safety information does not have a statistically or economically significant effect on household purchases of meat and poultry.
    Keywords: food safety, panel data, Gibbs sampler, component error, Agricultural and Food Policy, Consumer/Household Economics, Food Consumption/Nutrition/Food Safety,
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:ags:aaea09:49214&r=ecm
  33. By: Arnade, Carlos; Calvin, Linda; Kuchler, Fred
    Abstract: This paper generalizes the standard error correction model and applies this more general modeling procedure to an analysis of the effect of the spinach E. coli outbreak on consumer demand.
    Keywords: E-Coli, Error Correction Model, Spinach, Consumer/Household Economics, Demand and Price Analysis, Food Consumption/Nutrition/Food Safety,
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:ags:aaea09:49208&r=ecm
  34. By: Keith R. McLaren; Xueyan Zhao
    Abstract: In the case of input demand systems based on specification of technology by a Translog cost function, it is common to estimate either a system of share equations alone, or to supplement them with the cost function. Because the shares sum to one (adding up), one of the share equations is excluded. In this paper it is argued that a system of n-1 share equations is essentially incomplete, whereas if the n-1 share equations are supplemented by the cost function the implied error structure is inadmissible. Similarly, if the technology is specified by a normalized quadratic cost function, it is common to estimate either a system of n-1 demand equations alone, or to supplement them with the cost function. In both cases, the implied error structure is again inadmissible.
    Keywords: Cost Function; Input demands; Share equations; Translog; Normalized Quadratic; Error specification.
    JEL: C30 D24
    Date: 2009–04
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2009-3&r=ecm

This nep-ecm issue is ©2009 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.