nep-ecm New Economics Papers
on Econometrics
Issue of 2018‒12‒24
thirty-two papers chosen by
Sune Karlsson
Örebro universitet

  1. Testing explosive bubbles with time-varying volatility By David Harvey; Stephen Leybourne; Yang Zu
  2. Treatment Effects with Multiple Outcomes By John Mullahy
  3. Regime switching panel data models with interactive fixed effects By Tingting Cheng; Jiti Gao; Yayi Yan
  4. Estimating the wrapped stable distribution via indirect inference By Marco Bee
  5. Optimal Pseudo-Gaussian and Rank-Based Random Coefficient Detection in Multiple Regression By Abdelhadi Akharif; Mohamed Fihri; Marc Hallin; Amal Mellouk
  6. HAR Testing for Spurious Regression in Trend By Peter C.B. Phillips; Yonghui Zhang; Xiaohu Wang
  7. Time-varying spectral analysis: Theory and applications By D.M. Nachane
  8. A review of more than one hundred Pareto-tail index estimators By Fedotenkov, Igor
  9. Estimating Value-at-Risk for the g-and-h distribution: an indirect inference approach. By Marco Bee; Julien Hambuckers; Luca Trapin
  10. The influence of renewables on electricity price forecasting: a robust approach By Luigi Grossi; Fany Nan
  11. Testing Fractional Unit Roots with Non-linear Smooth Break Approximations using Fourier functions By Gil-Alana, Luis A.; Yaya, OlaOluwa S
  12. Dynamic Panel Modeling of Climate Change By Peter C.B. Phillips
  13. Testing for strict stationarity in a random coefficient autoregressive model By Lorenzo Trapani
  14. Size matters: Estimation sample length and electricity price forecasting accuracy By Carlo Fezzi; Luca Mosetti
  15. On the estimation of behavioral macroeconomic models via simulated maximum likelihood By Kukacka, Jiri; Jang, Tae-Seok; Sacht, Stephen
  16. Panel Bayesian VAR Modeling for Policy and Forecasting when dealing with confounding and latent effects By Antonio Pacifico
  17. A Frequency-Domain Approach to Dynamic Macroeconomic Models By Tan, Fei
  18. Threshold regression with endogeneity for short panels By Tue Gorgens; Allan H. Würtz
  19. FFORMA: Feature-based forecast model averaging By Pablo Montero-Manso; George Athanasopoulos; Rob J Hyndman; Thiyanga S Talagala
  20. Regime switching in the presence of endogeneity By Tingting Cheng; Jiti Gao; Yayi Yan
  21. Testing for randomness in a random coefficient autoregression model By Lajos Horvath; Lorenzo Trapani
  22. Local Linear Dependence Measure for Functionally Correlated Variables By Loann D. Desboulets; Costin Protopopescu
  23. Counterfactual Analysis Using Censored Duration Data By García Suaza, Andrés Felipe; Delgado González, Miguel Ángel
  24. Fundamentalness, Granger Causality and Aggregation By Mario Forni; Luca Gambetti; Luca Sala
  25. Improved Inference on the Rank of a Matrix By Qihui Chen; Zheng Fang
  26. Shackling the Identification Police? By Christopher J. Ruhm
  27. An Information-Theoretic Approach to Estimating Willingness To Pay for River Recreation Site Attributes By Henry, Miguel; Mittelhammer, Ron; Loomis, John
  28. Identifying the Effect of Persuasion By Sung Jae Jun; Sokbae Lee
  29. Co-movements in Market Prices and Fundamentals: A Semiparametric Multivariate GARCH Approach By Loann D. Desboulets
  30. A Review on Variable Selection in Regression Analysis By Loann D. Desboulets
  31. Using published bid/ask curves to error dress spot electricity price forecasts By Gunnhildur H. Steinbakk; Alex Lenkoski; Ragnar Bang Huseby; Anders Løland; Tor Arne Øigård
  32. Bias of OLS Estimators due to Exclusion of Relevant Variables and Inclusion of Irrelevant Variables By Deepankar Basu

  1. By: David Harvey; Stephen Leybourne; Yang Zu
    Abstract: This paper considers the problem of testing for an explosive bubble in financial data in the presence of time-varying volatility. We propose a weighted least squares-based variant of the Phillips, Wu and Yu (2011) test for explosive autoregressive behaviour. We find that such an approach has appealing asymptotic power properties, with the potential to deliver substantially greater power than the established OLS-based approach for many volatility and bubble settings. Given that the OLS-based test can outperform the weighted least squares-based test for other volatility and bubble specifications, we also suggest a union of rejections procedure that succeeds in capturing the better power available from the two constituent tests for a given alternative. Our approach involves a nonparametric kernel-based volatility function estimator for computation of the weighted least squares-based statistic, together with the use of a wild bootstrap procedure applied jointly to both individual tests, delivering a powerful testing procedure that is asymptotically size-robust to a wide range of time-varying volatility specifications.
    Keywords: Rational bubble; Explosive autoregression; Time-varying volatility; Weighted least squares; Right-tailed unit root testing.
    URL: http://d.repec.org/n?u=RePEc:not:notgts:18/05&r=ecm
  2. By: John Mullahy
    Abstract: This paper proposes strategies for defining, identifying, and estimating features of treatment-effect distributions in contexts where multiple outcomes are of interest. After describing existing empirical approaches used in such settings, the paper develops a notion of treatment preference that is shown to be a feature of standard treatment-effect analysis in the single-outcome case. Focusing largely on binary outcomes, treatment-preference probability treatment effects (PTEs) are defined and are seen to correspond to familiar average treatment effects in the single-outcome case. The paper suggests seven possible characterizations of treatment preference appropriate to multiple-outcome contexts. Under standard assumptions about unconfoundedness of treatment assignment, the PTEs are shown to be point identified for three of the seven characterizations and set identified for the other four. Probability bounds are derived and empirical approaches to estimating the bounds—or the PTEs themselves in the point-identified cases—are suggested. These empirical approaches are straightforward, involving in most instances little more than estimation of binary-outcome probability models of what are commonly known as composite outcomes. The results are illustrated with simulated data and in analyses of two microdata samples. Finally, the main results are extended to situations where the component outcomes are ordered or categorical.
    JEL: C18 D04 I1
    Date: 2018–11
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:25307&r=ecm
  3. By: Tingting Cheng; Jiti Gao; Yayi Yan
    Abstract: In this paper, we introduce a regime switching panel data model with interactive fixed effects. We propose a maximum likelihood estimation method and develop an expectation and conditional maximization algorithm to estimate the unknown parameters. Simulation results show that the algorithm works well in finite samples. The biases of the maximum likelihood estimators are negligible and the root mean squared errors of the maximum likelihood estimators decrease as either the number of cross-sectional units N or the number of time periods T increases.
    Keywords: ECM algorithm, interactive effect, maximum likelihood estimation, panel data model, regime switching.
    JEL: C23 C32
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2018-21&r=ecm
  4. By: Marco Bee
    Abstract: The wrapped stable distribution is a model for non-symmetric circular data. Maximum likelihood estimation is feasible, but computationally expensive and not exact, because the density does not exist in closed form. In light of these difficulties, we develop a constrained indirect inference approach based on a skewed-t auxiliary model. To improve the finite-sample properties of the estimators, we devise a bootstrap-based estimate of the weighting matrix employed in the indirect inference program. The simulation study suggests that the indirect inference estimators are clearly preferable to the maximum likelihood estimators in terms of computing time, and approximately equivalent in terms of root mean squared error.
    Keywords: Indirect inference; directional statistics; stable distribution; weighting matrix
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:trn:utwprg:2018/11&r=ecm
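    The indirect inference recipe behind this paper (and paper 9 below) can be sketched in a few lines: fit a tractable auxiliary model to the observed data, then search for the structural parameter whose simulated data reproduce the same auxiliary fit. The toy below is purely illustrative and not the paper's wrapped-stable/skewed-t pair: it recovers a Laplace scale parameter through a Gaussian auxiliary statistic.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    b_true = 1.5
    data = rng.laplace(scale=b_true, size=5_000)   # "observed" data from the structural model

    # Step 1: fit the auxiliary model to the observed data.
    # Here the auxiliary model is a Gaussian, summarised by its MLE standard deviation.
    beta_obs = data.std()

    # Step 2: for each candidate b, simulate from the structural model and refit the
    # auxiliary model; common random numbers keep the objective smooth in b.
    u = rng.laplace(scale=1.0, size=(20, 5_000))   # 20 unit-scale simulated paths

    def objective(b):
        beta_sim = (b * u).std(axis=1).mean()      # auxiliary statistic under candidate b
        return (beta_obs - beta_sim) ** 2

    # Step 3: minimise the distance between observed and simulated auxiliary fits.
    grid = np.linspace(0.5, 3.0, 251)
    b_hat = grid[np.argmin([objective(b) for b in grid])]
    ```

    The same logic carries over when, as in the paper, the structural density is intractable but simulation is cheap; the weighting of the auxiliary-parameter distance is then the delicate part.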
  5. By: Abdelhadi Akharif; Mohamed Fihri; Marc Hallin; Amal Mellouk
    Abstract: Random coefficient regression (RCR) models are the regression versions of random effects models in analysis of variance and panel data analysis. Optimal detection of the presence of random coefficients (equivalently, optimal testing of the hypothesis of constant regression coefficients) has been an open problem for many years. The simple regression case has been solved recently (Fihri et al. (2017)), and the multiple regression case is considered here. This problem poses several theoretical challenges: (a) a nonstandard ULAN structure, with log-likelihood gradients vanishing at the null hypothesis; (b) a cone-shaped alternative under which traditional maximin-type optimality concepts are no longer adequate; (c) a matrix of nuisance parameters (the correlation structure of the random coefficients) that are not identified under the null but have a very significant impact on local powers. Inspired by Novikov (2011), we propose a new (local and asymptotic) concept of optimality for this problem, and, for specified error densities, derive the corresponding parametrically optimal procedures. A suitable modification of the Gaussian version of the latter is shown to remain valid under arbitrary densities with finite moments of order four, hence qualifies as a pseudo-Gaussian test. The asymptotic performances of those pseudo-Gaussian tests, however, are rather poor under skewed and heavy-tailed densities. We therefore also construct rank-based tests, possibly based on data-driven scores, the asymptotic relative efficiencies of which are remarkably high with respect to their pseudo-Gaussian counterparts.
    Keywords: Random Coefficient; Multiple Regression Model; Local Asymptotic Normality; Pseudo-Gaussian Test; Aligned Rank Test; Cone Alternative
    Date: 2018–12
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/279634&r=ecm
  6. By: Peter C.B. Phillips (Cowles Foundation, Yale University); Yonghui Zhang (Renmin University of China); Xiaohu Wang (The Chinese University of Hong Kong)
    Abstract: The usual t test, the t test based on heteroskedasticity and autocorrelation consistent (HAC) covariance matrix estimators, and the heteroskedasticity and autocorrelation robust (HAR) test are three statistics that are widely used in applied econometric work. The use of these significance tests in trend regression is of particular interest given the potential for spurious relationships in trend formulations. Following a longstanding tradition in the spurious regression literature, this paper investigates the asymptotic and finite sample properties of these test statistics in several spurious regression contexts, including regression of stochastic trends on time polynomials and regressions among independent random walks. Concordant with existing theory (Phillips, 1986, 1998; Sun, 2004, 2014), the usual t test and HAC standardized test fail to control size as the sample size n → ∞ in these spurious formulations, whereas HAR tests converge to well-defined limit distributions in each case and therefore have the capacity to be consistent and control size. However, it is shown that when the number of trend regressors K → ∞, all three statistics, including the HAR test, diverge and fail to control size as n → ∞. These findings are relevant to high dimensional nonstationary time series regressions.
    Keywords: HAR inference, Karhunen-Loève representation, Spurious regression, t-statistics
    JEL: C12 C14 C23
    Date: 2018–12
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:2153&r=ecm
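    The spurious regression phenomenon the paper builds on is easy to reproduce: regressing one random walk on an independent one yields conventional t-statistics that reject far too often. A small pure-NumPy simulation sketch (sample size, seed and replication count are illustrative):

    ```python
    import numpy as np

    def spurious_t(rng, n=500):
        """t-statistic on the slope from regressing one random walk on an independent one."""
        y = np.cumsum(rng.standard_normal(n))          # random walk 1
        x = np.cumsum(rng.standard_normal(n))          # independent random walk 2
        X = np.column_stack([np.ones(n), x])           # constant plus regressor
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        s2 = resid @ resid / (n - 2)
        se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
        return beta[1] / se

    rng = np.random.default_rng(0)
    tstats = np.array([spurious_t(rng) for _ in range(200)])
    rejection_rate = np.mean(np.abs(tstats) > 1.96)    # nominal 5% test rejects far more often
    ```

    Per Phillips (1986), the t-statistic here diverges at rate √n, which is why the usual and HAC-standardized tests cannot control size in such formulations.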
  7. By: D.M. Nachane (Indira Gandhi Institute of Development Research)
    Abstract: Non-stationary time series are a frequently observed phenomenon in several applied fields, particularly physics, engineering and economics. The conventional way of analysing such series has been via stationarity inducing filters. This can interfere with the intrinsic features of the series and induce distortions in the spectrum. To avert this possibility, it might be a better alternative to proceed directly with the series via the so-called time-varying spectrum. This article outlines the circumstances under which such an approach is possible, drawing attention to the practical applicability of these methods. Several methods are discussed and their relative advantages and drawbacks delineated.
    Keywords: Non-stationarity, mixing conditions, oscillatory processes, evolutionary spectrum, ANOVA, decoupling
    JEL: C32
    URL: http://d.repec.org/n?u=RePEc:ind:igiwpp:2018-025&r=ecm
  8. By: Fedotenkov, Igor
    Abstract: This paper reviews more than one hundred Pareto (and equivalent) tail index estimators. It focuses on univariate estimators for nontruncated data. We discuss basic ideas of these estimators and provide their analytical expressions. As samples from heavy-tailed distributions are analysed by researchers from various fields of science, the paper provides nontechnical explanations of the methods, which could be understood by researchers with intermediate skills in statistics. We also discuss strengths and weaknesses of the estimators, if they are known. The paper can be viewed as a catalog or a reference book on Pareto-tail index estimators.
    Keywords: Heavy tails, Pareto distribution, tail index, review, extreme value index
    JEL: C13 C14 C18 C58
    Date: 2018–11–15
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:90072&r=ecm
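    Among the estimators such a review covers, the classic Hill estimator is the most widely used. A minimal sketch (function name and the tuning choice k are illustrative, not from the paper):

    ```python
    import numpy as np

    def hill_estimator(x, k):
        """Hill estimator of the Pareto tail index alpha from the k largest observations."""
        xs = np.sort(x)[::-1]                          # descending order statistics
        log_spacings = np.log(xs[:k]) - np.log(xs[k])  # log-excesses over the (k+1)-th largest
        return 1.0 / log_spacings.mean()

    rng = np.random.default_rng(0)
    alpha_true = 2.0
    # rng.pareto draws Lomax variates; adding 1 gives a standard Pareto with x_min = 1
    sample = rng.pareto(alpha_true, size=100_000) + 1.0
    alpha_hat = hill_estimator(sample, k=2_000)
    ```

    The choice of k, i.e. how far into the tail to look, is the estimator's main tuning problem, and handling it differently is one reason the literature contains a hundred-plus alternatives.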
  9. By: Marco Bee; Julien Hambuckers; Luca Trapin
    Abstract: The g-and-h distribution is a flexible model with desirable theoretical properties. In particular, it is able to handle well the complex behavior of loss data and it is suitable for VaR estimation when large skewness and kurtosis are at stake. However, parameter estimation is difficult, because the density cannot be written in closed form. In this paper we develop an indirect inference method using the skewed-t distribution as instrumental model. We show that the skewed-t is a well suited auxiliary model and study the numerical issues related to its implementation. A Monte Carlo analysis and an application to operational losses suggest that the indirect inference estimators of the parameters and of the VaR outperform the quantile-based estimators.
    Keywords: Value-at-Risk; g-and-h distribution; loss model; indirect inference; simulation; intractable likelihood
    JEL: C15 C46 C51 G22
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:trn:utwprg:2018/08&r=ecm
  10. By: Luigi Grossi (University of Verona); Fany Nan (European Commission's Joint Research Centre (JRC))
    Abstract: In this paper a robust approach to modelling electricity spot prices is introduced. Differently from what has recently been done in the literature on electricity price forecasting, where attention has mainly been drawn to the prediction of spikes, the focus of this contribution is on the robust estimation of nonlinear SETARX models (Self-Exciting Threshold AutoRegressive models with eXogenous regressors). In this way, parameter estimates are not, or only slightly, influenced by the presence of extreme observations, and the large majority of prices, which are not spikes, can be better forecast. A Monte Carlo study is carried out in order to select the best weighting function for Generalized M-estimators of SETAR processes. A robust procedure to select and estimate nonlinear processes for electricity prices is introduced, including robust tests for stationarity and nonlinearity and robust information criteria. The application of the procedure to the Italian electricity market reveals the forecasting superiority of the robust GM-estimator based on the polynomial weighting function with respect to the non-robust Least Squares estimator. Finally, the introduction of external regressors in the robust estimation of SETARX processes contributes to the improvement of the forecasting ability of the model.
    Keywords: Electricity Price, Nonlinear Time Series, Price Forecasting, Robust GM-Estimator, Spikes, Threshold Models
    JEL: C13 C15 C22 C53 Q47
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:ieb:wpaper:doc2018-10&r=ecm
  11. By: Gil-Alana, Luis A.; Yaya, OlaOluwa S
    Abstract: In this paper we present a testing procedure for fractional orders of integration in the context of non-linear terms approximated by Fourier functions. The procedure is a natural extension of the linear method proposed in Robinson (1994) and similar to the one proposed in Cuestas and Gil-Alana (2016) based on Chebyshev polynomials in time. The test statistic has an asymptotic standard normal distribution and several Monte Carlo experiments conducted in the paper show that it performs well in finite samples. Various applications using real life time series, such as US unemployment rates, US GNP and Purchasing Power Parity (PPP) of G7 countries are presented at the end of the paper.
    Keywords: Fractional unit root; Chebyshev polynomial; Monte Carlo simulation; Nonlinearity; Smooth break; Fourier transform
    JEL: C12 C15 C22
    Date: 2018–11–16
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:90516&r=ecm
  12. By: Peter C.B. Phillips (Cowles Foundation, Yale University)
    Abstract: We discuss some conceptual and practical issues that arise from the presence of global energy balance effects on station level adjustment mechanisms in dynamic panel regressions with climate data. The paper provides asymptotic analyses, observational data computations, and Monte Carlo simulations to assess the use of various estimation methodologies, including standard dynamic panel regression and cointegration techniques that have been used in earlier research. The findings reveal massive bias in system GMM estimation of the dynamic panel regression parameters, which arises from fixed effect heterogeneity across individual station level observations. Difference GMM and Within Group (WG) estimation have little bias and WG estimation is recommended for practical implementation of dynamic panel regression with highly disaggregated climate data. Intriguingly from an econometric perspective and importantly for global policy analysis, it is shown that despite the substantial differences between the estimates of the regression model parameters, estimates of global transient climate sensitivity (of temperature to a doubling of atmospheric CO2) are robust to the estimation method employed and to the specific nature of the trending mechanism in global temperature, radiation, and CO2.
    Keywords: Climate modeling, Cointegration, Difference GMM, Dynamic panel, Spatio-temporal modeling, System GMM, Transient climate sensitivity, Within group estimation
    JEL: C32 C33
    Date: 2018–12
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:2150&r=ecm
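    The within-group (WG) estimator the paper recommends amounts to demeaning each unit's series before pooled OLS; in a dynamic panel it carries the familiar Nickell bias of order 1/T, which is small for long station-level climate panels. A toy sketch on simulated data (all dimensions and parameter values are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    N, T, rho = 200, 120, 0.7   # many stations, long time dimension, AR(1) persistence

    # Panel AR(1) with heterogeneous station-level fixed effects
    alpha = rng.normal(0.0, 5.0, size=N)
    y = np.zeros((N, T))
    y[:, 0] = alpha
    for t in range(1, T):
        y[:, t] = alpha * (1 - rho) + rho * y[:, t - 1] + rng.standard_normal(N)

    # Within-group estimation: demean each unit's series, then pool OLS of y_t on y_{t-1}
    ylag, ycur = y[:, :-1], y[:, 1:]
    ylag_d = ylag - ylag.mean(axis=1, keepdims=True)
    ycur_d = ycur - ycur.mean(axis=1, keepdims=True)
    rho_wg = (ylag_d * ycur_d).sum() / (ylag_d ** 2).sum()
    ```

    With T = 120 the Nickell bias is roughly -(1 + rho)/T ≈ -0.014, consistent with the paper's point that WG is well suited to long, highly disaggregated panels.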
  13. By: Lorenzo Trapani
    Abstract: We propose a procedure to decide between the null hypothesis of (strict) stationarity and the alternative of non-stationarity, in the context of a Random Coefficient AutoRegression (RCAR). The procedure is based on randomising a diagnostic which diverges to positive infinity under the null, and drifts to zero under the alternative. Thence, we propose a randomised test which can be used directly and - building on it - a decision rule to discern between the null and the alternative. The procedure can be applied under very general circumstances: albeit developed for an RCAR model, it can be used in the case of a standard AR(1) model, without requiring any modifications or prior knowledge. Also, the test works (again with no modification or prior knowledge being required) in the presence of infinite variance, and in general requires minimal assumptions on the existence of moments.
    Keywords: Random Coefficient AutoRegression, Stationarity, Unit Root, Heavy Tails, Randomised Tests.
    URL: http://d.repec.org/n?u=RePEc:not:notgts:18/02&r=ecm
  14. By: Carlo Fezzi; Luca Mosetti
    Abstract: Electricity price forecasting models are typically estimated via rolling windows, i.e. by using only the most recent observations. Nonetheless, the current literature does not provide much guidance on how to select the size of such windows. This paper shows that determining the appropriate window prior to estimation dramatically improves forecasting performances. In addition, it proposes a simple two-step approach to choose the best performing models and window sizes. The value of this methodology is illustrated by analyzing hourly datasets from two large power markets with a selection of ten different forecasting models. Incidentally, our empirical application reveals that simple models, such as the linear regression, can perform surprisingly well if estimated on extremely short samples.
    Keywords: electricity price forecasting, day-ahead market, parameter instability, bandwidth selection, artificial neural networks
    JEL: C22 C45 C51 C53 Q47
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:trn:utwprg:2018/10&r=ecm
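    The two-step idea of validating candidate estimation-window lengths out of sample before committing to one can be sketched as follows. This is a toy mean forecaster on simulated data with a structural break, not the paper's setup of ten models on hourly market data:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    # Series with a recent break in the mean: the distant past is uninformative,
    # so long estimation windows should forecast poorly.
    series = np.concatenate([rng.normal(50.0, 1.0, 600), rng.normal(80.0, 1.0, 200)])

    def backtest_mae(series, window, n_test=100):
        """Mean absolute error of one-step forecasts from the mean of the last `window` points."""
        errors = []
        for t in range(len(series) - n_test, len(series)):
            forecast = series[t - window:t].mean()
            errors.append(abs(series[t] - forecast))
        return float(np.mean(errors))

    # Step 1: backtest each candidate window on a validation period.
    candidates = [25, 100, 400]
    maes = {w: backtest_mae(series, w) for w in candidates}
    # Step 2: commit to the best-performing window for subsequent forecasting.
    best_window = min(maes, key=maes.get)
    ```

    The longest window mixes pre- and post-break observations and loses badly here, mirroring the paper's finding that simple models estimated on short samples can perform surprisingly well under parameter instability.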
  15. By: Kukacka, Jiri; Jang, Tae-Seok; Sacht, Stephen
    Abstract: In this paper, we introduce the simulated maximum likelihood method for identifying behavioral heuristics of heterogeneous agents in the baseline three-equation New Keynesian model. The method is extended to multivariate macroeconomic optimization problems, and the estimation procedure is applied to empirical data sets. This approach considerably relaxes restrictive theoretical assumptions and enables a novel estimation of the intensity of choice parameter in discrete choice. In Monte Carlo simulations, we analyze the properties and behavior of the estimation method, which provides important information on the behavioral parameters of the New Keynesian model. However, the curse of dimensionality arises via a consistent downward bias for idiosyncratic shocks. Our empirical results show that the forward-looking version of both the behavioral and the rational model specifications exhibits good performance. We identify potential sources of misspecification for the hybrid version. A novel feature of our analysis is that we pin down the switching parameter for the intensity of choice for the Euro Area and US economy.
    Keywords: Behavioral Heuristics, Intensity of Choice, Monte Carlo Simulations, New-Keynesian Model, Simulated Maximum Likelihood
    JEL: C53 D83 E12 E32
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:zbw:cauewp:201811&r=ecm
  16. By: Antonio Pacifico
    Abstract: The paper develops empirical implementations of the standard time-varying Panel Bayesian VAR model to deal with confounding and latent effects. Bayesian computations and mixed hierarchical distributions are used to generate posteriors of conditional impulse responses and conditional forecasts. An empirical application to Eurozone countries illustrates the functioning of the model. A survey on policy recommendations and business cycle convergence is also conducted. The paper extends recent studies evaluating idiosyncratic business cycles, policy-making, and structural spillover forecasting. The analysis confirms the importance of separating common shocks from the propagation of country- and variable-specific shocks.
    Keywords: Hierarchical Mixture Distributions in Normal Linear Model; Bayesian Model Averaging; Panel VAR; Forecasting; Structural Spillovers; MCMC Implementations.
    JEL: A2 D1 D2
    Date: 2018–12–15
    URL: http://d.repec.org/n?u=RePEc:eei:rpaper:eeri_rp_2018_15&r=ecm
  17. By: Tan, Fei
    Abstract: This article is concerned with frequency-domain analysis of dynamic linear models under the hypothesis of rational expectations. We develop a unified framework for conveniently solving and estimating these models. Unlike existing strategies, our starting point is to obtain the model solution entirely in the frequency domain. This solution method is applicable to a wide class of models and permits straightforward construction of the spectral density for performing likelihood-based inference. To cope with potential model uncertainty, we also generalize the well-known spectral decomposition of the Gaussian likelihood function to a composite version implied by several competing models. Taken together, these techniques yield fresh insights into the model’s theoretical and empirical implications beyond what conventional time-domain approaches can offer. We illustrate the proposed framework using a prototypical new Keynesian model with fiscal details and two distinct monetary-fiscal policy regimes. The model is simple enough to deliver an analytical solution that makes the policy effects transparent under each regime, yet still able to shed light on the empirical interactions between U.S. monetary and fiscal policies along different frequencies.
    Keywords: solution method, analytic function, Bayesian inference, spectral density, monetary and fiscal policy
    JEL: C32 C51 C52 C65 E63 H63
    Date: 2018–10–20
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:90487&r=ecm
  18. By: Tue Gorgens; Allan H. Würtz
    Abstract: This note considers the estimation of dynamic threshold regression models with fixed effects using short panel data. We examine a two-step method, where the threshold parameter is estimated nonparametrically at the N-rate and the remaining parameters are estimated by GMM at the √N-rate. We provide simulation results that illustrate the potential advantages of the new method in comparison with pure GMM estimation. The simulations also highlight the importance of the choice of instruments in GMM estimation.
    JEL: C23 C24 C26
    Date: 2018–04
    URL: http://d.repec.org/n?u=RePEc:acb:cbeeco:2018-665&r=ecm
  19. By: Pablo Montero-Manso; George Athanasopoulos; Rob J Hyndman; Thiyanga S Talagala
    Abstract: We propose an automated method for obtaining weighted forecast combinations using time series features. The proposed approach involves two phases. First, we use a collection of time series to train a meta-model to assign weights to various possible forecasting methods with the goal of minimizing the average forecasting loss obtained from a weighted forecast combination. The inputs to the meta-model are features extracted from each series. In the second phase, we forecast new series using a weighted forecast combination where the weights are obtained from our previously trained meta-model. Our method outperforms a simple forecast combination, and outperforms all of the most popular individual methods in the time series forecasting literature. The approach achieved second position in the M4 competition.
    Keywords: time series feature, forecast combination, XGBoost, M4 competition, meta-learning.
    JEL: C10 C14 C22
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2018-19&r=ecm
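    A heavily simplified stand-in for the FFORMA pipeline: extract one feature per training series (lag-1 autocorrelation) and fit a two-parameter logistic meta-model, in place of the paper's XGBoost, mapping that feature into combination weights over two base forecasters. Every name and modelling choice below is illustrative, not the authors':

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def feature(s):
        """Single time-series feature: lag-1 autocorrelation."""
        s = s - s.mean()
        return (s[:-1] @ s[1:]) / (s @ s)

    def base_forecasts(s):
        """Two base methods: naive (last value) and overall mean."""
        return np.array([s[-1], s.mean()])

    # Training collection: random walks (naive should win) and white noise (mean should win).
    train = [np.cumsum(rng.standard_normal(300)) for _ in range(100)] + \
            [rng.standard_normal(300) for _ in range(100)]

    # For each series, hold out the last point and record each method's squared error.
    feats, losses = [], []
    for s in train:
        feats.append(feature(s[:-1]))
        losses.append((base_forecasts(s[:-1]) - s[-1]) ** 2)
    feats, losses = np.array(feats), np.array(losses)

    def avg_loss(a, b):
        """Average combination loss when the weight on naive is logistic(a + b * feature)."""
        w = 1.0 / (1.0 + np.exp(-(a + b * feats)))
        return np.mean(w * losses[:, 0] + (1.0 - w) * losses[:, 1])

    # "Train the meta-model": grid-search the two logistic parameters.
    grid = np.linspace(-10.0, 10.0, 41)
    a_hat, b_hat = min(((a, b) for a in grid for b in grid), key=lambda p: avg_loss(*p))
    ```

    For a new series the combined forecast is w·naive + (1-w)·mean with w = logistic(a_hat + b_hat·feature); a positive b_hat means the meta-model leans on the naive forecast for persistent series, which is the weight-learning idea the paper scales up with many features, many base methods and gradient boosting.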
  20. By: Tingting Cheng; Jiti Gao; Yayi Yan
    Abstract: In this paper, we propose a state-varying endogenous regime switching model (the SERS model), which includes the endogenous regime switching model by Chang et al. (2017), the CCP model, as a special case. To estimate the unknown parameters involved in the SERS model, we propose a maximum likelihood estimation method. Monte Carlo simulation results show that in the absence of state-varying endogeneity, the SERS model and the CCP model have similar performance, while in the presence of state-varying endogeneity, the SERS model performs much better than the CCP model. Finally, we use the SERS model to analyze the China stock market returns and our empirical results show that there exists strong state-varying endogeneity in volatility switching for the Shanghai Composite Index returns. Moreover, the SERS model can indeed produce a much more realistic assessment for the regime switching process than the one obtained by the CCP model.
    Keywords: latent factor, maximum likelihood estimation, Markov chain, regime switching models, state-varying endogeneity.
    JEL: C22 C32
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2018-9&r=ecm
  21. By: Lajos Horvath; Lorenzo Trapani
    Abstract: We propose a test to discern between an ordinary autoregressive model, and a random coefficient one. To this end, we develop a full-fledged estimation theory for the variances of the idiosyncratic innovation and of the random coefficient, based on a two-stage WLS approach. Our results hold irrespective of whether the series is stationary or nonstationary, and, as an immediate result, they afford the construction of a test for "relevant" randomness. Further, building on these results, we develop a randomised test statistic for the null that the coefficient is non-random, as opposed to the alternative of a standard RCA(1) model. Monte Carlo evidence shows that the test has the correct size and very good power for all cases considered. MSC 2010 subject classifications: Primary 62G10, 62H25; secondary 62M10.
    Keywords: Random Coefficient AutoRegression, WLS estimator, randomised test.
    URL: http://d.repec.org/n?u=RePEc:not:notgts:18/03&r=ecm
  22. By: Loann D. Desboulets (Aix-Marseille Univ., CNRS, EHESS, Centrale Marseille, AMSE); Costin Protopopescu (Aix-Marseille Univ., CNRS, EHESS, Centrale Marseille, AMSE)
    Abstract: We propose a new correlation measure for functionally correlated variables based on local linear dependence. It is able to detect non-linear, non-monotonic and even implicit relationships. By applying the classical linear correlation in a local framework, combined with tools from Principal Components Analysis, the statistic is capable of detecting very complex dependences among the data. In the first part we prove that it meets the properties of independence, similarity invariance and dependence and the axiom of continuity. In the second part we run a numerical simulation over a variety of dependences and compare it to other dependence measures in the literature. The results indicate that we outperform existing coefficients. We also show better stability and robustness to noise.
    Keywords: local correlation, Pearson coefficient, PCA, non-parametric statistic, implicit dependence, non-monotonic, non-linear
    Date: 2018–11
    URL: http://d.repec.org/n?u=RePEc:aim:wpaimx:1853&r=ecm
  23. By: García Suaza, Andrés Felipe; Delgado González, Miguel Ángel
    Abstract: We propose standardization techniques for the duration distribution in a population with respect to another taken as standard using right censored data, which forms a basis for counterfactual comparisons between distributional features of interest. Alternative standardizations are based on either a proportional hazard semiparametric specification or a nonparametric specification of the underlying conditional distribution. Applications to the restricted mean survival time and the hazard rate are discussed in detail. The proposal is applied to the counterfactual analysis of gender gaps in spells of unemployment duration in Spain between 2004 and 2007. The behavior in small samples is investigated using Monte Carlo experiments.
    Keywords: Gender gaps; Spells of unemployment; RMST; Kaplan-Meier; Proportional hazard; Standardization; Right Censoring
    JEL: J64 C41 C24 C14
    Date: 2018–12
    URL: http://d.repec.org/n?u=RePEc:cte:werepe:27821&r=ecm
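    The Kaplan-Meier estimator named in the keywords is the nonparametric building block for such counterfactual duration comparisons under right censoring. A minimal product-limit sketch (the spell data are invented for illustration):

    ```python
    import numpy as np

    def kaplan_meier(time, event):
        """Product-limit survival curve; event=1 if the spell ended, 0 if right-censored."""
        order = np.lexsort((-event, time))   # sort by time, events before censorings at ties
        time, event = time[order], event[order]
        at_risk = len(time) - np.arange(len(time))   # spells still at risk at each point
        surv = np.cumprod(1.0 - event / at_risk)     # multiply conditional survival factors
        return time, surv

    # Invented unemployment spells (months); 0 marks spells still ongoing at the survey date
    t = np.array([2.0, 3.0, 3.0, 5.0, 8.0, 8.0, 9.0, 12.0])
    d = np.array([1, 1, 0, 1, 1, 0, 1, 1])
    times, surv = kaplan_meier(t, d)
    ```

    Censored spells leave the risk set without forcing the curve down, which is exactly the property that makes right-censored gender-gap comparisons feasible.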
  24. By: Mario Forni; Luca Gambetti; Luca Sala
    Abstract: The testing procedure suggested in Canova and Sahneh (2018) is essentially the same as the one proposed in Forni and Gambetti (2014), the only difference being the use of the Geweke, Meese and Dent (1983) version of the Sims (1972) test in place of a standard Granger causality test. The two procedures produce similar results, both for small and large samples, and perform remarkably well in detecting non-fundamentalness. Neither method has anything to do with the problem of aggregation. A “structural aggregate model” does not exist.
    Keywords: Non-fundamentalness, Granger causality, aggregation, structural VAR
    JEL: C32 E32
    Date: 2018–11
    URL: http://d.repec.org/n?u=RePEc:mod:recent:139&r=ecm
  25. By: Qihui Chen; Zheng Fang
    Abstract: This paper develops a general framework for conducting inference on the rank of an unknown matrix $\Pi_0$. A defining feature of our setup is the null hypothesis of the form $\mathrm H_0: \mathrm{rank}(\Pi_0)\le r$. The problem is of first-order importance because the previous literature focuses on $\mathrm H_0': \mathrm{rank}(\Pi_0)= r$ by implicitly assuming away $\mathrm{rank}(\Pi_0)<r$.
    Date: 2018–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1812.02337&r=ecm
  26. By: Christopher J. Ruhm
    Abstract: This paper examines potential tradeoffs between research methods in answering important questions versus providing more cleanly identified estimates on problems that are potentially of lesser interest. The strengths and limitations of experimental and quasi-experimental methods are discussed and it is postulated that confidence in the results obtained may sometimes be overvalued compared to the importance of the topics addressed. The consequences of this are modeled and several suggestions are provided regarding possible steps to encourage greater focus on questions of fundamental importance.
    JEL: A11 B4 C50 H0 I0 J0 O0
    Date: 2018–11
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:25320&r=ecm
  27. By: Henry, Miguel; Mittelhammer, Ron; Loomis, John
    Abstract: This study applies an information-theoretic econometric approach in the form of a new maximum likelihood-minimum power divergence (ML-MPD) semi-parametric binary response estimator to analyze dichotomous contingent valuation data. The ML-MPD method estimates the underlying behavioral decision process leading to a person’s willingness to pay for river recreation site attributes. Empirical choice probabilities, willingness-to-pay measures for recreation site attributes, and marginal effects of changes in some explanatory variables are estimated. For comparison purposes, a Logit model is also implemented. A Wald test rejects the symmetric logistic distribution underlying the Logit model at the 0.01 level in favor of the ML-MPD distribution. Moreover, based on several goodness-of-fit measures, we find that the ML-MPD is superior to the Logit model. Our results also demonstrate the potential for substantially overstating the precision of estimates and associated inferences when the imposition of unknown structural information is not explicitly accounted for in the model. The ML-MPD model provides more intuitively reasonable and defensible results regarding the valuation of river recreation than the Logit model.
    Keywords: Minimum power divergence, contingent valuation, binary response models, information theoretic econometrics, river recreation
    JEL: C14 C5 Q5
    Date: 2018–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:89842&r=ecm
  28. By: Sung Jae Jun; Sokbae Lee
    Abstract: We set up an econometric model of persuasion and study identification of key parameters under various scenarios of data availability. We find that a commonly used measure of persuasion does not estimate the persuasion rate of any population in general. We provide formal identification results, recommend several new parameters to estimate, and discuss their interpretation. Further, we propose efficient estimators of the two central parameters. We revisit two strands of the empirical literature on persuasion to show that the persuasive effect is highly heterogeneous and studies based on binary instruments provide limited information about the average persuasion rate in a heterogeneous population.
    Date: 2018–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1812.02276&r=ecm
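The "commonly used measure" the abstract critiques is the DellaVigna-Kaplan persuasion rate; a sketch of that measure, with the no-persuasion share proxied by the control mean as is common (the function name and defaults are illustrative, not from the paper):

```python
def persuasion_rate(y_t, y_c, e_t, e_c, y0=None):
    """DellaVigna-Kaplan style persuasion rate, in percent.

    y_t, y_c : outcome shares in treatment / control groups
    e_t, e_c : exposure shares in treatment / control groups
    y0       : share who would adopt absent persuasion (proxied by y_c if None)
    """
    if y0 is None:
        y0 = y_c
    return 100.0 * (y_t - y_c) / (e_t - e_c) / (1.0 - y0)
```

With full exposure in treatment only (e_t = 1, e_c = 0), a 5-point outcome gap over a 50% baseline gives a 10% persuasion rate; the paper's point is that this scalar need not equal the average persuasion rate of any population when effects are heterogeneous.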
  29. By: Loann D. Desboulets (Aix-Marseille Univ., CNRS, EHESS, Centrale Marseille, AMSE)
    Abstract: In this paper we investigate Multivariate GARCH models to assess the co-movements between the stock prices of American firms listed on the main markets and fundamentals. Co-movements can be seen as correlations. The latter are usually estimated via standard GARCH models such as the Dynamic Conditional Correlation model (Engle, 2002) or the Baba-Engle-Kraft-Kroner model (Baba et al., 1990). Nevertheless, more flexible models such as the Orthogonal GARCH of Alexander (2001) can be used as well. We also introduce a new Semi-parametric Orthogonal GARCH as a natural non-linear extension of the Orthogonal GARCH. A Monte Carlo simulation is conducted to evaluate the finite-sample performance of each model before applying them to the data. Empirical results show evidence that during crises, prices are less correlated with fundamentals than in normal periods.
    Keywords: non-parametric, Multivariate GARCH, dynamic correlation, PCA
    Date: 2017–10
    URL: http://d.repec.org/n?u=RePEc:aim:wpaimx:1851&r=ecm
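The Orthogonal GARCH idea (a PCA rotation followed by univariate GARCH(1,1) filters on each component) can be sketched for the two-asset case, where the eigendecomposition is available in closed form; all parameter values below are illustrative assumptions, not estimates from the paper:

```python
import math

def garch11_variance(eps, omega, alpha, beta):
    """GARCH(1,1) conditional-variance recursion, initialized at the sample variance."""
    h = [sum(e * e for e in eps) / len(eps)]
    for t in range(1, len(eps)):
        h.append(omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1])
    return h

def ogarch2(returns, omega=0.05, alpha=0.05, beta=0.9):
    """Two-asset Orthogonal GARCH sketch: PCA, then GARCH(1,1) per component.

    returns : list of (r1, r2) pairs. Assumes a nonzero sample covariance.
    Returns per-period (var1, cov12, var2) triples.
    """
    x = [r[0] for r in returns]
    y = [r[1] for r in returns]
    n = len(returns)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((v - mx) ** 2 for v in x) / n
    c = sum((v - my) ** 2 for v in y) / n
    b = sum((u - mx) * (v - my) for u, v in zip(x, y)) / n
    # closed-form eigendecomposition of the 2x2 covariance matrix
    disc = math.sqrt((a - c) ** 2 + 4 * b * b)
    lam1 = (a + c + disc) / 2
    norm = math.hypot(b, lam1 - a)
    v1 = (b / norm, (lam1 - a) / norm)
    v2 = (-v1[1], v1[0])  # orthogonal complement
    # project demeaned returns onto the principal components
    f1 = [(u - mx) * v1[0] + (v - my) * v1[1] for u, v in zip(x, y)]
    f2 = [(u - mx) * v2[0] + (v - my) * v2[1] for u, v in zip(x, y)]
    h1 = garch11_variance(f1, omega, alpha, beta)
    h2 = garch11_variance(f2, omega, alpha, beta)
    # reconstruct H_t = V diag(h1_t, h2_t) V'
    return [(v1[0] ** 2 * h1[t] + v2[0] ** 2 * h2[t],
             v1[0] * v1[1] * h1[t] + v2[0] * v2[1] * h2[t],
             v1[1] ** 2 * h1[t] + v2[1] ** 2 * h2[t]) for t in range(n)]
```

Because the components are driven by independent univariate recursions, the implied conditional correlation varies over time even though each filter is standard GARCH(1,1).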
  30. By: Loann D. Desboulets (Aix-Marseille Univ., CNRS, EHESS, Centrale Marseille, AMSE)
    Abstract: In this paper, we investigate 39 variable selection procedures to give practitioners an overview of the existing literature. "Let the data speak for themselves" has become the motto of many applied researchers as the amount of available data has grown significantly. The search for data-driven theories has long motivated automatic model selection. However, while great extensions have been made on the theoretical side, most empirical work still relies on basic procedures such as stepwise regression. Reviews of variable selection are already available in the literature, but they always focus on a specific topic such as linear regression, groups of variables or smoothly varying coefficients. Here we provide a review of the main methods and state-of-the-art extensions, as well as a typology of them, over a wide range of model structures (linear, grouped, additive, partially linear and non-parametric). We explain which methods to use for different model purposes and what the key differences among them are. We also review two methods for improving variable selection in the general sense.
    Keywords: variable selection, automatic modelling, sparse models
    JEL: C50 C59
    Date: 2018–05
    URL: http://d.repec.org/n?u=RePEc:aim:wpaimx:1852&r=ecm
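Stepwise regression, the "basic procedure" the abstract mentions, can be sketched greedily; this matching-pursuit-style forward selection is a simplified stand-in for illustration, not any specific method reviewed in the paper:

```python
def simple_slope(x, r):
    """Slope of a no-intercept simple regression of r on x (inputs assumed centered)."""
    sxx = sum(v * v for v in x)
    return sum(u * v for u, v in zip(x, r)) / sxx if sxx else 0.0

def forward_select(X, y, k):
    """Greedy forward selection: repeatedly add the regressor that cuts RSS most.

    X : list of candidate regressors (each a centered list of values)
    y : centered response
    k : maximum number of variables to include
    Returns the indices of selected variables in order of inclusion.
    """
    resid = list(y)
    selected = []
    for _ in range(k):
        best_j, best_gain = None, 0.0
        for j, xj in enumerate(X):
            if j in selected:
                continue
            b = simple_slope(xj, resid)
            gain = b * sum(u * v for u, v in zip(xj, resid))  # RSS drop = Sxy^2/Sxx
            if gain > best_gain:
                best_j, best_gain = j, gain
        if best_j is None:  # no remaining variable reduces RSS
            break
        b = simple_slope(X[best_j], resid)
        resid = [r - b * v for r, v in zip(resid, X[best_j])]
        selected.append(best_j)
    return selected
```

Exact stepwise OLS would refit the full model at every step; this residual-update variant keeps the sketch short while preserving the greedy-inclusion logic.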
  31. By: Gunnhildur H. Steinbakk; Alex Lenkoski; Ragnar Bang Huseby; Anders Løland; Tor Arne Øigård
    Abstract: Accurate forecasts of electricity spot prices are essential to the daily operational and planning decisions made by power producers and distributors. Typically, point forecasts of these quantities suffice, particularly in the Nord Pool market where the large quantity of hydro power leads to price stability. However, when situations become irregular, deviations on the price scale can often be extreme and difficult to pinpoint precisely, which is a result of the highly varying marginal costs of generating facilities at the edges of the load curve. In these situations it is useful to supplant a point forecast of price with a distributional forecast, in particular one whose tails are adaptive to the current production regime. This work outlines a methodology for leveraging published bid/ask information from the Nord Pool market to construct such adaptive predictive distributions. Our methodology is a non-standard application of the concept of error-dressing, which couples a feature driven error distribution in volume space with a non-linear transformation via the published bid/ask curves to obtain highly non-symmetric, adaptive price distributions. Using data from the Nord Pool market, we show that our method outperforms more standard forms of distributional modeling. We further show how such distributions can be used to render `warning systems' that issue reliable probabilities of prices exceeding various important thresholds.
    Date: 2018–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1812.02433&r=ecm
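The error-dressing idea (volume-space forecast errors pushed through a monotone bid/ask curve to obtain asymmetric price distributions) can be sketched as follows; the piecewise-linear curve, quantile levels and function names are illustrative assumptions, not the paper's implementation:

```python
import bisect

def make_curve(volumes, prices):
    """Monotone piecewise-linear price-as-a-function-of-volume curve."""
    def price_at(v):
        if v <= volumes[0]:
            return prices[0]
        if v >= volumes[-1]:
            return prices[-1]
        i = bisect.bisect_right(volumes, v) - 1
        w = (v - volumes[i]) / (volumes[i + 1] - volumes[i])
        return prices[i] + w * (prices[i + 1] - prices[i])
    return price_at

def dressed_price_quantiles(curve, vol_forecast, vol_errors, qs=(0.05, 0.5, 0.95)):
    """Push volume-space errors through the curve; read off empirical price quantiles."""
    sims = sorted(curve(vol_forecast + e) for e in vol_errors)
    n = len(sims)
    return [sims[min(int(q * n), n - 1)] for q in qs]
```

Because the curve is flat in the interior and steep at the edge of the load curve, symmetric volume errors map into a heavily right-skewed price distribution, which is exactly the adaptivity the abstract describes.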
  32. By: Deepankar Basu (Department of Economics, University of Massachusetts - Amherst)
    Abstract: In this paper I discuss three issues related to bias of OLS estimators in a general multivariate setting. First, I discuss the bias that arises from omitting relevant variables. I offer a geometric interpretation of such bias and derive su
    Keywords: omitted variable; irrelevant variables; ordinary least squares; bias
    JEL: C20
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:ums:papers:2018-19&r=ecm
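The omitted-variable bias the abstract analyzes has the classical form: the short-regression slope converges to b1 + b2*delta, where delta is the slope from the auxiliary regression of the omitted variable on the included one. A small simulation (with illustrative numbers, not from the paper) confirms the formula:

```python
import random

def ols_slope(x, y):
    """Simple-regression OLS slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)

random.seed(0)
n = 20000
b1, b2 = 1.0, 2.0
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [0.8 * a + random.gauss(0, 0.6) for a in x1]  # omitted regressor, correlated with x1
y = [b1 * a + b2 * b + random.gauss(0, 1) for a, b in zip(x1, x2)]

short = ols_slope(x1, y)   # regression that omits x2: biased for b1
delta = ols_slope(x1, x2)  # auxiliary regression of the omitted on the included
# classical result: short is approximately b1 + b2*delta, not b1
```

Here the true coefficient on x1 is 1.0, but the short regression recovers roughly 1 + 2(0.8) = 2.6, in line with the bias formula.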

This nep-ecm issue is ©2018 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.