nep-ecm New Economics Papers
on Econometrics
Issue of 2014‒06‒07
nine papers chosen by
Sune Karlsson
Orebro University

  1. Semiparametric Localized Bandwidth Selection in Kernel Density Estimation By Tingting Cheng; Jiti Gao; Xibin Zhang
  2. On the Finite Sample Properties of Pre-test Estimators of Spatial Models By Gianfranco Piras; Ingmar R. Prucha
  3. Factor High-Frequency Based Volatility (HEAVY) Models By Kevin Sheppard
  4. Partial Mean Processes with Generated Regressors: Continuous Treatment Effects and Nonseparable Models By Ying-Ying Lee
  5. Semiparametric Model Selection in Panel Data Models with Deterministic Trends and Cross-Sectional Dependence By Jia Chen; Jiti Gao
  6. Factor Vector Autoregressive Estimation of Heteroskedastic Persistent and Non Persistent Processes Subject to Structural Breaks By Claudio Morana
  7. Estimation of the Global Minimum Variance Portfolio in High Dimensions By Taras Bodnar; Nestor Parolya; Wolfgang Schmid
  8. The Misspecification of Expectations in New Keynesian Models: A DSGE-VAR Approach By Stephen Cole; Fabio Milani
  9. Sensitivity of Value at Risk Estimation to Non-Normality of Returns and Market Capitalization By Sinha, Pankaj; Agnihotri, Shalini

  1. By: Tingting Cheng; Jiti Gao; Xibin Zhang
    Abstract: Since conventional cross-validation bandwidth selection methods do not work when the data are serially dependent, alternative bandwidth selection methods are needed. In recent years, Bayesian-based global bandwidth selection methods have been proposed. Our experience shows, however, that a global bandwidth is less suitable than a localized bandwidth for kernel density estimation when the data are serially dependent. A difficult issue is how to consistently estimate a localized bandwidth. In this paper, we propose a semiparametric estimation method and establish an asymptotic theory for the proposed estimator. A by-product of this bandwidth estimate is a new sampling-based likelihood approach to hyperparameter estimation. Monte Carlo simulation studies show that the proposed hyperparameter estimation method works very well, and that the proposed bandwidth estimator outperforms its competitors. Applications of the new bandwidth estimator to the kernel density estimation of the Eurodollar deposit rate, as well as the S&P 500 daily return under conditional heteroscedasticity, demonstrate the effectiveness and competitiveness of the proposed semiparametric localized bandwidth.
    Keywords: hyperparameter estimation; likelihood score; localized bandwidth.
    JEL: C13 C14 C21
    Date: 2014
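The contrast between a global and a localized bandwidth can be sketched with a simple variable-bandwidth kernel density estimator. The sketch below uses Abramson-style pilot-based bandwidths, not the paper's semiparametric Bayesian estimator; the function names and tuning constants are illustrative assumptions.

```python
import numpy as np

def gaussian_kde(grid, data, h):
    """Gaussian KDE; h may be a scalar (global bandwidth) or an
    array with one bandwidth per observation (localized bandwidth)."""
    h = np.broadcast_to(np.asarray(h, dtype=float), data.shape)
    u = (grid[:, None] - data[None, :]) / h[None, :]
    kernels = np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)
    return (kernels / h[None, :]).mean(axis=1)

rng = np.random.default_rng(0)
data = rng.standard_normal(400)
grid = np.linspace(-4.0, 4.0, 401)

h0 = 0.4                                    # global bandwidth
f_global = gaussian_kde(grid, data, h0)

# Abramson-style localized bandwidths: wider where a pilot density is low
pilot = gaussian_kde(data, data, h0)
h_local = h0 * np.sqrt(np.exp(np.mean(np.log(pilot))) / pilot)
f_local = gaussian_kde(grid, data, h_local)
```

The localized version widens the bandwidth in sparse regions (the tails) and narrows it where data are plentiful, which is the basic motivation for moving beyond a single global bandwidth.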
  2. By: Gianfranco Piras (Regional Research Institute, West Virginia University); Ingmar R. Prucha (Department of Economics, University of Maryland)
    Abstract: This paper explores the properties of pre-test strategies in estimating a linear Cliff-Ord-type spatial model when the researcher is unsure about the nature of the spatial dependence. More specifically, the paper explores the finite sample properties of the pre-test estimators introduced in Florax et al. (2003), which are based on Lagrange Multiplier (LM) tests, within the context of a Monte Carlo study. The performance of those estimators is compared with that of the maximum likelihood (ML) estimator of the encompassing model. We find that, even in a very simple setting, the bias of the estimates generated by pre-testing strategies can be very large in some cases and the empirical size of tests can differ substantially from the nominal size. This is in contrast to the ML estimator.
    Keywords: Cliff-Ord spatial model, Lagrange Multiplier, Monte Carlo
    JEL: C4 C5
    Date: 2013–07
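The bias phenomenon the paper documents can be illustrated outside the spatial setting with a toy Monte Carlo: an estimator that drops a regressor whenever a pre-test fails to reject inherits omitted-variable bias that the always-full-model estimator avoids. All design numbers below are illustrative assumptions, not the paper's spatial LM setup.

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 100, 2000
beta1, beta2 = 1.0, 0.2           # beta2 small but nonzero: pre-test often misses it
rho = 0.8                         # correlation between the two regressors

full, pretest = [], []
for _ in range(reps):
    x1 = rng.standard_normal(n)
    x2 = rho * x1 + np.sqrt(1 - rho ** 2) * rng.standard_normal(n)
    y = beta1 * x1 + beta2 * x2 + rng.standard_normal(n)

    X = np.column_stack([x1, x2])
    b, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    s2 = res[0] / (n - 2)                       # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)
    t2 = b[1] / np.sqrt(cov[1, 1])              # t-statistic on beta2

    full.append(b[0])
    if abs(t2) > 1.96:                          # keep x2 only if significant
        pretest.append(b[0])
    else:                                       # else re-estimate the short model
        pretest.append((x1 @ y) / (x1 @ x1))

bias_full = np.mean(full) - beta1
bias_pretest = np.mean(pretest) - beta1
```

The full-model estimate of beta1 is centered on the truth, while the pre-test estimate carries a positive bias because the short model is usually (and wrongly) selected.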
  3. By: Kevin Sheppard
    Abstract: We propose a new class of multivariate volatility models utilizing realized measures of asset volatility and covolatility extracted from high-frequency data. Dimension reduction for estimation of large covariance matrices is achieved by imposing a factor structure with time-varying conditional factor loadings. Statistical properties of the model, including conditions that ensure covariance stationarity of returns, are established. The model is applied to the conditional covariance of returns of large U.S. financial institutions during the financial crisis, where empirical results show that the new model has both superior in- and out-of-sample properties. We show that the superior performance applies to a wide range of quantities of interest, including volatilities, covolatilities, betas and scenario-based risk measures, where the model's performance is particularly strong at short forecast horizons.
    Keywords: Conditional Beta, Conditional Covariance, Forecasting, HEAVY, Marginal Expected Shortfall, Realized Covariance, Realized Kernel, Systematic Risk
    JEL: C32 C53 C58 G17 G21
    Date: 2014–05–30
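The realized measures that HEAVY-type models take as inputs can be computed directly from high-frequency data. A minimal sketch of the standard realized covariance estimator (the sum of outer products of intraday return vectors) follows; the paper's factor structure and time-varying loadings are not reproduced here, and the sample dimensions are illustrative.

```python
import numpy as np

def realized_covariance(intraday_returns):
    """Daily realized covariance: the sum over intraday intervals of the
    outer products r_t r_t' of the vector of asset returns."""
    R = np.asarray(intraday_returns)        # shape (intervals, assets)
    return R.T @ R                          # equivalent to sum of outer products

rng = np.random.default_rng(2)
R = rng.standard_normal((78, 3)) * 0.001    # e.g. 78 five-minute returns, 3 assets
RC = realized_covariance(R)
```

By construction the estimate is symmetric and positive semidefinite, which is what makes it usable as a covariance input to a multivariate volatility model.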
  4. By: Ying-Ying Lee
    Abstract: Partial mean processes with generated regressors arise in several important econometric problems, such as the distribution of potential outcomes with continuous treatments and the quantile structural function in a nonseparable triangular model. This paper proposes a fully nonparametric estimator for the partial mean process, where the second step consists of a kernel regression on regressors that are estimated in the first step. The main contribution is a uniform expansion that characterizes in detail how the estimation error associated with the generated regressor affects the limiting distribution of the marginal integration estimator. The general results are illustrated with three examples: control variables in triangular models (Newey, Powell, and Vella, 1999; Imbens and Newey, 2009), the generalized propensity score for a continuous treatment (Hirano and Imbens, 2004), and the propensity score for sample selection (Das, Newey, and Vella, 2003).
    Keywords: Continuous treatment, partial means, nonseparable models, generated regressors, control function
    JEL: C13 C14 C31
    Date: 2014–05–13
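A stripped-down sketch of the two-step logic: a first stage produces a generated control variable, and a second-stage kernel regression is averaged (marginally integrated) over it at a fixed treatment level. The Nadaraya-Watson smoother, the known first-stage coefficient, and the bandwidth are all simplifying assumptions; the paper's estimator and its uniform expansion are considerably more general.

```python
import numpy as np

def nw(y, X, x0, h):
    """Nadaraya-Watson regression of y on X, evaluated at the point x0."""
    w = np.exp(-0.5 * np.sum(((X - x0) / h) ** 2, axis=1))
    return np.sum(w * y) / np.sum(w)

rng = np.random.default_rng(3)
n = 500
z = rng.standard_normal(n)                  # instrument
u = rng.standard_normal(n)
t = 0.7 * z + u                             # continuous, endogenous treatment
v_hat = t - 0.7 * z                         # generated regressor (control variable);
                                            # in practice 0.7 is itself estimated
y = t + v_hat + 0.5 * rng.standard_normal(n)

def partial_mean(t0, h=0.3):
    """Average the 2-d regression over the generated regressor at t = t0."""
    X = np.column_stack([t, v_hat])
    return np.mean([nw(y, X, np.array([t0, v]), h) for v in v_hat])
```

Because the partial mean averages the conditional regression over the control variable, it recovers the dose-response function (approximately t0 here), even though a naive regression of y on t alone would be biased by the endogeneity of the treatment.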
  5. By: Jia Chen; Jiti Gao
    Abstract: In this paper, we consider a model selection issue in semiparametric panel data models with fixed effects. The modelling framework under investigation can accommodate both nonlinear deterministic trends and cross-sectional dependence. We consider the so-called “large panels” where both the time series and cross-sectional sizes are very large. A penalised profile least squares method with first-stage local linear smoothing is developed to select the significant covariates and estimate the regression coefficients simultaneously. The convergence rate and the oracle property of the resulting semiparametric estimator are established by the joint limit approach. The developed semiparametric model selection methodology is illustrated by two Monte Carlo simulation studies, where we compare the performance in model selection and estimation of three penalties, i.e., the least absolute shrinkage and selection operator (LASSO), the smoothly clipped absolute deviation (SCAD), and the minimax concave penalty (MCP).
    Keywords: Cross-sectional dependence, fixed effects, large panel, local linear fitting, penalty function, profile likelihood, semiparametric regression.
    JEL: C13 C14 C23
    Date: 2014
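Of the three penalties compared, the LASSO is the simplest to sketch. The cyclic coordinate-descent solver below handles a plain linear model; the paper's penalised profile least squares method additionally involves fixed effects and first-stage local linear smoothing, which are omitted here, and all design constants are illustrative.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=500):
    """LASSO via cyclic coordinate descent with soft-thresholding."""
    n, p = X.shape
    beta = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            resid = y - X @ beta + X[:, j] * beta[j]   # partial residual
            z = X[:, j] @ resid / n
            beta[j] = np.sign(z) * max(abs(z) - lam, 0.0) / col_ss[j]
    return beta

rng = np.random.default_rng(4)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:2] = [2.0, -1.5]                 # only two covariates are significant
y = X @ beta_true + 0.5 * rng.standard_normal(n)
beta_hat = lasso_cd(X, y, lam=0.1)
```

The soft-thresholding update sets small coefficients to exactly zero, which is how the penalty performs variable selection and estimation simultaneously.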
  6. By: Claudio Morana
    Abstract: In the paper, a general framework for large-scale modeling of macroeconomic and financial time series is introduced. The proposed approach is simple to implement, performs well independently of the persistence and heteroskedasticity properties of the data, and accounts for common deterministic and stochastic factors. Monte Carlo results strongly support the proposed methodology, validating its use also for relatively small cross-sectional and temporal samples.
    Keywords: long and short memory, structural breaks, common factors, principal components analysis, fractionally integrated heteroskedastic factor vector autoregressive model
    JEL: C22
    Date: 2014–05
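The common-factor step of such a framework can be sketched with principal components: extract factors from the panel, then fit a VAR(1) to them by OLS. This is a generic principal-components-plus-VAR illustration on simple simulated data, not the paper's fractionally integrated heteroskedastic specification.

```python
import numpy as np

def pc_factors(X, k):
    """First k principal-component factors of a (T x N) panel."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :k] * s[:k]                 # (T x k) estimated factors

def var1(F):
    """OLS estimate of A in F_t = A F_{t-1} + e_t."""
    Y, Z = F[1:], F[:-1]
    A, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    return A.T

rng = np.random.default_rng(5)
T, N, k = 300, 20, 2
f = np.zeros((T, k))
for t in range(1, T):                       # persistent common factors
    f[t] = 0.8 * f[t - 1] + rng.standard_normal(k)
L = rng.standard_normal((N, k))
X = f @ L.T + rng.standard_normal((T, N))   # panel = factors x loadings + noise
F_hat = pc_factors(X, k)
A_hat = var1(F_hat)
```

The estimated autoregressive matrix should be stationary (eigenvalues inside the unit circle) when the underlying factors are, which is the persistence property the VAR step is meant to capture.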
  7. By: Taras Bodnar; Nestor Parolya; Wolfgang Schmid
    Abstract: We estimate the global minimum variance (GMV) portfolio in the high-dimensional case using results from random matrix theory. This approach leads to a shrinkage-type estimator which is distribution-free and optimal in the sense of minimizing the out-of-sample variance. Its asymptotic properties are investigated assuming that the number of assets $p$ depends on the sample size $n$ such that $\frac{p}{n}\rightarrow c\in (0,+\infty)$ as $n$ tends to infinity. The results are obtained under weak assumptions on the distribution of the asset returns; only the existence of fourth moments is required. Furthermore, we make no assumption on the upper bound of the spectrum of the covariance matrix. As a result, the theoretical findings are also valid if the dependencies between the asset returns are described by a factor model, which is very popular in the financial literature nowadays. This is also well documented in a numerical study where the small- and large-sample behavior of the derived estimator is compared with existing estimators of the GMV portfolio. The resulting estimator shows significant improvements and turns out to be robust to deviations from normality.
    Date: 2014–06
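For reference, the GMV weights solve w = S^{-1}1 / (1' S^{-1}1) for a covariance matrix S. The sketch below combines this formula with a simple linear shrinkage of the sample covariance toward a scaled identity; the fixed shrinkage intensity alpha is an illustrative assumption, whereas the paper derives a distribution-free optimal intensity for the p/n -> c regime.

```python
import numpy as np

def gmv_weights(sigma):
    """Global minimum variance weights: w = S^{-1} 1 / (1' S^{-1} 1)."""
    w = np.linalg.solve(sigma, np.ones(sigma.shape[0]))
    return w / w.sum()

def shrunk_covariance(returns, alpha):
    """Convex combination of the sample covariance and a scaled identity,
    a simple stand-in for an optimally chosen shrinkage intensity."""
    S = np.cov(returns, rowvar=False)
    mu = np.trace(S) / S.shape[0]
    return (1.0 - alpha) * S + alpha * mu * np.eye(S.shape[0])

rng = np.random.default_rng(6)
returns = rng.standard_normal((120, 50)) * 0.02   # n = 120 obs, p = 50 assets
w = gmv_weights(shrunk_covariance(returns, alpha=0.5))
```

Shrinkage keeps the covariance matrix well-conditioned when p is of the same order as n, which is exactly when the plug-in sample-covariance GMV weights become unreliable.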
  8. By: Stephen Cole (Department of Economics, University of California-Irvine); Fabio Milani (Department of Economics, University of California-Irvine)
    Abstract: This paper tests the ability of popular New Keynesian models, which are traditionally used to study monetary policy and business cycles, to match the data regarding a key channel for monetary transmission: the dynamic interactions between macroeconomic variables and their corresponding expectations. In the empirical analysis, we exploit direct data on expectations from surveys. To explain the joint evolution of realized variables and expectations, we adopt a DSGE-VAR approach, which allows us to estimate all models in the continuum between the extremes of an unrestricted VAR, on one side, and a DSGE model in which the cross-equation restrictions are dogmatically imposed, on the other side. Moreover, the DSGE-VAR approach allows us to assess the extent, as well as the main sources, of misspecification in the model. The paper's results illustrate the failure of New Keynesian models under the rational expectations hypothesis to account for the dynamic interactions between observed macroeconomic expectations and macroeconomic realizations. Confirming previous studies, DSGE restrictions prove valuable when the New Keynesian model is exempted from matching observed expectations. But when the model is required to match data on expectations, it can do so only by moving away from, and hence substantially rejecting, the DSGE restrictions. Finally, we investigate alternative models of expectations formation, including examples of extrapolative and heterogeneous expectations, and show that they can go some way toward reconciling the New Keynesian model with the data. Intermediate DSGE-VAR models, which avail themselves of DSGE prior restrictions, turn out to fit the data better than the unrestricted VAR. Hence, the results overall point to misspecification in the expectations formation side of the DSGE model, more than in the structural microfounded equations.
    Keywords: Modeling of expectations; DSGE models; Rational expectations; Observed survey expectations; Model misspecification; DSGE-VAR; Heterogeneous expectations
    JEL: C52 D84 E32 E50 E60
    Date: 2014–04
  9. By: Sinha, Pankaj; Agnihotri, Shalini
    Abstract: This paper investigates the sensitivity of VaR models when the return series of stocks and stock indices are not normally distributed. It also studies the effect of the market capitalization of stocks and stock indices on their Value at Risk and Conditional VaR estimates. Three indices of different market capitalization, the S&P BSE Sensex, BSE Mid Cap, and BSE Small Cap, are considered for the recession and post-recession periods. It is observed that VaR violations increase with decreasing market capitalization in both periods. The same effect is also observed for stock portfolios of different market capitalization. Further, we study the relationship between liquidity, represented by the traded volume of stocks, and market risk, measured by the VaR of the firms. The results confirm that a decrease in liquidity increases a firm's Value at Risk.
    Keywords: Non-normality, market capitalization, Value at risk (VaR), CVaR, GARCH
    JEL: C51 C52 C58 G01 G20 G22 G24 G28
    Date: 2014–03–10
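A minimal sketch of the historical VaR and CVaR estimators used in such sensitivity studies: VaR is a loss quantile, and CVaR is the mean loss at or beyond it. Comparing Gaussian against Student-t returns shows why non-normality matters; the simulated data and confidence level are illustrative.

```python
import numpy as np

def var_cvar(returns, level=0.99):
    """Historical VaR as the `level` quantile of losses, and CVaR as the
    average loss at or beyond the VaR."""
    losses = -np.asarray(returns, dtype=float)
    var = np.quantile(losses, level)
    cvar = losses[losses >= var].mean()
    return var, cvar

rng = np.random.default_rng(7)
normal_r = rng.standard_normal(10_000) * 0.01
heavy_r = rng.standard_t(3, 10_000) * 0.01   # fat-tailed returns
var_n, cvar_n = var_cvar(normal_r)
var_t, cvar_t = var_cvar(heavy_r)
```

At the same scale parameter, the fat-tailed t(3) series produces a markedly larger 99% VaR and CVaR than the Gaussian series, which is why VaR models calibrated under normality understate risk for thinly capitalized, fat-tailed return series.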

This nep-ecm issue is ©2014 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.