Operations Research
http://lists.repec.org/mailman/listinfo/nep-ore
2017-05-21
Honest confidence sets in nonparametric IV regression and other ill-posed models
http://d.repec.org/n?u=RePEc:tse:wpaper:31687&r=ore
This paper provides novel methods for inference in a very general class of ill-posed models in econometrics, encompassing nonparametric instrumental regression, various functional regressions, and deconvolution. I focus on uniform confidence sets for the parameter of interest estimated with Tikhonov regularization, as in Darolles, Fan, Florens, and Renault (2011). I first show that it is not possible to develop inferential methods based directly on the uniform central limit theorem. To circumvent this difficulty, I develop two approaches that lead to valid confidence sets, and I characterize their expected diameters and coverage properties uniformly over a large class of models (i.e., the constructed confidence sets are honest). Finally, using Monte Carlo simulations and an application to Engel curves, I illustrate that the introduced confidence sets have reasonable width and coverage properties at sample sizes commonly used in applications.
Babii, Andrii
nonparametric instrumental regression, functional linear regression, density deconvolution, honest uniform confidence sets, non-asymptotic inference, ill-posed models, Tikhonov regularization
2017-05
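The Tikhonov regularization on which the paper's confidence sets are built can be illustrated numerically. The sketch below uses a generic discretized integral operator, not the paper's NPIV estimator; the kernel, grid, noise level, and penalty are illustrative choices, showing only how the penalty stabilizes an ill-posed inversion:

```python
import numpy as np

def tikhonov_solve(K, y, alpha):
    """Tikhonov-regularized solution of K f = y:
    minimizes ||K f - y||^2 + alpha * ||f||^2 via the normal equations."""
    n = K.shape[1]
    return np.linalg.solve(K.T @ K + alpha * np.eye(n), K.T @ y)

# A discretized Gaussian-kernel integral operator: severely ill-conditioned.
n = 50
x = np.linspace(0, 1, n)
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.02) / n
f_true = np.sin(2 * np.pi * x)
rng = np.random.default_rng(0)
y = K @ f_true + 1e-4 * rng.standard_normal(n)

f_naive = np.linalg.lstsq(K, y, rcond=None)[0]  # noise is hugely amplified
f_reg = tikhonov_solve(K, y, alpha=1e-6)        # stable, mildly biased
```

Even tiny noise destroys the unregularized solution, while the penalized inverse trades a small bias for stability; the width of honest confidence sets in such models must account for exactly this bias-variance trade-off.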
Discretizing Nonlinear, Non-Gaussian Markov Processes with Exact Conditional Moments
http://d.repec.org/n?u=RePEc:pra:mprapa:78981&r=ore
Approximating stochastic processes by finite-state Markov chains is useful for reducing computational complexity when solving dynamic economic models. We provide a new method for accurately discretizing general Markov processes by matching low order moments of the conditional distributions using maximum entropy. In contrast to existing methods, our approach is not limited to linear Gaussian autoregressive processes. We apply our method to numerically solve asset pricing models with various underlying stochastic processes for the fundamentals, including a rare disasters model. Our method outperforms the solution accuracy of existing methods by orders of magnitude, while drastically simplifying the solution algorithm. The performance of our method is robust to parameters such as the number of grid points and the persistence of the process.
Farmer, Leland
Toda, Alexis Akira
asset pricing models, duality, Kullback-Leibler information, numerical methods, solution accuracy
2016-11-01
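The moment-matching step can be sketched through the duality the keywords point to: minimizing Kullback-Leibler divergence to a prior distribution subject to moment constraints yields exponentially tilted weights, found by solving an unconstrained convex dual. The grid, prior, and target moments below are illustrative, not taken from the paper:

```python
import numpy as np
from scipy.optimize import minimize

def maxent_weights(grid, q, Tbar):
    """Tilt a prior q on `grid` to match target moments Tbar while staying as
    close as possible to q in KL divergence (maximum entropy).
    Dual: min over lam of sum_i q_i * exp(lam' (T(x_i) - Tbar))."""
    T = np.vstack([grid, grid ** 2])      # moment functions: mean, 2nd moment
    dT = T - Tbar[:, None]

    def dual(lam):
        w = q * np.exp(lam @ dT)
        return w.sum(), dT @ w            # dual value and its exact gradient

    res = minimize(dual, np.zeros(len(Tbar)), jac=True,
                   method="BFGS", tol=1e-10)
    w = q * np.exp(res.x @ dT)
    return w / w.sum()

# Illustrative: 9-point grid, uniform prior, match N(0, 1) mean and variance.
grid = np.linspace(-3, 3, 9)
w = maxent_weights(grid, np.full(9, 1 / 9), np.array([0.0, 1.0]))
```

By dual stationarity, the tilted weights match the targets exactly (up to solver tolerance); in a full discretization this reweighting would be applied row by row to the conditional distributions of the chain.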
The Fiction of Full BEKK
http://d.repec.org/n?u=RePEc:ems:eureir:99514&r=ore
The purpose of the paper is threefold: to show that univariate GARCH is not a special case of multivariate GARCH, specifically the Full BEKK model, except under parametric restrictions on the off-diagonal elements of the random coefficient autoregressive coefficient matrix; to provide the regularity conditions that arise from the underlying random coefficient autoregressive process; and to establish the parametric restrictions under which the (quasi-) maximum likelihood estimates have valid asymptotic properties. The paper discusses the stochastic processes, regularity conditions, and asymptotic properties of univariate and multivariate GARCH models. It is shown that the Full BEKK model, which in practice is estimated almost exclusively, has no underlying stochastic process, regularity conditions, or asymptotic properties.
Chang, C-L.
McAleer, M.J.
Random coefficient stochastic process, Off-diagonal parametric restrictions, Diagonal and Full BEKK, Regularity conditions, Asymptotic properties, Conditional volatility, Univariate and multivariate models
2017-01-15
New Bid-Ask Spread Estimators from Daily High and Low Prices
http://d.repec.org/n?u=RePEc:pra:mprapa:79102&r=ore
In this paper, we introduce two low-frequency bid-ask spread estimators using daily high and low transaction prices. The range of mid-prices is an increasing function of the sampling interval, while the bid-ask spread and the relationship between trading direction and the mid-price are not constrained by it and are therefore independent of the sampling interval. Monte Carlo simulations and data analysis from the equity and foreign exchange markets demonstrate that these models significantly outperform the most widely used low-frequency estimators, such as those proposed in Corwin and Schultz (2012) and, most recently, in Abdi and Ranaldo (2017). We illustrate how our models can be applied to deduce historical market liquidity in the NYSE, UK, Hong Kong, and Thai stock markets. Our estimator can also effectively act as a gauge for market volatility and as a measure of liquidity risk in asset pricing.
Li, Zhiyong
Lambe, Brendan
Adegbite, Emmanuel
High-low spread estimator; effective spread; transaction cost; market liquidity
2017-05
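The Corwin and Schultz (2012) benchmark cited above also infers the spread from daily highs and lows: the high-low range over two days reflects both volatility and spread, but volatility scales with the interval while the spread does not. A minimal sketch of that benchmark (not the authors' new estimators, which the abstract does not specify) with illustrative toy data:

```python
import numpy as np

def corwin_schultz(high, low):
    """Corwin-Schultz (2012) high-low spread estimator.
    Returns the average proportional spread over all overlapping two-day
    windows; negative two-day estimates are set to zero, as is common."""
    h, l = np.log(high), np.log(low)
    beta = (h[:-1] - l[:-1]) ** 2 + (h[1:] - l[1:]) ** 2   # sum of 1-day ranges
    h2 = np.maximum(h[:-1], h[1:])                         # two-day high
    l2 = np.minimum(l[:-1], l[1:])                         # two-day low
    gamma = (h2 - l2) ** 2                                 # two-day range
    denom = 3 - 2 * np.sqrt(2)
    alpha = (np.sqrt(2 * beta) - np.sqrt(beta)) / denom - np.sqrt(gamma / denom)
    spread = 2 * (np.exp(alpha) - 1) / (1 + np.exp(alpha))
    return np.maximum(spread, 0).mean()

# Toy example: a constant 1% daily range around a flat mid-price, so the
# estimator attributes the entire range to the spread.
high = np.full(20, 101.0)
low = np.full(20, 100.0)
s = corwin_schultz(high, low)   # close to 0.01 (1%)
```

With a flat mid-price the one-day and two-day ranges coincide, and the formula attributes the whole range to the spread, recovering roughly 1%.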
Forecasting the Volatility of Nikkei 225 Futures
http://d.repec.org/n?u=RePEc:ems:eureir:99517&r=ore
For forecasting the volatility of futures returns, the paper proposes an indirect method based on the relationship between the futures and the underlying asset, for both the returns and the time-varying volatility. For volatility forecasting, the paper considers a stochastic volatility model with asymmetry and long memory, using high-frequency data for the underlying asset. Empirical results for Nikkei 225 futures indicate that the adjusted R2 supports the appropriateness of the indirect method, and that the new method based on stochastic volatility models with asymmetry and long memory outperforms the forecasting model based on the direct method using the pseudo long time series.
Asai, M.
McAleer, M.J.
Forecasting, Volatility, Futures, Realized Volatility, Realized Kernel, Leverage Effects, Long Memory
2017-01-15
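The realized measures named in the keywords are built from intraday returns; the simplest is the plain realized variance, sketched below (the realized kernel, which additionally corrects for microstructure noise, is more involved; the sampling frequency and volatility here are illustrative):

```python
import numpy as np

def realized_variance(intraday_prices):
    """Daily realized variance: the sum of squared intraday log returns.
    Its square root is the realized volatility."""
    r = np.diff(np.log(intraday_prices))
    return np.sum(r ** 2)

# Simulated 5-minute prices over one trading day (78 intervals), with a true
# per-interval return volatility of 0.001.
rng = np.random.default_rng(1)
p = 100 * np.exp(np.cumsum(0.001 * rng.standard_normal(78)))
prices = np.concatenate(([100.0], p))
rv = realized_variance(prices)   # close to 78 * 0.001**2
```

As the sampling frequency rises, this sum converges to the day's integrated variance, which is why high-frequency data for the underlying asset sharpen the volatility forecasts.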
Directionally Differentiable Econometric Models
http://d.repec.org/n?u=RePEc:yon:wpaper:2017rwp-103&r=ore
The current paper examines the limit distribution of the quasi-maximum likelihood estimator obtained from a directionally differentiable quasi-likelihood function, representing that limit distribution as a functional of a Gaussian stochastic process indexed by direction. In this way, the standard analysis that assumes a differentiable quasi-likelihood function is treated as a special case of our analysis. We also examine and redefine the standard quasi-likelihood ratio, Wald, and Lagrange multiplier test statistics so that their null limit behaviors are regular under our model framework.
Cho, Jin Seo
White, Halbert
directionally differentiable quasi-likelihood function, Gaussian stochastic process, quasi-likelihood ratio test, Wald test, Lagrange multiplier test statistics
2017-04
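The kink that motivates the paper can be seen in miniature with the absolute-value function: one-sided directional derivatives exist in every direction, but they are not linear in the direction, so ordinary differentiability fails:

```python
def directional_derivative(f, x, d, h=1e-7):
    """Numerical one-sided directional derivative:
    lim_{t -> 0+} (f(x + t*d) - f(x)) / t."""
    return (f(x + h * d) - f(x)) / h

f = abs  # f(x) = |x|: directionally differentiable at 0, but not smooth there

# At the kink x = 0 the directional derivative is |d|: it equals 1 in both
# the d = +1 and d = -1 directions, whereas a linear (ordinary) derivative
# would flip sign with the direction.
right = directional_derivative(f, 0.0, 1.0)
left = directional_derivative(f, 0.0, -1.0)
```

When a quasi-likelihood has such a kink at the true parameter, the score-based asymptotics must be indexed by direction, which is why the limit appears as a Gaussian process over directions rather than a single Gaussian vector.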
A New Way to Quantify the Effect of Uncertainty
http://d.repec.org/n?u=RePEc:fip:feddwp:1705&r=ore
This paper develops a new method to quantify the effects of uncertainty using estimates from a nonlinear New Keynesian model. The model includes an occasionally binding zero lower bound constraint on the nominal interest rate, which creates time-varying endogenous uncertainty, and two exogenous types of time-varying uncertainty: a volatility shock to technology growth and a volatility shock to the risk premium. A filtered third-order approximation of the Euler equation shows consumption uncertainty on average reduced consumption by about 0.06%, and the peak effect was 0.15% during the Great Recession. Other higher-order moments such as inflation uncertainty, technology growth uncertainty, consumption skewness, and inflation skewness had smaller effects.
Richter, Alexander
Throckmorton, Nathaniel
Bayesian estimation; uncertainty; stochastic volatility; zero lower bound
2017-05-04
Distribution of residuals in the nonparametric IV model with application to separability testing
http://d.repec.org/n?u=RePEc:tse:wpaper:31686&r=ore
We develop a uniform asymptotic expansion for the empirical distribution function of residuals in the nonparametric IV regression. Such an expansion opens the door to the construction of a broad range of residual-based specification tests in nonparametric IV models. Building on the obtained result, we develop a test for the separability of unobservables in econometric models with endogeneity. The test is based on verifying the independence condition between the residuals of the NPIV estimator and the instrument, and can distinguish between the non-separable and the separable specification under endogeneity.
Babii, Andrii
Florens, Jean-Pierre
separability test, distribution of residuals, nonparametric instrumental regression, Sobolev scales
2017-05
Bagged artificial neural networks in forecasting inflation: An extensive comparison with current modelling frameworks
http://d.repec.org/n?u=RePEc:nbp:nbpmis:262&r=ore
Accurate inflation forecasts lie at the heart of effective monetary policy. Utilizing a thick modelling approach, this paper investigates the out-of-sample quality of short-term Polish headline inflation forecasts generated by a combination of thousands of bagged single hidden-layer feed-forward artificial neural networks in a period of systematically falling and persistently low inflation. Results indicate that the forecasts from this model outperform a battery of popular approaches, especially at longer horizons. During the period of excessive disinflation, it more accurately accounted for the slowly evolving local mean of inflation and remained only mildly biased. Moreover, combining several linear and nonlinear approaches with diverse underlying model assumptions delivers further statistically significant gains in predictive accuracy, outperforming a panel of examined benchmarks at multiple horizons. The robustness analysis shows that forgoing data preprocessing and bootstrap aggregating severely compromises the forecasting ability of the model.
Karol Szafranek
inflation forecasting, artificial neural networks, principal components, bootstrap aggregating, forecast combination
2017
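Bootstrap aggregating of single hidden-layer networks can be sketched compactly. The simplification below fixes random hidden weights and fits only the output layer by least squares (the paper trains full networks, and the data here are a toy signal, not inflation):

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_one_net(X, y, hidden=20):
    """One single hidden-layer feed-forward net. Hidden weights are random
    and fixed (a simplification); the output layer is fit by least squares.
    Returns a prediction function."""
    W = rng.standard_normal((X.shape[1], hidden))
    b = rng.standard_normal(hidden)
    beta = np.linalg.lstsq(np.tanh(X @ W + b), y, rcond=None)[0]
    return lambda Xn: np.tanh(Xn @ W + b) @ beta

def bagged_forecast(X, y, X_new, n_models=50):
    """Bootstrap aggregating: fit each net on a bootstrap resample of the
    data and average the resulting forecasts."""
    preds = []
    for _ in range(n_models):
        idx = rng.integers(0, len(y), len(y))   # bootstrap resample
        preds.append(fit_one_net(X[idx], y[idx])(X_new))
    return np.mean(preds, axis=0)

# Toy example: recover a smooth nonlinear signal from noisy observations.
X = np.linspace(-2, 2, 200)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
yhat = bagged_forecast(X, y, X)
```

Averaging over resamples reduces the variance of the individual, unstable networks, which is the mechanism behind the robustness finding that dropping bagging compromises the model.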
Fat Tails and Spurious Estimation of Consumption-Based Asset Pricing Models
http://d.repec.org/n?u=RePEc:pra:mprapa:78980&r=ore
The standard generalized method of moments (GMM) estimation of Euler equations in heterogeneous-agent consumption-based asset pricing models is inconsistent under fat tails because the GMM criterion is asymptotically random. To illustrate this, we generate asset returns and consumption data from an incomplete-market dynamic general equilibrium model that is analytically solvable and exhibits power laws in consumption. Monte Carlo experiments suggest that the standard GMM estimation is inconsistent and susceptible to Type II errors (incorrect non-rejection of false models). Estimating an overidentified model by dividing agents into age cohorts appears to mitigate Type I and II errors.
Toda, Alexis Akira
Walsh, Kieran James
consumption-based CAPM, generalized method of moments, heterogeneous-agent model, power law
2016-11-17
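The Euler-equation GMM setup the abstract refers to can be sketched in its simplest just-identified form. The CRRA moment condition, the simulated lognormal data, and all parameter values below are illustrative inventions for the sketch, not the paper's heterogeneous-agent model:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def gmm_crra(growth, returns, beta=0.99):
    """Estimate CRRA risk aversion gamma from the Euler equation
    E[beta * g^(-gamma) * R] = 1 by minimizing the squared sample moment
    (just-identified case, identity weighting)."""
    def crit(gamma):
        m = beta * growth ** (-gamma) * returns - 1.0
        return m.mean() ** 2
    return minimize_scalar(crit, bounds=(0.1, 20.0), method="bounded").x

# Simulated data consistent with gamma = 2: returns are the inverse SDF
# times mean-one lognormal noise, so the moment holds at the true gamma.
rng = np.random.default_rng(2)
g = np.exp(0.02 + 0.02 * rng.standard_normal(5000))
R = g ** 2.0 / 0.99 * np.exp(0.05 * rng.standard_normal(5000) - 0.05 ** 2 / 2)
gamma_hat = gmm_crra(g, R)   # close to 2
```

Under thin tails this sample criterion concentrates around its population counterpart and the minimizer is consistent; the paper's point is that under fat-tailed consumption growth the criterion itself stays random in the limit, so this seemingly well-behaved procedure breaks down.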
Huggett Economies with Multiple Stationary Equilibria
http://d.repec.org/n?u=RePEc:pra:mprapa:78984&r=ore
I obtain a closed-form solution to a Huggett economy with CARA utility when the vector of individual state variables follows a VAR(1) process with an arbitrary shock distribution. The stationary equilibrium is unique if the income process is AR(1), but not necessarily so otherwise. With Gaussian shocks, I provide general sufficient conditions for the existence of at least three equilibria when the income process is either ARMA(1,1), AR(2), or has a persistent-transitory (PT) representation with negatively correlated shocks. The possibility of multiple equilibria calls for caution in comparative statics exercises and policy analyses using heterogeneous-agent models.
Toda, Alexis Akira
CARA utility, income fluctuation problem, persistent-transitory representation
2017-03-13