
New Economics Papers on Econometrics
By:  Christoph Breunig
Abstract:  This paper proposes a test for missing at random (MAR). The MAR assumption is shown to be testable given instrumental variables which are independent of response given potential outcomes. A nonparametric testing procedure based on integrated squared distance is proposed. The statistic's asymptotic distribution under the MAR hypothesis is derived. We demonstrate that our results can be easily extended to a test of missing completely at random (MCAR) and missing completely at random conditional on covariates X (MCAR(X)). A Monte Carlo study examines the finite sample performance of our test statistic. An empirical illustration concerns out-of-pocket prescription drug spending with missing values; we reject MCAR but fail to reject MAR. 
Keywords:  Incomplete data, missing-data mechanism, selection model, nonparametric hypothesis testing, consistent testing, instrumental variable, series estimation. 
JEL:  C12 C14 
Date:  2015–03 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2015016&r=ecm 
By:  Marian Vavra (National Bank of Slovakia, Research Department) 
Abstract:  This paper considers the problem of testing for normality of the marginal law of univariate and multivariate stationary and weakly dependent random processes using a bootstrap-based Anderson-Darling test statistic. The finite-sample properties of the test are assessed via Monte Carlo experiments. An application to inflation forecast errors is also presented. 
Keywords:  testing for normality; Anderson-Darling statistic; sieve bootstrap; weak dependence 
JEL:  C12 C15 C32 
Date:  2015–03 
URL:  http://d.repec.org/n?u=RePEc:svk:wpaper:1031&r=ecm 
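The sieve-bootstrap idea behind such a test can be sketched as follows: fit a finite-order AR approximation to the dependent series, resample its centred residuals, rebuild pseudo-series, and recompute the Anderson-Darling statistic on each replicate. This is an illustrative reading of the procedure, not the paper's exact algorithm; the function name and the least-squares AR fit are my own choices.

```python
import numpy as np
from scipy import stats

def sieve_bootstrap_ad_test(x, p=2, n_boot=200, seed=0):
    """Sketch of a sieve-bootstrap Anderson-Darling normality test for a
    weakly dependent series (illustrative, not the paper's exact procedure)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Fit an AR(p) "sieve" by least squares: regress x[t] on x[t-1..t-p].
    X = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
    y = x[p:]
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    resid = resid - resid.mean()          # centre the residuals
    ad_obs = stats.anderson(x).statistic  # observed AD statistic
    ad_boot = np.empty(n_boot)
    for b in range(n_boot):
        # Rebuild a pseudo-series from resampled residuals and the AR fit.
        e = rng.choice(resid, size=n + p, replace=True)
        xb = np.zeros(n + p)
        for t in range(p, n + p):
            xb[t] = beta[0] + beta[1:] @ xb[t - p:t][::-1] + e[t]
        ad_boot[b] = stats.anderson(xb[p:]).statistic
    return ad_obs, (ad_boot >= ad_obs).mean()  # statistic, bootstrap p-value
```

The bootstrap p-value is simply the fraction of replicate statistics at least as large as the observed one.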
By:  Bauwens, Luc (Université catholique de Louvain, CORE, Belgium); Carpantier, Jean-François (CREA, University of Luxembourg); Dufays, Arnaud (Université catholique de Louvain, CORE, Belgium) 
Abstract:  Markov-switching models are usually specified under the assumption that all the parameters change when a regime switch occurs. Relaxing this hypothesis and being able to detect which parameters evolve over time is relevant for interpreting the changes in the dynamics of the series and for specifying models parsimoniously, and it may be helpful in forecasting. We propose the class of sticky infinite hidden Markov-switching autoregressive moving average models, in which we disentangle the break dynamics of the mean and the variance parameters. In this class, the number of regimes is possibly infinite and is determined when estimating the model, thus avoiding the need to set this number by a model choice criterion. We develop a new Markov chain Monte Carlo estimation method that solves the path dependence issue due to the moving average component. Empirical results on macroeconomic series illustrate that the proposed class of models dominates the model with fixed parameters in terms of point and density forecasts. 
Keywords:  ARMA, Bayesian inference, Dirichlet process, Forecasting, Markov-switching 
JEL:  C11 C15 C22 C53 C58 
Date:  2015–02–13 
URL:  http://d.repec.org/n?u=RePEc:cor:louvco:2015007&r=ecm 
By:  Masamune Iwasawa (Graduate School of Economics, Kyoto University, Research Fellow of Japan Society for the Promotion of Science) 
Abstract:  Estimation results obtained by parametric models may be seriously misleading when the model is misspecified or poorly approximates the true model. This study proposes two tests that jointly test the specifications of multiple response probabilities in unordered multinomial choice models. Both test statistics are asymptotically chi-square distributed, consistent against a fixed alternative, and able to detect a local alternative approaching the null at a rate slower than the parametric rate. We show that rejection regions can be calculated by a simple parametric bootstrap procedure when the sample size is small. The size and power of the tests are investigated by Monte Carlo experiments. 
Date:  2015–03 
URL:  http://d.repec.org/n?u=RePEc:kyo:wpaper:919&r=ecm 
By:  Javier Hidalgo; Marcia M Schafgans 
Abstract:  This paper is concerned with various issues related to inference in large dynamic panel data models (where both n and T increase without bound) in the presence of, possibly, strong cross-sectional dependence. Our first aim is to provide a Central Limit Theorem for estimators of the slope parameters of the model under mild conditions. To that end, we extend and modify existing results available in the literature. Our second aim is to study two, albeit similar, tests for breaks/homogeneity in the time dimension. The first test is based on the CUSUM principle, whereas the second test is based on a Hausman-Durbin-Wu approach. Some of the key features of the tests are that they have nontrivial power when the number of individuals for which the slope parameters may differ is a "negligible" fraction, or when the break happens to be towards the end of the sample. Because the asymptotic distribution of the tests may not provide a good approximation for their finite sample distribution, we describe a simple bootstrap algorithm to obtain (asymptotically) valid critical values for our statistics. An important and surprising feature of the bootstrap is that there is no need to know the underlying model of the cross-sectional dependence; hence the bootstrap does not require selecting any bandwidth parameter for its implementation, as is the case with moving-block bootstrap methods, which may not be valid with cross-sectional dependence and may depend on the particular ordering of the individuals. Finally, we present a Monte Carlo simulation analysis to shed some light on the small sample behaviour of the tests and their bootstrap analogues. 
Keywords:  Large panel data, dynamic models, cross-sectional strong dependence, central limit theorems, homogeneity, bootstrap algorithms 
JEL:  C12 C13 C23 
Date:  2015–04 
URL:  http://d.repec.org/n?u=RePEc:cep:stiecm:/2015/583&r=ecm 
By:  Yuki Ikeda (Graduate School of Economics, The University of Tokyo); Tatsuya Kubokawa (Faculty of Economics, The University of Tokyo); Muni S. Srivastava (Department of Statistics, University of Toronto) 
Abstract:  The problem of estimating the large covariance matrix of both normal and nonnormal distributions is addressed. In convex combinations of the sample covariance matrix and the identity matrix multiplied by a scalar statistic, we suggest a new estimator of the optimal weight based on exact or approximately unbiased estimators of the numerator and denominator of the optimal weight in nonnormal cases. It is also demonstrated that the estimators given in the literature have second-order biases. It is numerically shown that the proposed estimator has a good risk performance. 
Date:  2015–03 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2015cf970&r=ecm 
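A minimal sketch of the convex-combination idea — the sample covariance shrunk toward a scaled identity — using a Ledoit-Wolf-style plug-in weight as a stand-in for the paper's bias-corrected weight, which is not reproduced here:

```python
import numpy as np

def shrinkage_covariance(X):
    """Convex combination of the sample covariance and a scaled identity,
    with a Ledoit-Wolf-style plug-in weight (a stand-in for the paper's
    exact/approximately unbiased weight estimators under nonnormality)."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / n                        # sample covariance
    mu = np.trace(S) / p                     # scale of the identity target
    d2 = np.sum((S - mu * np.eye(p)) ** 2)   # squared distance to the target
    # Average squared deviation of per-observation outer products from S.
    b2 = sum(np.sum((np.outer(x, x) - S) ** 2) for x in Xc) / n**2
    w = min(b2 / d2, 1.0) if d2 > 0 else 1.0  # shrinkage weight in [0, 1]
    return (1 - w) * S + w * mu * np.eye(p)
```

Even when n < p (a singular sample covariance), the shrunk estimate is positive definite as long as the weight is strictly positive.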
By:  Yu, Chao; Fang, Yue; Zhao, Xujie; Zhang, Bo 
Abstract:  This paper considers the problem of estimating spot volatility in the simultaneous presence of Lévy jumps and market microstructure noise. We propose to use the pre-averaging approach and the threshold kernel-based method to construct a spot volatility estimator, which is robust to both microstructure noise and jumps of either finite or infinite activity. The estimator is consistent and asymptotically normal, with a fast convergence rate. Our estimator is general enough to include many existing kernel-based estimators as special cases. When the kernel bandwidth is fixed, our estimator leads to widely used estimators of integrated volatility. Monte Carlo simulations show that our estimator works very well. 
Keywords:  high-frequency data, spot volatility, Lévy jump, kernel estimation, microstructure noise, pre-averaging 
JEL:  C13 C58 
Date:  2013–03–23 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:63293&r=ecm 
By:  Peter Robinson; Francesca Rossi 
Abstract:  In a panel data model with fixed effects, possible cross-sectional dependence is investigated in a spatial autoregressive setting. An Edgeworth expansion is developed for the maximum likelihood estimate of the spatial correlation coefficient. The expansion is used to develop interval estimates for the coefficient, and tests for cross-sectional independence, that are more accurate and have better size properties than the corresponding rules of statistical inference based on first-order asymptotic theory. Comparisons of finite sample performance are carried out using Monte Carlo simulations. 
Keywords:  Panel data; Fixed effects; Spatial autoregression; Edgeworth expansion; Interval estimates; Tests for cross-sectional independence 
JEL:  C12 C21 C31 
Date:  2015–03 
URL:  http://d.repec.org/n?u=RePEc:ehl:lserod:61432&r=ecm 
By:  Xiaohong Chen (Cowles Foundation, Yale University); Timothy Christensen (Dept. of Economics, Yale University) 
Abstract:  This paper makes several contributions to the literature on the important yet difficult problem of estimating functions nonparametrically using instrumental variables. First, we derive the minimax optimal sup-norm convergence rates for nonparametric instrumental variables (NPIV) estimation of the structural function h_0 and its derivatives. Second, we show that a computationally simple sieve NPIV estimator can attain the optimal sup-norm rates for h_0 and its derivatives when h_0 is approximated via a spline or wavelet sieve. Our optimal sup-norm rates surprisingly coincide with the optimal L^2-norm rates for severely ill-posed problems, and are only up to a [log(n)]^epsilon (with epsilon < 1/2) factor slower than the optimal L^2-norm rates for mildly ill-posed problems. Third, we introduce a novel data-driven procedure for choosing the sieve dimension optimally. Our data-driven procedure is sup-norm rate-adaptive: the resulting estimators of h_0 and its derivatives converge at their optimal sup-norm rates even though the smoothness of h_0 and the degree of ill-posedness of the NPIV model are unknown. Finally, we present two nontrivial applications of the sup-norm rates to inference on nonlinear functionals of h_0 under low-level conditions. The first is to derive the asymptotic normality of sieve t-statistics for exact consumer surplus and deadweight loss functionals in nonparametric demand estimation when prices, and possibly incomes, are endogenous. The second is to establish the validity of a sieve score bootstrap for constructing asymptotically exact uniform confidence bands for collections of nonlinear functionals of h_0. Both applications provide new and useful tools for empirical research on nonparametric models with endogeneity. 
Keywords:  Ill-posed inverse problems, Series 2SLS, Optimal sup-norm convergence rates, Adaptive estimation, Random matrices, Bootstrap uniform confidence bands, Nonlinear welfare functionals, Nonparametric demand analysis with endogeneity 
JEL:  C13 C14 C32 
Date:  2013–11 
URL:  http://d.repec.org/n?u=RePEc:cwl:cwldpp:1923r&r=ecm 
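Computationally, the sieve NPIV estimator the paper studies is a series 2SLS. A sketch with power-series bases standing in for the paper's spline/wavelet sieves (the function name and basis choice are illustrative, not the authors' code):

```python
import numpy as np

def sieve_npiv(y, x, w, J=5, K=7):
    """Series 2SLS sketch of a sieve NPIV estimator: approximate h_0 by a
    polynomial sieve in x of dimension J, instrumented by a polynomial sieve
    in w of dimension K >= J (illustrative; the paper uses splines/wavelets
    and a data-driven choice of J)."""
    psi = np.vander(x, J, increasing=True)   # basis for the endogenous regressor
    b = np.vander(w, K, increasing=True)     # basis for the instrument
    Pb = b @ np.linalg.pinv(b.T @ b) @ b.T   # projection onto the instrument space
    coef = np.linalg.pinv(psi.T @ Pb @ psi) @ (psi.T @ Pb @ y)
    # Return the estimated function h_hat evaluated on new points.
    return lambda xq: np.vander(np.asarray(xq, float), J, increasing=True) @ coef
```

In the exogenous special case (instrument equal to the regressor) the projection is the identity on the sieve space and the estimator reduces to series least squares.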
By:  Helmut Lütkepohl; Aleksei Netsunajev 
Abstract:  A growing literature uses changes in residual volatility for identifying structural shocks in vector autoregressive (VAR) analysis. A number of different models for heteroskedasticity or conditional heteroskedasticity are proposed and used in applications in this context. This study reviews the different volatility models and points out their advantages and drawbacks. It thereby enables researchers wishing to use identification of structural VAR models via heteroskedasticity to make a more informed choice of a suitable model for a specific empirical analysis. An application investigating the interaction between U.S. monetary policy and the stock market is used to illustrate the related issues. 
Keywords:  Structural vector autoregression, identification via heteroskedasticity, conditional heteroskedasticity, smooth transition, Markov switching, GARCH 
JEL:  C32 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1464&r=ecm 
By:  Van Bellegem, Sébastien (Université catholique de Louvain, CORE, Belgium); Florens, JeanPierre (Toulouse School of Economics) 
Abstract:  In an increasing number of empirical studies, the dimensionality, measured e.g. as the size of the parameter space of interest, can be very large. Two instances of large-dimensional models are the linear regression with a large number of covariates and the estimation of a regression function with many instrumental variables. An appropriate setting in which to analyze high-dimensional problems is provided by a functional linear model, in which the covariates belong to Hilbert spaces. This paper considers the case where covariates are endogenous and assumes the existence of instrumental variables (that are functional as well). The paper shows that estimating the regression function is a linear ill-posed inverse problem, with a known but data-dependent operator. The first contribution is to analyze the rate of convergence of the penalized least squares estimator. Based on this result, we discuss the notion of “instrument strength” in the high-dimensional setting. We also discuss a generalized version of the estimator, when the problem is premultiplied by an instrument-dependent operator. This extends the technology of the Generalized Method of Moments to high-dimensional, functional data. A central limit theorem is also established on the inner product of the estimator. The studied estimators are easy and fast to implement, and their finite-sample performance is discussed through simulations and an application to the impact of age-specific fertility rate curves on yearly economic growth in the United Kingdom. 
Keywords:  inventory routing, valid inequalities, cutting planes 
JEL:  C26 C14 
Date:  2014–09–30 
URL:  http://d.repec.org/n?u=RePEc:cor:louvco:2014056&r=ecm 
By:  Jeffrey S. Racine 
Abstract:  Local polynomial regression is extremely popular in applied settings. Recent developments in shape-constrained nonparametric regression allow practitioners to impose constraints on local polynomial estimators, thereby ensuring that the resulting estimates are consistent with underlying theory. However, it turns out that local polynomial derivative estimates may fail to coincide with the analytic derivative of the local polynomial regression estimate, which can be problematic, particularly in the context of shape-constrained estimation. In such cases practitioners might prefer to use analytic derivatives along the lines of those proposed in the local constant setting by Rilstone & Ullah (1989). Demonstrations and applications are considered. 
Keywords:  nonparametric, smoothing, constrained estimation 
Date:  2015–03 
URL:  http://d.repec.org/n?u=RePEc:mcm:deptwp:201502&r=ecm 
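For intuition, the analytic-derivative idea in the local constant (Nadaraya-Watson) setting amounts to differentiating the kernel estimator itself via the quotient rule. A sketch with a Gaussian kernel — the function is a hypothetical illustration, not Racine's code:

```python
import numpy as np

def nw_and_analytic_derivative(x0, X, Y, h):
    """Nadaraya-Watson estimate m(x0) and its analytic derivative m'(x0),
    obtained by applying the quotient rule to the kernel estimator with a
    Gaussian kernel (in the spirit of Rilstone & Ullah, 1989)."""
    u = (x0 - X) / h
    k = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)   # kernel weights K_h(x0 - X_i)
    dk = -u * k / h                                # d/dx0 of each kernel weight
    num, den = np.sum(k * Y), np.sum(k)
    dnum, dden = np.sum(dk * Y), np.sum(dk)
    m = num / den
    return m, (dnum * den - num * dden) / den**2   # analytic m'(x0)
```

On a dense symmetric design with a linear response, the analytic derivative recovers the true slope away from the boundary.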
By:  Jisu Yoon (GeorgAugustUniversity Göttingen); Tatyana Krivobokova (GeorgAugustUniversity Göttingen) 
Abstract:  This paper reviews various treatments of nonmetric variables in Partial Least Squares (PLS) and Principal Component Analysis (PCA) algorithms. The performance of the different treatments is compared in an extensive simulation study under several typical data generating processes, and recommendations are made. An application of PLS and PCA algorithms with nonmetric variables to the generation of a wealth index is considered. 
Keywords:  Principal Component Analysis; PCA; Partial Least Squares; PLS; nonmetric variables; simulation; wealth index 
JEL:  C15 C43 R20 
Date:  2015–03–27 
URL:  http://d.repec.org/n?u=RePEc:got:gotcrc:172&r=ecm 
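One common treatment of nonmetric variables in PCA — dummy coding followed by extraction of the first principal component — can be sketched as a wealth-index generator. This is an illustrative implementation of one candidate treatment, not the authors' code:

```python
import numpy as np

def wealth_index(categorical_cols):
    """First-principal-component wealth index from nonmetric (categorical)
    asset indicators, handled here by dummy coding — one of the treatments
    such comparisons typically include."""
    blocks = []
    for col in categorical_cols:
        col = np.asarray(col)
        levels = np.unique(col)
        # Dummy-code each categorical column, dropping the first level.
        for lev in levels[1:]:
            blocks.append((col == lev).astype(float))
    Z = np.column_stack(blocks)
    Z = (Z - Z.mean(axis=0)) / Z.std(axis=0)       # standardize the dummies
    # First principal component via SVD of the standardized dummy matrix.
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[0]                               # index score per household
```

Because the dummy matrix is centred before projection, the resulting index has mean zero by construction.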
By:  Seisho Sato (Faculty of Economics, The University of Tokyo); Naoto Kunitomo (Faculty of Economics, The University of Tokyo) 
Abstract:  For estimating the integrated volatility by using high frequency data, Kunitomo and Sato (2008, 2011, 2013) have proposed the Separating Information Maximum Likelihood (SIML) method when there are micro-market noises. The SIML estimator has reasonable finite sample properties and asymptotic properties when the sample size is large under reasonable conditions. We show that the SIML estimator is robust in the sense that it is consistent and has stable convergence (i.e. asymptotic normality in the deterministic case) when there are round-off errors and micro-market price adjustments and noises for the underlying (continuous time) stochastic process. The SIML estimation also has reasonable finite sample properties with these effects and dominates existing methods such as the realized kernel method and the pre-averaging method in some situations. 
Date:  2015–03 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2015cf964&r=ecm 
By:  Hafner, Christian M. (Université catholique de Louvain, CORE, Belgium); Breitung, Jörg (University of Cologne) 
Abstract:  Nowcasting the volatility of financial time series appears difficult with classical volatility models. This paper proposes a simple model, based on an ARMA representation of the log-transformed squared returns, that allows one to estimate current volatility, given past and current returns, in a very simple way. The model can be viewed as a degenerate case of the stochastic volatility model with perfect correlation between the two error terms. It is shown that the volatility nowcasts do not depend on this correlation, so that both models provide the same nowcasts for given parameter values. A simulation study suggests that the ARMA and SV models have similar performance, but that in cases of moderate persistence the ARMA model is preferable. An extension of the ARMA model is proposed that takes into account the so-called leverage effect. Finally, the alternative models are applied to a long series of daily S&P 500 returns. 
Keywords:  EGARCH, stochastic volatility, ARMA, realized volatility 
JEL:  C22 C58 
Date:  2014–11–19 
URL:  http://d.repec.org/n?u=RePEc:cor:louvco:2014060&r=ecm 
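The nowcasting recipe can be caricatured in a few lines: fit a simple autoregression to the log squared returns and map the fitted values back to the volatility scale. The AR(1) fit below is a deliberate simplification of the paper's ARMA representation, and the lognormal bias correction is ignored:

```python
import numpy as np

def volatility_nowcast(returns, floor=1e-12):
    """Nowcast sigma_t from an AR(1) least-squares fit to log squared
    returns — a simplified stand-in for the paper's ARMA representation
    of log r_t^2 (no bias correction applied)."""
    y = np.log(np.asarray(returns, float) ** 2 + floor)  # floor avoids log(0)
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    fitted = X @ beta                        # conditional mean of log r_t^2
    # Map back to a volatility scale (ignoring the lognormal bias term).
    return np.exp(0.5 * np.concatenate([[y[0]], fitted]))
```

The output is one strictly positive volatility nowcast per observation.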
By:  Heiner Mikosch (KOF Swiss Economic Institute, ETH Zurich, Switzerland); Stefan Neuwirth (KOF Swiss Economic Institute, ETH Zurich, Switzerland) 
Abstract:  This paper presents a MIDAS-type mixed-frequency VAR forecasting model. First, we propose a general and compact mixed-frequency VAR framework using a stacked-vector approach. Second, we integrate the mixed-frequency VAR with a MIDAS-type Almon lag polynomial scheme, which is designed to reduce the parameter space while keeping models flexible. We show how to recast the resulting nonlinear MIDAS-type mixed-frequency VAR into a linear equation system that can be easily estimated. A pseudo out-of-sample forecasting exercise with US real-time data shows that the mixed-frequency VAR substantially improves predictive accuracy over a standard VAR for different VAR specifications. Forecast errors for, e.g., GDP growth decrease by 30 to 60 percent for forecast horizons of up to six months and by around 20 percent for a forecast horizon of one year. 
Keywords:  Forecasting, mixed frequency data, MIDAS, VAR, real time 
JEL:  C53 E27 
Date:  2015–04 
URL:  http://d.repec.org/n?u=RePEc:kof:wpskof:15377&r=ecm 
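The parameter-space reduction works by generating all lag weights from a low-dimensional polynomial. A standard MIDAS device is the two-parameter exponential Almon form sketched below; the exact functional form used in the paper may differ:

```python
import numpy as np

def almon_weights(theta, n_lags):
    """Exponential Almon lag polynomial used in MIDAS schemes: two
    hyperparameters generate a full set of normalized lag weights
    (illustrative; the paper's exact parameterization may differ)."""
    k = np.arange(1, n_lags + 1)
    w = np.exp(theta[0] * k + theta[1] * k**2)  # unnormalized weights
    return w / w.sum()                          # weights sum to one
```

However many lags enter the model, only the two entries of `theta` need to be estimated.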
By:  Bonanno, Graziella; De Giovanni, Domenico; Domma, Filippo 
Abstract:  In this paper, we study the so-called “wrong skewness” anomaly in Stochastic Frontiers (SF), which consists in the observed difference between the expected and the estimated sign of the asymmetry of the composite error. We propose a more general and flexible specification of the SF model, introducing dependence between the two error components and asymmetry (positive or negative) of the random error. This respecification allows us to decompose the third moment of the composite error into three components, namely: i) the asymmetry of the inefficiency term; ii) the asymmetry of the random error; and iii) the structure of dependence between the error components. This decomposition suggests that the “wrong skewness” anomaly is an ill-posed problem, because we cannot establish ex ante the expected sign of the asymmetry of the composite error. We report a relevant special case that allows us to estimate the three components of the asymmetry of the composite error and, consequently, to interpret the estimated sign. We present two empirical applications. In the first dataset, where the classic SF displays wrong skewness, estimation of our model rejects the dependence hypothesis but accepts the asymmetry of the random error, thus justifying the sign of the skewness of the composite error. In the second dataset, where the classic SF does not display any anomaly, estimation of our model provides evidence of the presence of both dependence between the error components and asymmetry of the random error. 
Keywords:  Stochastic frontier models, Skewness, Generalised Logistic distribution, Dependence, Copula functions. 
JEL:  C13 C18 C46 D24 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:63429&r=ecm 
By:  Braione, Manuela (Université catholique de Louvain, CORE, Belgium); Scholtes, Nicolas K. (Université catholique de Louvain, CORE, Belgium) 
Abstract:  Financial asset returns are known to be conditionally heteroskedastic and generally non-normally distributed, fat-tailed and often skewed. In order to account for both the skewness and the excess kurtosis in returns, we combine the BEKK model from the multivariate GARCH literature with different multivariate densities for the returns. The set of distributions we consider comprises the normal, the Student t, the Multivariate Exponential Power and their skewed counterparts. Applying this framework to a sample of ten assets from the Dow Jones Industrial Average Index, we compare the performance of equally weighted portfolios derived from the symmetric and skewed distributions in forecasting out-of-sample Value-at-Risk. The accuracy of the VaR forecasts is assessed by implementing standard statistical backtesting procedures. The results unanimously show that the inclusion of fat-tailed densities into the model specification yields more accurate VaR forecasts, while the further addition of skewness does not lead to significant improvements. 
Keywords:  Dow Jones industrial average, BEKK model, maximum likelihood, Value-at-Risk 
JEL:  C01 C22 C52 C58 
Date:  2014–11–18 
URL:  http://d.repec.org/n?u=RePEc:cor:louvco:2014059&r=ecm 
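One of the standard backtests alluded to is Kupiec's proportion-of-failures (POF) likelihood-ratio test of unconditional coverage, which checks whether VaR violations occur at the nominal rate. A minimal sketch (the paper does not specify which backtests it uses beyond "standard procedures"):

```python
import numpy as np
from scipy import stats

def kupiec_pof_test(returns, var_forecasts, alpha=0.05):
    """Kupiec proportion-of-failures test: do VaR violations occur at the
    nominal rate alpha? VaR is expressed as a positive loss threshold."""
    viol = np.asarray(returns) < -np.asarray(var_forecasts)
    n, x = len(viol), int(viol.sum())
    pi = x / n                                    # empirical violation rate
    if x in (0, n):                               # degenerate boundary cases
        lr = -2 * n * (np.log(1 - alpha) if x == 0 else np.log(alpha))
    else:
        lr = -2 * ((n - x) * np.log((1 - alpha) / (1 - pi))
                   + x * np.log(alpha / pi))
    return lr, stats.chi2.sf(lr, df=1)            # LR statistic and p-value
```

When the empirical violation rate equals the nominal rate exactly, the LR statistic is zero and the p-value is one.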
By:  Ellis Scharfenaker (Department of Economics, New School for Social Research) 
Abstract:  The distribution of profit rates in the U.S. economy for 21,714 firms from 1962–2012 appears to be highly organized in a Laplace-like distribution. Positive profit rate deviations from the mode appear to be remarkably stationary over time, displaying little parametric change, while negative profit rate deviations introduce an asymmetry into the distribution that appears to fluctuate over time. In this paper I propose a model of “classically” competitive firms facing informational entropy constraints in their decisions to potentially enter or exit markets based on profit rate differentials. The result is a three-parameter logit quantal response distribution for firm entry and exit decisions. Bayesian methods are used for inference into the distribution of entry and exit decisions conditional on profit rate deviations, and firm-level data from Compustat is used to test these predictions. The model parameters show a fluctuating asymmetry in firm exit decisions, an increase in the dispersion of negative profit rate differentials, and a falling general rate of profit. 
Keywords:  Firm competition, Laplace distribution, Gibbs sampler, profit rate, statistical equilibrium, rational inattention, information theory, quantal response 
JEL:  C10 C15 D20 D22 E10 L11 
Date:  2015–03 
URL:  http://d.repec.org/n?u=RePEc:new:wpaper:1507&r=ecm 
By:  BAUWENS, Luc (Université catholique de Louvain, CORE, Belgium); BRAIONE, Manuela (Université catholique de Louvain, CORE, Belgium); STORTI, Giuseppe (Università di Salerno) 
Abstract:  Novel model specifications that include a time-varying long-run component in the dynamics of realized covariance matrices are proposed. The adopted modeling framework allows the secular component to enter the model structure either in an additive fashion or as a multiplicative factor, and to be specified parametrically, using a MIDAS filter, or nonparametrically. Estimation is performed by maximizing a Wishart quasi-likelihood function. The one-step-ahead forecasting performance of the models is assessed by means of three approaches: the Model Confidence Set, (global) minimum variance portfolios and Value-at-Risk. The results provide evidence in favour of the hypothesis that the proposed models outperform benchmarks incorporating a constant long-run component, both in- and out-of-sample. 
Keywords:  Realized covariance, component dynamic models, MIDAS, minimum variance portfolio, Model Confidence Set, Value-at-Risk 
JEL:  C13 C32 C58 
Date:  2014–11–30 
URL:  http://d.repec.org/n?u=RePEc:cor:louvco:2014053&r=ecm 
By:  Dube, Arindrajit (University of Massachusetts Amherst); Zipperer, Ben (University of Massachusetts Amherst) 
Abstract:  We propose a simple, distribution-free method for pooling synthetic control case studies using the mean percentile rank. We also test for heterogeneous treatment effects using the distribution of estimated ranks, which has a known form. We propose a cross-validation-based procedure for model selection. Using 29 cases of state minimum wage increases between 1979 and 2013, we find a sizable, positive and statistically significant effect on the average teen wage. We do detect heterogeneity in the wage elasticities, consistent with differential bite of the policy. In contrast, the employment estimates suggest a small constant effect not distinguishable from zero. 
Keywords:  synthetic controls, program evaluation, heterogeneous treatment effects, minimum wage 
JEL:  J38 J23 J88 
Date:  2015–03 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp8944&r=ecm 
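The pooling idea — rank each treated estimate within its placebo distribution, then average the percentile ranks across cases — can be sketched as follows. This is illustrative: the paper's inference exploits the known (Irwin-Hall) distribution of the mean of uniform ranks, whereas the sketch uses its normal approximation:

```python
import numpy as np
from scipy import stats

def pooled_rank_test(effects):
    """Pool synthetic-control case studies by the mean percentile rank of each
    treated effect within its placebo distribution (illustrative sketch).
    `effects` is a list of (treated_effect, placebo_effects) pairs."""
    ranks = []
    for treated, placebos in effects:
        pool = np.append(np.asarray(placebos, float), treated)
        # Percentile rank of the treated unit among treated + placebos.
        ranks.append(stats.rankdata(pool)[-1] / len(pool))
    ranks = np.asarray(ranks)
    K = len(ranks)
    # Under no effect, the mean rank behaves like the mean of K uniforms:
    # approximately N(1/2, 1/(12K)) for moderate K.
    z = (ranks.mean() - 0.5) / np.sqrt(1 / (12 * K))
    return ranks.mean(), 2 * stats.norm.sf(abs(z))  # mean rank, two-sided p
```

A mean rank far from one half across many cases signals a pooled effect that no single case study could establish on its own.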
By:  Aliprantis, Dionissi (Federal Reserve Bank of Cleveland) 
Abstract:  Structural Causal Models define causal effects in terms of a single Data Generating Process (DGP), and the Rubin Causal Model defines causal effects in terms of a model that can represent counterfactuals from many DGPs. Under these different definitions, notationally similar causal effects make distinct claims about the results of interventions to the system under investigation: Structural equations imply conditional independencies in the data that potential outcomes do not. One implication is that the DAG of a Rubin Causal Model is different from the DAG of a Structural Causal Model. Another is that Pearl’s do-calculus does not apply to potential outcomes and the Rubin Causal Model. 
Keywords:  Structural Equation; Potential Outcome; Invariance; Autonomy 
JEL:  C00 C01 C31 
Date:  2015–03–27 
URL:  http://d.repec.org/n?u=RePEc:fip:fedcwp:1505&r=ecm 
By:  Yaman, F.; Cubí‐Mollá, P. 
Abstract:  In many contexts, reported outcomes on a rating scale are modeled through the existence of a latent variable that separates the categories through thresholds. The literature has not been able to separate the effect of a variable on the latent variable from its effect on the threshold parameters. We propose a model which incorporates (1) individual fixed effects on the latent variable, (2) individual fixed effects on the thresholds, and (3) threshold shifts across time depending on observables. Importantly, the latent variable and the threshold specifications can include common variables. In order to illustrate the estimator, we apply it to a model of life satisfaction using the GSOEP dataset. We demonstrate that important differences can arise depending on the choice of the model. Our model suggests that threshold shifts are statistically and quantitatively important. Factors which increase reported life satisfaction do so both through positive effects on the latent variable AND by shifting thresholds to the left, while factors which decrease reported life satisfaction do so both through negative effects on the latent variable AND by shifting thresholds to the right. 
Keywords:  Ordered choice; fixed effects; subjective well-being; life satisfaction 
URL:  http://d.repec.org/n?u=RePEc:cty:dpaper:8123&r=ecm 
By:  Giovanni Angelini; Luca Fanelli 
Abstract:  This paper focuses on the dynamic misspecification that characterizes the class of small-scale New Keynesian models and provides a `natural' remedy for the typical difficulties these models have in accounting for the rich contemporaneous and dynamic correlation structure of the data, difficulties generally addressed with ad hoc shock specifications. We suggest using the `best fitting' statistical model for the data as a device through which it is possible to adapt the econometric specification of the New Keynesian model. The statistical model may feature an autocorrelation structure that is more involved than the one implied by the structural model's reduced-form solution under rational expectations, and it is treated as the actual expectations-generating mechanism of the agents. A pseudo-structural form is built from the baseline system of Euler equations by forcing the state vector of the system to have the same dimension as the state vector characterizing the statistical model. We provide an empirical illustration based on U.S. quarterly data and a small-scale monetary New Keynesian model. 
Keywords:  Dynamic stochastic general equilibrium model, Expectations, Kalman filter, New Keynesian models, State space model. 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:bot:quadip:wpaper:125&r=ecm 