
on Econometrics 
By:  Jinhyun Lee (University of St Andrews) 
Abstract:  This paper proposes a novel way of testing exogeneity of an explanatory variable without any parametric assumptions in the presence of a "conditional" instrumental variable. A testable implication is derived: if an explanatory variable is endogenous, the conditional distribution of the outcome given the endogenous variable is not independent of its instrumental variable(s). The test rejects the null hypothesis with probability one if the explanatory variable is endogenous, and it detects alternatives converging to the null at rate n^{1/2}. We propose a consistent nonparametric bootstrap test to implement this testable implication. We show that the proposed bootstrap test can be asymptotically justified in the sense that it produces asymptotically correct size under the null of exogeneity, and it has unit power asymptotically. Our nonparametric test can be applied to cases in which the outcome is generated by an additively nonseparable structural relation or in which the outcome is discrete, cases that have not been studied in the literature. 
URL:  http://d.repec.org/n?u=RePEc:san:wpecon:1316&r=ecm 
By:  Solberger M.; Zhou X. (GSBE) 
Abstract:  We consider an exact factor model and derive a Lagrange multiplier-type test for unit roots in the idiosyncratic components. The asymptotic distribution of the statistic is derived under the misspecification that the differenced factors are white noise. We prove that the asymptotic distribution is independent of the distribution of the factors, and that the factors may be integrated, cointegrated, or stationary. In a simulation study, size and power are compared with some popular second-generation panel unit root tests. The simulations suggest that our statistic is well-behaved in terms of size and that it is powerful and robust in comparison with existing tests. 
Keywords:  Hypothesis Testing: General; Single Equation Models; Single Variables: Models with Panel Data; Longitudinal Data; Spatial Time Series; 
JEL:  C12 C23 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:dgr:umagsb:2013058&r=ecm 
By:  Zhou X.; Solberger M. (GSBE) 
Abstract:  Recent developments within the panel unit-root literature have illustrated how the exact factor model serves as a parsimonious framework and allows for consistent maximum likelihood inference even when it is misspecified relative to the more general approximate factor model. In this paper we consider an exact factor model with AR(1) dynamics and propose LM-type tests for idiosyncratic and common unit roots. We derive the asymptotic distributions and carry out simulations to investigate the size and power of the tests in finite samples, as well as compare their performance with some existing tests. 
Keywords:  Hypothesis Testing: General; Single Equation Models; Single Variables: Models with Panel Data; Longitudinal Data; Spatial Time Series; 
JEL:  C12 C23 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:dgr:umagsb:2013059&r=ecm 
By:  Audrino, Francesco; Camponovo, Lorenzo 
Abstract:  We derive new theoretical results on the properties of the adaptive least absolute shrinkage and selection operator (adaptive lasso) for time series regression models. In particular we investigate the question of how to conduct finite sample inference on the parameters given an adaptive lasso model for some fixed value of the shrinkage parameter. Central to this study is the test of the hypothesis that a given adaptive lasso parameter equals zero, which therefore tests for a false positive. To this end we construct a simple (conservative) testing procedure and show, theoretically and empirically through extensive Monte Carlo simulations, that the adaptive lasso combines efficient parameter estimation, variable selection, and valid finite sample inference in one step. Moreover, we analytically derive a bias correction factor that is able to significantly improve the empirical coverage of the test on the active variables. Finally, we apply the introduced testing procedure to investigate the relation between the short rate dynamics and the economy, thereby providing a statistical foundation (from a model choice perspective) to the classic Taylor rule monetary policy model. 
Keywords:  Adaptive lasso; Time series; Oracle properties; Finite sample inference; Taylor rule monetary policy model. 
JEL:  C12 C22 E43 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:usg:econwp:2013:27&r=ecm 
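The paper's testing procedure is not reproduced here, but the adaptive lasso's key ingredient, data-dependent soft-thresholding, can be illustrated in the special case of an orthonormal design, where the estimator has a closed form. This is a minimal sketch, not the authors' method; the penalty level and coefficients below are arbitrary illustrative values.

```python
import numpy as np

def adaptive_lasso_orthonormal(b_ols, lam, gamma=1.0):
    """Closed-form adaptive lasso when the design satisfies X'X = I.

    Each OLS coefficient is soft-thresholded with its own data-dependent
    threshold lam / |b_ols|**gamma, so large (likely active) coefficients
    are penalised less than small ones, which are set exactly to zero.
    """
    w = 1.0 / np.abs(b_ols) ** gamma       # adaptive weights
    thresh = lam * w                       # per-coefficient threshold
    return np.sign(b_ols) * np.maximum(np.abs(b_ols) - thresh, 0.0)

# illustrative OLS estimates: two strong signals, two near-zero noise terms
b_ols = np.array([3.0, -2.0, 0.05, 0.01])
b_al = adaptive_lasso_orthonormal(b_ols, lam=0.1)
```

The two small coefficients are thresholded to exactly zero (the "false positives" the paper's test is concerned with), while the large coefficients are shrunk only slightly.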
By:  Maciejowska, Katarzyna 
Abstract:  In this article, a new approach to model specification is proposed. The method allows one to choose the correct order of a mixture model by testing whether a particular mixture component is significant. The hypotheses are set up in a new way in order to avoid the identification problems typical of mixture models. If some of the parameters are known, the distribution of the LR statistic is chi-squared, with the degrees of freedom depending on the number of components and the number of parameters in each component. The advantage of the new approach is its simplicity and computational feasibility. 
Keywords:  normal mixture models, likelihood ratio test, homogeneity test, hypotheses setting 
JEL:  C10 C12 
Date:  2013–10–16 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:50303&r=ecm 
By:  Matyas Barczy; Gyula Pap 
Abstract:  We study asymptotic properties of maximum likelihood estimators for Heston models based on continuous time observations. We distinguish three cases: subcritical (also called ergodic), critical and supercritical. 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1310.4783&r=ecm 
By:  Sucarrat, Genaro; Escribano, Alvaro 
Abstract:  A critique that has been directed towards the log-GARCH model is that its log-volatility specification does not exist in the presence of zero returns. A common "remedy" is to replace the zeros with a small (in absolute value) non-zero value. However, this renders Quasi Maximum Likelihood (QML) estimation asymptotically biased. Here, we propose a solution for the case where actual returns are equal to zero with probability zero, but zeros are nevertheless observed because of measurement error (due to missing values, discreteness approximation error, etc.). The solution treats zeros as missing values and handles them by combining QML estimation via the ARMA representation with the Expectation-Maximisation (EM) algorithm. Monte Carlo simulations confirm that the solution corrects the bias, and several empirical applications illustrate that the bias-correcting estimator can make a substantial difference. 
Keywords:  ARCH, exponential GARCH, log-GARCH, ARMA, Expectation-Maximisation (EM) 
JEL:  C22 C58 
Date:  2013–09–09 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:50699&r=ecm 
By:  Reese, Simon (Department of Economics, Lund University); Li, Yushu (Department of Business and Management Science, Norwegian School of Economics) 
Abstract:  This paper investigates how classical measurement error and additive outliers influence tests for structural change based on F-statistics. We derive theoretically the impact of general additive disturbances in the regressors on the asymptotic distribution of these tests for structural change. The small-sample properties in the case of classical measurement error and additive outliers are investigated via Monte Carlo simulations, revealing that sizes are biased upwards and powers are reduced. Two wavelet-based denoising methods are used to reduce these distortions. We show that these two methods can significantly improve the performance of structural break tests. 
Keywords:  Structural breaks; measurement error; additive outliers; wavelet transform; empirical Bayes thresholding 
JEL:  C11 C12 C15 
Date:  2013–10–11 
URL:  http://d.repec.org/n?u=RePEc:hhs:lunewp:2013_036&r=ecm 
By:  Hyungsik Roger Moon; Martin Weidner (Institute for Fiscal Studies and UCL) 
Abstract:  In this paper we study the least squares (LS) estimator in a linear panel regression model with interactive fixed effects for asymptotics where both the number of time periods and the number of cross-sectional units go to infinity. Under appropriate assumptions we show that the limiting distribution of the LS estimator for the regression coefficients is independent of the number of interactive fixed effects used in the estimation, as long as this number does not fall below the true number of interactive fixed effects present in the data. The important practical implication of this result is that for inference on the regression coefficients one does not necessarily need to estimate the number of interactive effects consistently, but can rely on an upper bound of this number to calculate the LS estimator. Supplementary material for this paper is available online. 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:49/13&r=ecm 
By:  Antonia Arsova (Leuphana University Lueneburg, Germany); Deniz Dilan Karaman Oersal (Leuphana University Lueneburg, Germany) 
Abstract:  This paper proposes a new likelihood-based panel cointegration rank test which extends the test of Oersal and Droge (2012) (henceforth Panel SL test) to allow for cross-sectional dependence. The dependence is modelled by unobserved common factors which affect the variables in each cross-section through heterogeneous loadings. The common components are estimated following the panel analysis of nonstationarity in idiosyncratic and common components (PANIC) approach of Bai and Ng (2004), and the estimates are subtracted from the observations. The cointegrating rank of the defactored data is then tested by the Panel SL test. A Monte Carlo study demonstrates that the proposed testing procedure has reasonable size and power properties in finite samples. 
Keywords:  panel cointegration rank test, cross-sectional dependence, common factors, likelihood ratio, time trend 
JEL:  C12 C15 C33 
Date:  2013–08 
URL:  http://d.repec.org/n?u=RePEc:lue:wpaper:280&r=ecm 
By:  Hadri, Kaddour; Kurozumi, Eiji; Rao, Yao 
Abstract:  In this paper, we propose new cointegration tests for single equations and panels. In both cases, the asymptotic distributions of the tests, which are derived with N fixed and T going to infinity, are shown to be standard normal. The effects of serial correlation and cross-sectional dependence are mopped out via long-run variances. An effective bias correction is derived, which is shown to work well in finite samples, particularly when N is smaller than T. Our panel tests are robust to possible cointegration across units. 
Keywords:  cointegration, panel cointegration, crosssection dependence, bias correction, DOLS, FCLT 
JEL:  C12 C15 C22 C23 
Date:  2013–09 
URL:  http://d.repec.org/n?u=RePEc:hit:econdp:201312&r=ecm 
By:  Westerlund J.; Smeekes S. (GSBE) 
Abstract:  Most panel data studies of the predictability of returns presume that the cross-sectional units are independent, an assumption that is not realistic. As a response to this, the current paper develops block bootstrap-based panel predictability tests that are valid under very general conditions. Some of the allowable features include heterogeneous predictive slopes, persistent predictors, and complex error dynamics, including cross-unit endogeneity. 
Keywords:  Statistical Simulation Methods: General; Single Equation Models; Single Variables: Time-Series Models; Dynamic Quantile Regressions; Dynamic Treatment Effect Models; Single Equation Models; Single Variables: Models with Panel Data; Longitudinal Data; Spatial Time Series; Financial Crises; Asset Pricing; Trading Volume; Bond Interest Rates; 
JEL:  C15 C22 C23 G01 G12 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:dgr:umagsb:2013060&r=ecm 
By:  Toru Kitagawa (Institute for Fiscal Studies and University College London) 
Abstract:  This paper develops a specification test for the instrument validity conditions in the heterogeneous treatment effect model with a binary treatment and a discrete instrument. A necessary testable implication for the joint restriction of instrument exogeneity and instrument monotonicity is given by nonnegativity of the point-identifiable compliers' outcome densities. Our specification test infers this testable implication using a Kolmogorov-Smirnov-type test statistic. We provide a bootstrap algorithm to implement the proposed test and show its asymptotic validity. The proposed test procedure applies to both discrete and continuous outcome cases. 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:53/13&r=ecm 
By:  Emilio Zanetti Chini (University of Rome "Tor Vergata") 
Abstract:  This paper introduces a variant of the smooth transition autoregression (STAR). The proposed model is able to parametrize the asymmetry in the tails of the transition equation by using a particular generalization of the logistic function. The null hypothesis of symmetric adjustment toward a new regime is tested by building two different LM-type tests. The first one maintains the original parametrization, while the second one is based on a third-order expanded auxiliary regression. Three diagnostic tests for no error autocorrelation, no additive asymmetry and parameter constancy are also discussed. The empirical size and power of the new symmetry tests, as well as of the diagnostic tests, are investigated in an extensive Monte Carlo experiment. An empirical application of the generalized STAR (GSTAR) model to four economic time series reveals that asymmetry in the transition between two regimes is a feature to be considered in economic analysis. 
Date:  2013–10–15 
URL:  http://d.repec.org/n?u=RePEc:rtv:ceisrp:294&r=ecm 
By:  Romuald Meango; Ismael Mourifie 
Abstract:  This paper addresses the question of whether exclusion restrictions on the exogenous regressors are necessary to identify two-equation probit models with an endogenous dummy regressor. Contradictory opinions have been expressed in the literature on the necessity of an exclusion restriction. Wilde (2000) argued that an exclusion restriction is not necessary, and proposed a simple criterion for identification in this model. We contradict his result, and show how the inherent incompleteness of the model leads to failure of (point) identification. We provide an exact identification proof when an exclusion restriction is available. 
Keywords:  Probit model, Endogenous dummy regressor, Partial identification 
JEL:  C35 
Date:  2013–10–14 
URL:  http://d.repec.org/n?u=RePEc:tor:tecipa:tecipa503&r=ecm 
By:  Otsu, Taisuke; Pesendorfer, Martin; Takahashi, Yuya 
Abstract:  This paper proposes several statistical tests for finite state Markov games to examine the null hypothesis that the data are generated from a single equilibrium. We formulate tests of (i) the conditional choice probabilities, (ii) the steady-state distribution of states and (iii) the distribution of states conditional on an initial state. In a Monte Carlo study we find that the chi-squared test of the steady-state distribution performs well and has high power even with a small number of markets and time periods. We apply the chi-squared test to the empirical application of Ryan (2012) that analyzes dynamics of the U.S. Portland Cement industry, and test whether his assumption of a single equilibrium is supported by the data. 
Keywords:  Dynamic Markov Game; Multiplicity of Equilibria; Testing 
JEL:  C12 C72 D44 
Date:  2013–04 
URL:  http://d.repec.org/n?u=RePEc:trf:wpaper:423&r=ecm 
By:  Jeffrey Penney (Queen's University) 
Abstract:  I derive a rigorous method to help determine whether a true parameter takes a value between two arbitrarily chosen points for a given level of confidence via a multiple testing procedure which strongly controls the familywise error rate. For any test size, the distance between the upper and lower bounds can be made smaller than that created by a confidence interval. The procedure is more powerful than other multiple testing methods that test the same hypothesis. This test can be used to provide an affirmative answer about the existence of a negligible effect. 
Keywords:  familywise error, multiple testing, null effect, partial identification, precise zero 
JEL:  C12 C18 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:qed:wpaper:1319&r=ecm 
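The paper's own multiple testing procedure is not spelled out in the abstract; a minimal related sketch is the classical intersection-union logic (as in two one-sided tests): conclude the parameter lies strictly between two chosen bounds only if both one-sided hypotheses are rejected, which keeps the overall size at or below the nominal level. The bounds and data below are illustrative assumptions, not from the paper.

```python
import numpy as np
from scipy import stats

def between_bounds_test(x, lower, upper, alpha=0.05):
    """Intersection-union test: conclude lower < mu < upper only if
    H0: mu <= lower and H0: mu >= upper are both rejected at level alpha.
    Because both must reject, the overall size is at most alpha."""
    p_low = stats.ttest_1samp(x, lower, alternative='greater').pvalue
    p_high = stats.ttest_1samp(x, upper, alternative='less').pvalue
    return max(p_low, p_high) < alpha, (p_low, p_high)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=400)   # true mean 0: a "negligible effect"
inside, (p_low, p_high) = between_bounds_test(x, -0.5, 0.5)
```

With the true mean at zero and bounds at ±0.5, both one-sided tests reject and the procedure gives an affirmative answer about a negligible effect, which is exactly the use case the abstract highlights.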
By:  Juan Carlos Escanciano; Wei Li 
Abstract:  This paper asks which aspects of a structural Nonparametric Instrumental Variables Regression (NPIVR) can be identified well and which ones cannot. It contributes to answering this question by characterising the identified set of linear continuous functionals of the NPIVR under norm constraints. Each element of the identified set of NPIVR can be written as the sum of a common 'identifiable component' and an idiosyncratic 'unidentifiable component'. The identified set for any continuous linear functional is shown to be a closed interval, whose midpoint is the functional applied to the 'identifiable component'. The formula for the length of the identified set extends the popular omitted variables formula of classical linear regression. Some examples illustrate the wide applicability and utility of our identification result, including bounds and a new identification condition for point-evaluation functionals. The main ideas are illustrated with an empirical application of the effect of children on labour market outcomes. 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:48/13&r=ecm 
By:  Frank Schorfheide (Department of Economics, University of Pennsylvania); Kenneth I. Wolpin (Department of Economics, University of Pennsylvania) 
Abstract:  A recent literature has developed that combines two prominent empirical approaches to ex ante policy evaluation: randomized controlled trials (RCT) and structural estimation. The RCT provides a "gold-standard" estimate of a particular treatment, but only of that treatment. Structural estimation provides the capability to extrapolate beyond the experimental treatment, but is based on untestable assumptions and is subject to structural data mining. Combining the approaches by holding out from the structural estimation exercise either the treatment or control sample allows for external validation of the underlying behavioral model. Although intuitively appealing, this holdout methodology is not well grounded. For instance, it is easy to show that it is suboptimal from a Bayesian perspective. Using a stylized representation of a randomized controlled trial, we provide a formal rationale for the use of a holdout sample in an environment in which data mining poses an impediment to the implementation of the ideal Bayesian analysis, and a numerical illustration of the potential benefits of holdout samples. 
Keywords:  Bayesian Analysis, Model Selection, Principal-Agent Models, Randomized Controlled Trials 
JEL:  C11 C31 C52 
Date:  2013–10–14 
URL:  http://d.repec.org/n?u=RePEc:pen:papers:13059&r=ecm 
By:  Woutersen, Tiemen; Khandker, Shahidur R. 
Abstract:  This paper proposes an estimator for the endogenous switching regression models with fixed effects. The estimator allows for endogenous selection and for conditional heteroscedasticity in the outcome equation. Applying the estimator to a dataset on the productivity in agriculture substantially changes the conclusions compared to earlier analysis of the same dataset. 
Keywords:  Economic Theory & Research, E-Business, Knowledge for Development, Econometrics, Labor Policies 
Date:  2013–10–01 
URL:  http://d.repec.org/n?u=RePEc:wbk:wbrwps:6658&r=ecm 
By:  Magnus, Jan R; Vasnev, Andrey 
Abstract:  Sensitivity analysis is important for its own sake and also in combination with diagnostic testing. We consider the question of how to use sensitivity statistics in practice, in particular how to judge whether sensitivity is large or small. For this purpose we distinguish between absolute and relative sensitivity and highlight the context-dependent nature of any sensitivity analysis. Relative sensitivity is then applied in the context of forecast combination, and sensitivity-based weights are introduced. All concepts are illustrated using the European yield curve. In this context it is natural to look at sensitivity to autocorrelation and normality assumptions. Different forecasting models are combined with equal, fit-based and sensitivity-based weights, and compared with the multivariate and random walk benchmarks. We show that the fit-based weights and the sensitivity-based weights are complementary. For long-term maturities the sensitivity-based weights perform better than other weights. 
Date:  2013–03 
URL:  http://d.repec.org/n?u=RePEc:syb:wpbsba:2123/8964&r=ecm 
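The sensitivity-based weights proposed in the paper are not reproduced here; as background, a minimal sketch of the two benchmarks they are compared against, equal weights and fit-based (inverse in-sample MSE) weights, on assumed toy forecast errors:

```python
import numpy as np

def fit_based_weights(errors):
    """Inverse-MSE combination weights: models with a smaller in-sample
    mean squared error receive a larger weight; weights sum to one.
    `errors` is a (T, k) array of in-sample errors of k models."""
    mse = np.mean(errors ** 2, axis=0)
    return (1.0 / mse) / np.sum(1.0 / mse)

rng = np.random.default_rng(1)
T = 200
errors = np.column_stack([rng.normal(0, 0.5, T),   # accurate model
                          rng.normal(0, 1.5, T)])  # noisy model
w_fit = fit_based_weights(errors)
w_equal = np.full(2, 0.5)
```

The fit-based scheme shifts most of the weight to the model with the better in-sample fit, whereas the equal-weight benchmark ignores fit altogether; the paper's point is that sensitivity information is a complementary third criterion.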
By:  Pauwels, Laurent; Vasnev, Andrey 
Abstract:  The problem of finding appropriate weights to combine several density forecasts is an important issue currently debated in the forecast combination literature. Recently, a paper by Hall and Mitchell (IJF, 2007) proposes to combine density forecasts with optimal weights obtained from solving an optimization problem. This paper studies the properties of this optimization problem when the number of forecasting periods is relatively small and finds that it often produces corner solutions by allocating all the weight to one density forecast only. This paper's practical recommendation is to have an additional training sample period for the optimal weights. While reserving a portion of the data for parameter estimation and making pseudo-out-of-sample forecasts are common practices in the empirical literature, employing a separate training sample for the optimal weights is novel, and it is suggested because it decreases the chances of corner solutions. Alternative log-score or quadratic-score weighting schemes do not have this training sample requirement. 
Date:  2013–01 
URL:  http://d.repec.org/n?u=RePEc:syb:wpbsba:2123/8932&r=ecm 
By:  Tatsuya Kubokawa (Faculty of Economics, University of Tokyo); Muni S. Srivastava (Department of Statistics, University of Toronto) 
Abstract:  The problem of estimating the covariance matrix of normal and nonnormal distributions is addressed when both the sample size and the dimension of the covariance matrix tend to infinity. In this paper, we consider a class of ridge-type estimators which are linear combinations of the unbiased estimator and the identity matrix multiplied by a scalar statistic, and we derive a leading term of their risk functions relative to a quadratic loss function. Within this class, we obtain the optimal ridge-type estimator by minimizing the leading term in the risk approximation. It is interesting to note that the optimal weight is based on a statistic for testing sphericity of the covariance matrix. 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2013cf906&r=ecm 
By:  Ladislav Kristoufek 
Abstract:  In this short report, we investigate the ability of the DCCA coefficient to measure the correlation level between nonstationary series. Based on an extensive Monte Carlo simulation study, we show that the DCCA coefficient can estimate the correlation coefficient accurately regardless of the strength of nonstationarity (measured by the fractional differencing parameter $d$). For comparison, we also report the results for the standard Pearson's correlation coefficient. The DCCA coefficient dominates Pearson's coefficient for nonstationary series. 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1310.3984&r=ecm 
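A minimal sketch of the DCCA coefficient (Zebende, 2011) the report studies: integrate both series, detrend them linearly within boxes of size s, and divide the average detrended covariance by the product of the two DFA fluctuation functions. The simulated data below (two series sharing a common component, true correlation 0.8) are an illustrative assumption, not the report's simulation design.

```python
import numpy as np

def dcca_coefficient(x, y, s):
    """DCCA coefficient at box size s: detrended covariance of the
    integrated series over the geometric mean of the two detrended
    variances; lies in [-1, 1] by the Cauchy-Schwarz inequality."""
    X = np.cumsum(x - np.mean(x))
    Y = np.cumsum(y - np.mean(y))
    t = np.arange(s)
    f_xy = f_xx = f_yy = 0.0
    for i in range(len(X) // s):
        xb, yb = X[i * s:(i + 1) * s], Y[i * s:(i + 1) * s]
        # local linear detrending within the box
        xr = xb - np.polyval(np.polyfit(t, xb, 1), t)
        yr = yb - np.polyval(np.polyfit(t, yb, 1), t)
        f_xy += np.mean(xr * yr)
        f_xx += np.mean(xr ** 2)
        f_yy += np.mean(yr ** 2)
    return f_xy / np.sqrt(f_xx * f_yy)

rng = np.random.default_rng(2)
z = rng.normal(size=20000)                     # common component
x = z + rng.normal(scale=0.5, size=20000)      # corr(x, y) = 1/1.25 = 0.8
y = z + rng.normal(scale=0.5, size=20000)
rho = dcca_coefficient(x, y, s=20)
```

For these stationary series the DCCA estimate should land near the true correlation of 0.8; the report's contribution is showing that this accuracy survives as $d$ pushes the series towards nonstationarity, where Pearson's coefficient breaks down.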
By:  Paolo Giudici (Department of Economics and Management, University of Pavia); Alessandro Spelta (Department of Economics and Management, University of Pavia) 
Abstract:  The late-2000s financial crisis has stressed the need to understand the world financial system as a network of countries, where cross-border financial linkages play a fundamental role in the spread of systemic risks. Financial network models, which take into account the complex interrelationships between countries, seem to be an appropriate tool in this context. In this paper we propose to enrich the topological perspective of network models with a more structured statistical framework, that of graphical Gaussian models, which can be employed to accurately estimate the adjacency matrix, the main input for the estimation of the interconnections between different countries. We consider different types of graphical models: besides classical ones, we introduce Bayesian graphical models, which can take model uncertainty into account, and dynamic Bayesian graphical models, which provide a convenient framework to model temporal cross-border data, decomposing the model into autoregressive and contemporaneous networks. The paper shows how the application of the proposed models to the Bank for International Settlements locational banking statistics allows the identification of four distinct groups of countries that can be considered central in systemic risk contagion. 
Keywords:  Financial network models, Graphical models, Bayesian model selection 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:pav:demwpp:demwp0052&r=ecm 
By:  Liu, Di; Murtazashvili, Irina; Prokhorov, Artem 
Abstract:  We estimate intergenerational income mobility in the USA and Sweden. To measure the degree to which income status is transmitted from one generation to another, we propose a nonparametric estimator which is particularly relevant for cross-country comparisons. Our approach allows intergenerational mobility to vary across observable family characteristics. Furthermore, it fits situations when data on fathers and sons come from different samples. Finally, our estimator is consistent in the presence of measurement error in fathers' long-run economic status. We find that family background, captured by fathers' education, matters for intergenerational income persistence in the USA more than in Sweden, suggesting that the character of inequality in the two countries is rather different. 
Keywords:  GMM estimation; intergenerational income mobility 
Date:  2013–08–07 
URL:  http://d.repec.org/n?u=RePEc:syb:wpbsba:2123/9293&r=ecm 
By:  Gianbiagio Curato; Fabrizio Lillo 
Abstract:  Large tick assets, i.e. assets where one tick movement is a significant fraction of the price and the bid-ask spread is almost always equal to one tick, display dynamics in which price changes and spread are strongly coupled. We introduce a Markov-switching modeling approach for price changes, where the latent Markov process is the transition between spreads. We then use a finite Markov mixture of logit regressions to describe the dependence of the probability of price changes on past squared returns. The model can thus be seen as a Double Chain Markov Model. We show that the model describes the shape of the return distribution at different time aggregations, volatility clustering, and the anomalous decrease of the kurtosis of returns. We calibrate our model on Nasdaq stocks and show that it reproduces remarkably well the statistical properties of real data. 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1310.4539&r=ecm 
By:  Yonghong An (Department of Economics, University of Connecticut); Xun Tang (Department of Economics, University of Pennsylvania) 
Abstract:  We study the nonparametric identification and estimation of a structural model for committee decisions. Members of a committee share a common information set, but differ in ideological bias while processing multiple information sources and in individual tastes while weighing multiple objectives. We consider two cases of the model where committee members have or don't have strategic incentives for making recommendations that conform with the committee decision. For both cases, purestrategy Bayesian Nash equilibria exist, and we show how to use variations in the common information set to recover the distribution of members' private types from individual recommendation patterns. Building on the identification result, we estimate a structural model of interest rate decisions by the Monetary Policy Committee (MPC) at the Bank of England. We find some evidence that recommendations from external committee members are less distorted by strategic incentives than internal members. There is also evidence that MPC members differ more in their tastes for multiple objectives than in ideological bias. 
Keywords:  Committee decisions, nonparametric identification, MPC at the Bank of England 
JEL:  C14 D71 
Date:  2013–10–07 
URL:  http://d.repec.org/n?u=RePEc:pen:papers:13058&r=ecm 
By:  Ron Gallant (Institute for Fiscal Studies and Duke University); Raffaella Giacomini (Institute for Fiscal Studies and UCL); Giuseppe Ragusa 
Abstract:  The contribution of generalized method of moments (Hansen and Singleton, 1982) was to allow frequentist inference regarding the parameters of a nonlinear structural model without having to solve the model, provided there were no latent variables. The contribution of this paper is the same, but with latent variables. 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:50/13&r=ecm 
By:  Martin Gremm 
Abstract:  We propose a random walk model of asset returns where the parameters depend on market stress. Stress is measured by, e.g., the value of an implied volatility index. We show that model parameters including standard deviations and correlations can be estimated robustly and that all distributions are approximately normal. Fat tails in observed distributions occur because time series sample different stress levels and therefore different normal distributions. This provides a quantitative description of the observed distribution, including the fat tails. We discuss simple applications in risk management and portfolio construction. 
Date:  2013–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1310.4538&r=ecm 
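The mechanism in the abstract, fat tails arising from mixing normals across stress levels, is easy to verify in a toy simulation. This is an illustrative sketch with two assumed stress regimes (calm and stressed volatility), not the paper's calibration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 100_000
# two stress regimes, equally likely: calm (sigma = 1) and stressed
# (sigma = 3); within each regime returns are exactly normal
stress = rng.random(n) < 0.5
sigma = np.where(stress, 3.0, 1.0)
returns = rng.normal(0.0, 1.0, n) * sigma

excess_all = stats.kurtosis(returns)            # mixture: fat-tailed
excess_calm = stats.kurtosis(returns[~stress])  # single regime: ~ normal
```

The pooled series has clearly positive excess kurtosis (about 1.9 in theory for this 50/50 scale mixture) even though each regime alone is normal, which is precisely the paper's explanation of observed fat tails.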
By:  Kole, H.J.W.G.; Dijk, D.J.C. van 
Abstract:  The state of the equity market, often referred to as a bull or a bear market, is of key importance for financial decisions and economic analyses. Its latent nature has led to several methods to identify past and current states of the market and to forecast future states. These methods encompass semi-parametric rule-based methods and parametric regime-switching models. We compare these methods by new statistical and economic measures that take into account the latent nature of the market state. The statistical measure is based directly on the predictions, while the economic measure is based on the utility that results when a risk-averse agent uses the predictions in an investment decision. Our application of this framework to the S&P 500 shows that rule-based methods are preferable for (in-sample) identification of the market state, but regime-switching models for (out-of-sample) forecasting. In-sample, only the direction of the market matters, but for forecasting both the means and volatilities of returns are important. Both the statistical and the economic measures indicate that these differences are significant. 
Keywords:  forecast evaluation; regime switching; stock market; economic comparison 
Date:  2013–10–14 
URL:  http://d.repec.org/n?u=RePEc:dgr:eureri:1765041558&r=ecm 