
New Economics Papers on Econometrics 
By:  Jiti Gao; Maxwell King 
Abstract:  This paper proposes a simple and improved nonparametric unit-root test. An asymptotic distribution of the proposed test is established. Finite-sample comparisons with an existing nonparametric test are discussed. Some issues about possible extensions are outlined. 
Keywords:  Autoregression, nonparametric unit-root test, nonstationary time series, specification testing. 
JEL:  C12 C14 C22 
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:201216&r=ecm 
By:  Pang Du; Christopher F. Parmeter; Jeffrey S. Racine 
Abstract:  Nonparametric smoothing under shape constraints has recently received much well-deserved attention. Powerful methods have been proposed for imposing a single shape constraint, such as monotonicity or concavity, on univariate functions. In this paper, we extend the monotone kernel regression method in Hall and Huang (2001) to the multivariate and multi-constraint setting. We impose equality and/or inequality constraints on a nonparametric kernel regression model and its derivatives. A bootstrap procedure is also proposed for testing the validity of the constraints. Consistency of our constrained kernel estimator is provided through an asymptotic analysis of its relationship with the unconstrained estimator. Theoretical underpinnings for the bootstrap procedure are also provided. Illustrative Monte Carlo results are presented and an application is considered. 
Keywords:  shape restrictions, nonparametric regression, multivariate kernel estimation, hypothesis testing 
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:mcm:deptwp:201208&r=ecm 
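A minimal Python sketch of the weight-tilting idea behind the constrained estimator in the paper above (Hall and Huang, 2001): uniform observation weights 1/n are replaced by probability weights p chosen as close to uniform as possible subject to the fitted curve being monotone on a grid. The L1 distance used here (which makes the problem a linear program) and the Gaussian kernel are illustrative simplifications, not the authors' exact choices.

```python
import numpy as np
from scipy.optimize import linprog

def nw_weights(x, x0, h):
    """Nadaraya-Watson kernel weights w_i(x0), Gaussian kernel."""
    k = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return k / k.sum()

def monotone_nw(x, y, grid, h):
    """Kernel fit on `grid`, constrained to be nondecreasing via tilted weights."""
    n = len(x)
    W = np.array([nw_weights(x, g, h) for g in grid])   # G x n
    # Fitted value at grid point g under weights p: sum_i n*p_i*w_i(g)*y_i,
    # which is linear in p.
    A = n * W * y                                       # G x n
    # Monotonicity: fit(g_{j+1}) - fit(g_j) >= 0  ->  (A[j] - A[j+1]) @ p <= 0
    G_mono = A[:-1] - A[1:]
    # Variables [p (n), t (n)] with t_i >= |p_i - 1/n|; minimize sum t.
    c = np.concatenate([np.zeros(n), np.ones(n)])
    A_ub = np.block([
        [G_mono, np.zeros((len(grid) - 1, n))],
        [np.eye(n), -np.eye(n)],     #  p - t <= 1/n
        [-np.eye(n), -np.eye(n)],    # -p - t <= -1/n
    ])
    b_ub = np.concatenate([np.zeros(len(grid) - 1),
                           np.full(n, 1.0 / n), np.full(n, -1.0 / n)])
    A_eq = np.concatenate([np.ones(n), np.zeros(n)])[None, :]  # weights sum to 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (2 * n))
    p = res.x[:n]
    return A @ p                                        # constrained fit on grid
```

If the unconstrained fit already satisfies the constraint, the program returns the uniform weights and the ordinary Nadaraya-Watson fit.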
By:  Joseph P. Romano; Azeem M. Shaikh; Michael Wolf 
Abstract:  This paper considers the problem of testing a finite number of moment inequalities. We propose a two-step approach. In the first step, a confidence region for the moments is constructed. In the second step, this set is used to provide information about which moments are “negative.” A Bonferroni-type correction is used to account for the fact that with some probability the moments may not lie in the confidence region. It is shown that the test controls size uniformly over a large class of distributions for the observed data. An important feature of the proposal is that it remains computationally feasible, even when the number of moments is very large. The finite-sample properties of the procedure are examined via a simulation study, which demonstrates, among other things, that the proposal remains competitive with existing procedures while being computationally more attractive. 
Keywords:  Bonferroni inequality, bootstrap, moment inequalities, partial identification, uniform validity 
JEL:  C12 C14 
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:zur:econwp:090&r=ecm 
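An illustrative sketch of the two-step Bonferroni idea described above, testing H0: mu_j <= 0 for all j. A normal approximation with independent moments replaces the bootstrap of the actual procedure, so this is a simplified stand-in, not the authors' test: step 1 builds 1-beta upper confidence bounds for the moments, step 2 recentres the null distribution of the max statistic at those bounds and tests at level alpha-beta.

```python
import numpy as np
from scipy.stats import norm

def two_step_moment_test(X, alpha=0.05, beta=0.005, n_sim=20000, seed=0):
    """Reject H0: mu_j <= 0 for all j?  X is an n x k data matrix."""
    n, k = X.shape
    mean, sd = X.mean(0), X.std(0, ddof=1)
    t_stat = np.sqrt(n) * mean / sd                  # studentized moments
    # Step 1: Bonferroni upper bounds for sqrt(n)*mu_j/sigma_j at level 1-beta.
    upper = t_stat + norm.ppf(1 - beta / k)
    shift = np.minimum(upper, 0.0)                   # least-favourable recentering
    # Step 2: critical value of max_j (Z_j + shift_j), Z_j iid N(0,1)
    # (cross-moment independence is an extra simplifying assumption here).
    rng = np.random.default_rng(seed)
    sims = (rng.standard_normal((n_sim, k)) + shift).max(1)
    crit = np.quantile(sims, 1 - (alpha - beta))
    return t_stat.max() > crit
```

Moments whose upper bound is clearly negative contribute a large negative shift, which is what keeps the procedure powerful when many inequalities are slack.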
By:  Giuseppe Cavaliere (Department of Statistical Sciences, University of Bologna); Anders Rahbek (Department of Economics, University of Copenhagen and CREATES); A.M.Robert Taylor (School of Economics and Granger Centre for Time Series Econometrics, University of Nottingham) 
Abstract:  In a recent paper Cavaliere et al. (2012) develop bootstrap implementations of the (pseudo) likelihood ratio [PLR] cointegration rank test and associated sequential rank determination procedure of Johansen (1996). The bootstrap samples are constructed using the restricted parameter estimates of the underlying VAR model which obtain under the reduced rank null hypothesis. They propose methods based on an i.i.d. bootstrap resampling scheme and establish the validity of their proposed bootstrap procedures in the context of a cointegrated VAR model with i.i.d. innovations. In this paper we investigate the properties of their bootstrap procedures, together with analogous procedures based on a wild bootstrap resampling scheme, when time-varying behaviour is present in either the conditional or unconditional variance of the innovations. We show that the bootstrap PLR tests are asymptotically correctly sized and, moreover, that the probability that the associated bootstrap sequential procedures select a rank smaller than the true rank converges to zero. This result is shown to hold for both the i.i.d. and wild bootstrap variants under conditional heteroskedasticity but only for the latter under unconditional heteroskedasticity. Monte Carlo evidence is reported which suggests that the bootstrap approach of Cavaliere et al. (2012) significantly improves upon the finite sample performance of corresponding procedures based on either the asymptotic PLR test or an alternative bootstrap method (where the short run dynamics in the VAR model are estimated unrestrictedly) for a variety of conditionally and unconditionally heteroskedastic innovation processes. 
Keywords:  Bootstrap, Cointegration, Trace statistic, Rank determination, heteroskedasticity. 
JEL:  C30 C32 
Date:  2012–08–31 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201236&r=ecm 
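A minimal sketch of the wild bootstrap resampling scheme discussed above, illustrated on a univariate AR(1) unit-root test rather than the paper's full cointegrated VAR: residuals are multiplied by i.i.d. Rademacher draws, which preserves (un)conditional heteroskedasticity patterns in the bootstrap samples, and the bootstrap data are rebuilt with the null (unit root) imposed.

```python
import numpy as np

def ar1_tstat(y):
    """OLS rho, residuals, and t-statistic for H0: rho = 1 in y_t = rho*y_{t-1} + e_t."""
    x, z = y[:-1], y[1:]
    rho = (x @ z) / (x @ x)
    resid = z - rho * x
    se = np.sqrt(resid @ resid / (len(x) - 1) / (x @ x))
    return rho, resid, (rho - 1) / se

def wild_bootstrap_pvalue(y, n_boot=399, seed=0):
    rng = np.random.default_rng(seed)
    _, resid, t_obs = ar1_tstat(y)
    t_boot = np.empty(n_boot)
    for b in range(n_boot):
        w = rng.choice([-1.0, 1.0], size=len(resid))   # Rademacher weights
        e_star = resid * w                             # wild bootstrap errors
        # Rebuild the sample under the unit-root null: y*_t = y*_{t-1} + e*_t.
        y_star = np.concatenate([[y[0]], y[0] + np.cumsum(e_star)])
        t_boot[b] = ar1_tstat(y_star)[2]
    return np.mean(t_boot <= t_obs)                    # left-tailed p-value
```

The i.i.d. bootstrap variant would instead draw `e_star` by resampling the residuals with replacement; the wild scheme is the one that remains valid under unconditional heteroskedasticity.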
By:  Dechert, Andreas 
Abstract:  Modeling fractional cointegration relationships has become a major topic in applied time series analysis as it steps back from the traditional rigid I(1)/I(0) methodology. Hence, the number of proposed tests and approaches has grown over the last decade. The aim of this paper is to study the nonparametric variance ratio approach suggested by Nielsen for the case of fractional cointegration in the presence of linear trends and trend breaks. Allowing for trend breaks is very important in order to avoid spurious fractional integration, so this possibility should be taken into account by practitioners. This paper proposes to calculate p-values by means of gamma distributions and gives response regression parameters for their asymptotic moments. In Monte Carlo simulations this work compares the power of the approach against a Johansen-type rank test, which is robust against trend breaks but not fractional (co)integration. As the approach also yields an estimator of the cointegration space, the paper compares it with OLS estimates in simulations. As an empirical example, the validity of the market expectation hypothesis is tested for monthly Treasury bill rates ranging from 1958 to 2011, which might have a trend break around September 1979 due to the change in American monetary policy. 
Keywords:  fractional integration; fractional cointegration; long memory; variance ratio; nonparametric; trend breaks; market expectation hypothesis 
JEL:  C32 E43 C14 
Date:  2012–09–04 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:41044&r=ecm 
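A short sketch of the p-value device used in the paper above: approximate the nonstandard null distribution of a nonnegative statistic by a gamma law whose first two moments m and v are matched (in the paper, those moments come from response-surface regressions; here they are simply function arguments).

```python
from scipy.stats import gamma

def gamma_pvalue(stat, m, v):
    """P-value for a nonnegative statistic whose null distribution is
    approximated by a gamma distribution with mean m and variance v."""
    shape, scale = m * m / v, v / m      # method-of-moments gamma fit
    return gamma.sf(stat, a=shape, scale=scale)
```

With m = 2 and v = 4 the fitted gamma is an exponential with scale 2, which gives a quick sanity check on the parameterization.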
By:  Marcelo C. Medeiros (Pontifical Catholic University of Rio de Janeiro); Eduardo F. Mendes (Pontifical Catholic University of Rio de Janeiro) 
Abstract:  We study the asymptotic properties of the Adaptive LASSO (adaLASSO) in sparse, high-dimensional, linear time-series models. We allow both the number of covariates in the model and the number of candidate variables to increase with the number of observations, and the number of candidate variables may be larger than the number of observations. We show that the adaLASSO consistently chooses the relevant variables as the number of observations increases (model selection consistency) and has the oracle property, even when the errors are non-Gaussian and conditionally heteroskedastic. A simulation study shows that the method performs well in very general settings. Finally, we consider two applications: in the first the goal is to forecast quarterly US inflation one step ahead, and in the second we are interested in the excess return of the S&P 500 index. The method outperforms the usual benchmarks in the literature. 
Keywords:  sparse models, shrinkage, LASSO, adaLASSO, time series, forecasting. 
JEL:  C22 
Date:  2012–09–04 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201237&r=ecm 
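A hedged sketch of the adaLASSO estimator studied above: a first-stage OLS fit supplies weights w_j = 1/|b_ols_j|, and the weighted lasso is solved by plain coordinate descent after absorbing the weights into the regressors. The fixed penalty and the OLS first stage are illustrative choices; in practice the paper's setting calls for data-driven tuning and a first stage that works when p is large.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=300):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_ss = (X ** 2).sum(0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]           # partial residual
            rho = X[:, j] @ r / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_ss[j]
    return b

def ada_lasso(X, y, lam=0.1):
    b_ols = np.linalg.lstsq(X, y, rcond=None)[0]      # first-stage estimate
    w = 1.0 / np.maximum(np.abs(b_ols), 1e-8)         # adaptive weights
    Xw = X / w                                        # absorb weights into X
    return lasso_cd(Xw, y, lam) / w                   # rescale back
```

Because the weights blow up for variables whose first-stage estimate is near zero, those coefficients face a heavy effective penalty and are set exactly to zero, which is the mechanism behind the oracle property.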
By:  Charley Xia and William Griffiths 
Abstract:  A Monte Carlo experiment is used to examine the size and power properties of alternative Bayesian tests for unit roots. Four different prior distributions for the root that is potentially unity – a uniform prior and priors attributable to Jeffreys, Lubrano, and Berger and Yang – are used in conjunction with two testing procedures: a credible interval test and a Bayes factor test. Two extensions are also considered: a test based on model averaging with different priors and a test with a hierarchical prior for a hyperparameter. The tests are applied to both trending and non-trending series. Our results favor the use of a prior suggested by Lubrano. Outcomes from applying the tests to some Australian macroeconomic time series are presented. 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:mlb:wpaper:1152&r=ecm 
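A simplified sketch of the credible-interval test compared in the paper above: in the AR(1) model y_t = rho*y_{t-1} + e_t with a flat prior on rho, the posterior of rho is approximately N(rho_ols, sigma^2 / sum(y_{t-1}^2)), and a unit root is rejected when 1 falls outside the equal-tailed credible interval. Treating the error variance as known at its estimate is a simplification of the paper's setup.

```python
import numpy as np
from scipy.stats import norm

def credible_interval_unit_root_test(y, level=0.95):
    """True if 1 lies outside the posterior credible interval for rho."""
    x, z = y[:-1], y[1:]
    rho_hat = (x @ z) / (x @ x)               # posterior mean under flat prior
    resid = z - rho_hat * x
    sigma2 = resid @ resid / (len(z) - 1)
    post_sd = np.sqrt(sigma2 / (x @ x))
    lo, hi = norm.ppf([(1 - level) / 2, (1 + level) / 2],
                      loc=rho_hat, scale=post_sd)
    return not (lo <= 1.0 <= hi)
```

The Bayes factor test in the paper instead compares marginal likelihoods under rho = 1 and rho < 1, and its behaviour depends strongly on the prior, which is the object of the Monte Carlo comparison.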
By:  Shujie Ma; Jeffrey S. Racine; Lijian Yang 
Abstract:  We consider the problem of estimating a relationship nonparametrically using regression splines when there exist both continuous and categorical predictors. We combine the global properties of regression splines with the local properties of categorical kernel functions to handle the presence of categorical predictors rather than resorting to sample splitting as is typically done to accommodate their presence. The resulting estimator possesses substantially better finite-sample performance than either its frequency-based peer or cross-validated local linear kernel regression or even additive regression splines (when additivity does not hold). Theoretical underpinnings are provided and Monte Carlo simulations are undertaken to assess finite-sample behavior, and two illustrative applications are provided. An implementation in R (R Core Team (2012)) is available; see the R package 'crs' for details (Racine & Nie (2012)). 
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:mcm:deptwp:201206&r=ecm 
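A sketch of the idea above of avoiding sample splitting: fit a regression spline in the continuous predictor x while weighting observations from other categories of z by a kernel L(z_i, z) = lam^{1(z_i != z)}, so lam = 0 reproduces the split-sample (frequency) estimator and lam = 1 pools fully. The quadratic truncated-power basis and fixed knots are illustrative choices, not the paper's B-spline machinery.

```python
import numpy as np

def spline_basis(x, knots):
    """Quadratic truncated-power spline basis."""
    cols = [np.ones_like(x), x, x ** 2]
    cols += [np.maximum(x - k, 0.0) ** 2 for k in knots]
    return np.column_stack(cols)

def categorical_spline_fit(x, z, y, x_eval, z_eval, lam, knots):
    """Weighted-LS spline fit for category z_eval with kernel weight lam
    on observations from other categories."""
    B = spline_basis(x, knots)
    w = np.where(z == z_eval, 1.0, lam)       # categorical kernel weights
    WB = B * w[:, None]
    beta = np.linalg.lstsq(WB.T @ B, WB.T @ y, rcond=None)[0]
    return spline_basis(np.atleast_1d(np.asarray(x_eval, float)), knots) @ beta
```

In the paper lam is chosen by cross-validation, so the data decide how much to borrow across categories.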
By:  Tomasz Wozniak 
Abstract:  Recent economic developments have shown the importance of spillover and contagion effects in financial markets. Such effects are not limited to relations between the levels of financial variables but also impact their volatility. I investigate Granger causality in the conditional mean and conditional variances of time series. For this purpose a VARMA-GARCH model is used. I derive parametric restrictions for the hypothesis of noncausality in conditional variances between two groups of variables, when there are other variables in the system as well. These novel conditions are convenient for the analysis of potentially large systems of economic variables. Such systems should be considered in order to avoid the problem of omitted variable bias. Further, I propose a Bayesian Lindley-type testing procedure in order to evaluate hypotheses of noncausality. It avoids the singularity problem that may appear in the Wald test. Also, it relaxes the assumption of the existence of higher-order moments of the residuals required for the derivation of asymptotic results of the classical tests. In the empirical example, I find that the dollar-to-Euro exchange rate does not second-order cause the pound-to-Euro exchange rate in a system of variables that also contains the Swiss franc-to-Euro exchange rate, which confirms the meteor shower hypothesis of Engle, Ito & Lin (1990). 
Keywords:  Granger causality, second-order noncausality, VARMA-GARCH models, Bayesian testing 
JEL:  C11 C12 C32 C53 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:eui:euiwps:eco2012/19&r=ecm 
By:  Kim, Young Shin; Giacometti, Rosella; Rachev, Svetlozar T.; Fabozzi, Frank J.; Mignacca, Domenico 
Abstract:  In this paper, we propose a multivariate market model with returns assumed to follow a multivariate normal tempered stable distribution. This distribution, defined by a mixture of the multivariate normal distribution and the tempered stable subordinator, is consistent with two stylized facts that have been observed for asset distributions: fat tails and an asymmetric dependence structure. Assuming infinitely divisible distributions, we derive closed-form solutions for two important measures used by portfolio managers in portfolio construction: the marginal VaR and the marginal AVaR. We illustrate the proposed model using stocks comprising the Dow Jones Industrial Average, first statistically validating the model based on goodness-of-fit tests and then demonstrating how the marginal VaR and marginal AVaR can be used for portfolio optimization using the model. Based on the empirical evidence presented in this paper, our framework offers more realistic portfolio risk measures and a more tractable method for portfolio optimization. 
Keywords:  portfolio risk, portfolio optimization, portfolio budgeting, marginal contribution, fat-tailed distribution, multivariate normal tempered stable distribution 
JEL:  C58 C61 G11 G32 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:zbw:kitwps:44&r=ecm 
By:  Marco Bee 
Abstract:  This paper deals with the estimation of the lognormal-Pareto and the lognormal-Generalized Pareto mixture distributions. The log-likelihood function is discontinuous, so that Maximum Likelihood Estimation is not asymptotically optimal. For this reason, we develop an alternative method based on Probability Weighted Moments. We show that the standard version of the method can be applied to the former distribution, but not to the latter. Thus, in the lognormal-Generalized Pareto case, we work out the details of a mixed approach combining Maximum Likelihood Estimation and Probability Weighted Moments. Extensive simulations give precise indications about the relative efficiencies of the methods in various setups. Finally, we apply the techniques to two real datasets in the actuarial and operational risk management fields. 
Keywords:  Probability Weighted Moments; Mixed Estimation Method; Lognormal-Pareto Distribution; Loss Models 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:trn:utwpde:1208&r=ecm 
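A sketch of the probability-weighted-moments building block used in the paper above, shown for the generalized Pareto component on its own (the closed forms are the Hosking and Wallis, 1987, estimators): the first two sample PWMs a0, a1 yield the GPD scale and shape directly, with no likelihood maximization.

```python
import numpy as np

def gpd_pwm(x):
    """PWM estimates (sigma, xi) of the generalized Pareto distribution
    F(x) = 1 - (1 + xi*x/sigma)^(-1/xi), from an i.i.d. sample x >= 0."""
    x = np.sort(x)
    n = len(x)
    a0 = x.mean()                                       # E[X]
    a1 = np.mean((n - np.arange(1, n + 1)) / (n - 1) * x)  # E[X(1-F(X))]
    denom = a0 - 2 * a1
    sigma = 2 * a0 * a1 / denom
    xi = 2 - a0 / denom            # shape in the modern sign convention
    return sigma, xi
```

For an exponential sample (xi = 0) one gets a1 = a0/2 exactly in expectation, so the shape estimate is centred at zero, which is a convenient sanity check.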
By:  Silvio Rendon (Department of Economics, Stony Brook University) 
Abstract:  I discuss the application of capture-recapture methods to estimating the total number of deaths in armed conflicts, and propose an alternative method based on a trivariate discrete choice model. Data come from the ‘Truth and Reconciliation Commission’ (TRC) of Peru, around 25000 deaths, classified by three sources of information, geographical strata, and perpetrator: the State and the Shining Path. In these data many killings have been documented by only one source, which makes a projection of killings infeasible. TRC consultants Ball et al. (2003) tried to overcome this problem by means of a ‘residual estimation,’ consisting of merging data for different perpetrators. I show theoretically and empirically that this method overestimates the number of deaths. Using a conditional trivariate Probit, I estimate the total number of deaths at around 28000: 60% by the State, 40% by the Shining Path. This number is substantially lower and has a different composition than the around 69000 deaths (30% by the State, 46% by the Shining Path, and 24% by ‘other perpetrators’) calculated by Ball et al. 
Keywords:  Armed Conflict, Capture-Recapture, Count Data, Discrete Choice, Human Rights, Maximum Likelihood Estimation, Poisson Regression. 
JEL:  D74 C35 C4 O54 P16 
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:nys:sunysb:1203&r=ecm 
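For readers unfamiliar with the baseline that the trivariate model above generalizes, here is the classical two-source case: with n1 deaths documented by source 1, n2 by source 2, and m by both, the Lincoln-Petersen estimator projects the total population, and Chapman's version corrects its small-sample bias. The numbers in the test are made up for illustration.

```python
def lincoln_petersen(n1, n2, m):
    """Classical two-source capture-recapture estimate of the total count."""
    return n1 * n2 / m

def chapman(n1, n2, m):
    """Chapman's nearly unbiased small-sample correction."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1
```

The paper's point is that when many cases appear in only one source, the overlap m is too thin for this projection to be reliable, motivating a model-based (trivariate Probit) alternative.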
By:  Andor, Mark; Hesse, Frederik 
Abstract:  Based on the seminal paper of Farrell (1957), researchers have developed several methods for measuring efficiency. Nowadays, the most prominent representatives are nonparametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA), both introduced in the late 1970s. For decades, researchers have been attempting to develop a method which combines the virtues, both nonparametric and stochastic, of these oldies. The recently introduced stochastic non-smooth envelopment of data (StoNED) of Kuosmanen and Kortelainen (2010) is a promising method. This paper compares the StoNED method with the two oldies DEA and SFA and extends the initial Monte Carlo simulation of Kuosmanen and Kortelainen (2010) in two directions. Firstly, we consider a wider range of conditions. Secondly, we also consider the maximum likelihood (ML) estimator and the pseudo-likelihood (PL) estimator for SFA and StoNED, respectively. We show that, in scenarios without noise, the rivalry is still between the oldies, while in noisy scenarios, the nonparametric StoNED PL now constitutes a promising alternative to the SFA ML. 
Keywords:  efficiency, stochastic non-smooth envelopment of data (StoNED), data envelopment analysis (DEA), stochastic frontier analysis (SFA), Monte Carlo simulation 
JEL:  C1 C5 D2 L5 Q4 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:zbw:cawmdp:60&r=ecm 
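A minimal sketch of one of the benchmarks in the comparison above: the input-oriented, constant-returns DEA score. For each unit, a linear program finds the largest radial input contraction theta such that a convex combination of peers still produces at least the unit's outputs with at most theta times its inputs.

```python
import numpy as np
from scipy.optimize import linprog

def dea_scores(X, Y):
    """Input-oriented CRS DEA efficiency scores.
    X: inputs (n_units x n_inputs), Y: outputs (n_units x n_outputs)."""
    n = X.shape[0]
    scores = []
    for o in range(n):
        # Variables: [theta, lambda_1 .. lambda_n]; minimize theta.
        c = np.concatenate([[1.0], np.zeros(n)])
        A_ub = np.vstack([
            # sum_j lam_j * x_j <= theta * x_o  (per input)
            np.column_stack([-X[o], X.T]),
            # sum_j lam_j * y_j >= y_o          (per output)
            np.column_stack([np.zeros(Y.shape[1]), -Y.T]),
        ])
        b_ub = np.concatenate([np.zeros(X.shape[1]), -Y[o]])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] + [(0, None)] * n)
        scores.append(res.fun)
    return np.array(scores)
```

Units on the frontier score 1; dominated units score below 1. DEA's deterministic envelope is exactly what StoNED relaxes by adding a stochastic noise term.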
By:  Shujie Ma; Jeffrey S. Racine 
Abstract:  We consider the problem of estimating a relationship using semiparametric additive regression splines when there exist both continuous and categorical regressors, some of which are irrelevant but this is not known a priori. We show that choosing the spline degree, number of subintervals, and bandwidths via cross-validation can automatically remove irrelevant regressors, thereby delivering 'automatic dimension reduction' without the need for pre-testing. Theoretical underpinnings are provided, finite-sample performance is studied, and an illustrative application demonstrates the efficacy of the proposed approach in finite-sample settings. An R package implementing the methods is available from the Comprehensive R Archive Network (Racine and Nie (2011)). 
Keywords:  Bspline, discrete, kernel 
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:mcm:deptwp:201207&r=ecm 
By:  Manuel S. Santos (Department of Economics, University of Miami); Adrian Peralta-Alva (Research Department, Federal Reserve Bank of Saint Louis) 
Abstract:  This paper provides a general framework for the quantitative analysis of stochastic dynamic models. We review convergence properties of some numerical algorithms and available methods to bound approximation errors. We then address convergence and accuracy properties of the simulated moments. Our purpose is to provide an asymptotic theory for the computation, simulation-based estimation, and testing of dynamic economies. The theoretical analysis is complemented with several illustrative examples. We study both optimal and non-optimal economies. Optimal economies generate smooth laws of motion defining Markov equilibria, and can be approximated by recursive methods with contractive properties. Non-optimal economies, however, lack existence of continuous Markov equilibria, and need to be computed by other algorithms with weaker approximation properties. 
Keywords:  Stochastic Dynamic Model, Markov Equilibrium, Numerical Solution, Approximation Error, Accuracy, Simulation-Based Estimation, Consistency 
JEL:  C63 C60 
Date:  2012–08–19 
URL:  http://d.repec.org/n?u=RePEc:mia:wpaper:20126&r=ecm 
By:  Huber, Martin 
Abstract:  This paper presents statistical evidence about the validity of the sibling sex ratio instrument proposed by Angrist and Evans (1998), a prominent natural “natural experiment” in the sense of Rosenzweig and Wolpin (2000). The sex ratio of the first two siblings is arguably randomly assigned and influences the probability of having a third child, which makes it a candidate instrument for fertility when estimating the effect of fertility on female labor supply. However, identification hinges on the satisfaction of the instrumental exclusion restriction and the monotonicity of fertility in the instrument; see Imbens and Angrist (1994). Using the methods of Kitagawa (2008), Huber and Mellace (2011a), and Huber and Mellace (2012), we verify, for the first time, the validity of the sibling sex ratio instrument by statistical hypothesis tests, which suggest that violations are small if not close to nonexistent. We also provide novel sensitivity checks to assess deviations from the exclusion restriction and/or monotonicity in the nonparametric local average treatment effect framework, and find the negative labor supply effect of fertility to be robust to a plausible range of violations. 
Keywords:  instrumental variable, treatment effects, LATE, tests, sensitivity analysis. 
JEL:  C12 C21 C26 J13 J22 
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:usg:econwp:2012:19&r=ecm 
By:  Gabriel Garber; Eduardo A. Haddad 
Abstract:  This paper proposes a methodology to integrate econometric models with Johansen-type computable general equilibrium (CGE) models in instances when it is necessary to generate results consistent with a subset of variables that are endogenous to both models. Results for a subset of the CGE endogenous variables are generated from econometric models, and set as targets to be replicated by the CGE model. The methodology is further extended for robustness testing of the outcomes in cases in which the targeted scenarios are random. The proposed methodology is illustrated by simulating the impacts of a monetary shock in Brazil. 
Keywords:  Model integration, target fitting, sensitivity analysis, CGE models, monetary 
JEL:  C63 C68 R13 R15 
Date:  2012–08–12 
URL:  http://d.repec.org/n?u=RePEc:spa:wpaper:2012wpecon14&r=ecm 
By:  Mihaela Craioveanu (University of Central Missouri); Eric Hillebrand (Aarhus University) 
Abstract:  The lag structure (1,5,21) is most commonly used for the HAR-RV model for realized volatility (Corsi 2009), where the terms are thought to represent a daily, a weekly, and a monthly time scale. The aggregation of the three scales approximates long memory. We explore flexible lag selection for the model on realized volatility constructed from tick-level data of the thirty constituent stocks of the Dow Jones Industrial Average between 1995 and 2007. The computational costs for flexible lag selection are substantial, and we use a parallel computing environment. We find that flexible lags do not improve in-sample or out-of-sample fit. Our results therefore confirm the standard practice in a large-scale data application. 
Keywords:  Time Series, Financial Econometrics, HAR-RV Model 
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:umn:wpaper:1201&r=ecm 
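A minimal sketch of the HAR-RV regression with the standard (1, 5, 21) lag structure discussed above: realized volatility is regressed by OLS on its previous daily value and its 5- and 21-day backward means.

```python
import numpy as np

def har_design(rv, lags=(1, 5, 21)):
    """Design matrix of backward means of rv at the given horizons."""
    max_lag = max(lags)
    cols = [np.ones(len(rv) - max_lag)]
    for L in lags:
        cols.append(np.array([rv[t - L:t].mean()
                              for t in range(max_lag, len(rv))]))
    return np.column_stack(cols), rv[max_lag:]

def fit_har(rv):
    """OLS coefficients [const, b_day, b_week, b_month]."""
    X, y = har_design(rv)
    return np.linalg.lstsq(X, y, rcond=None)[0]
```

Flexible lag selection as studied in the paper amounts to searching over alternatives to (1, 5, 21), refitting this regression for each candidate set, which is what makes the exercise computationally demanding at scale.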
By:  Duangkamon Chotikapanich, William Griffiths, Wasana Karunarathne, D.S. Prasada Rao 
Abstract:  Data for measuring poverty and income inequality are frequently available in a summary form that describes the proportion of income or expenditure for each of a number of population proportions. While various discrete measures can be applied directly to data in this limited form, these discrete measures typically ignore inequality within each group. This problem can be overcome by fitting a parametric income distribution to the grouped data and computing required quantities from the estimated parameters of this distribution. In this paper we show how to calculate several poverty measures from parameters of the generalized beta distribution of the second kind, and its popular special cases. An analysis of poverty changes in ten countries from South and Southeast Asia is used to illustrate the methodology. 
JEL:  I32 O15 C13 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:mlb:wpaper:1154&r=ecm 
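A sketch of the paper's approach for the lognormal, one of the special cases of the GB2 family (the GB2 itself requires its own distribution function and is omitted here): once (mu, sigma) have been fitted to the grouped data, the headcount ratio and poverty gap at poverty line z follow in closed form.

```python
import numpy as np
from scipy.stats import norm

def lognormal_poverty(mu, sigma, z):
    """Headcount ratio and FGT poverty gap at line z under LN(mu, sigma)."""
    u = (np.log(z) - mu) / sigma
    headcount = norm.cdf(u)                    # F(z)
    # Partial mean E[X * 1{X < z}] for the lognormal distribution.
    partial_mean = np.exp(mu + sigma ** 2 / 2) * norm.cdf(u - sigma)
    poverty_gap = headcount - partial_mean / z
    return headcount, poverty_gap
```

This is precisely the gain over discrete measures noted in the abstract: the fitted distribution supplies the within-group inequality that grouped shares alone cannot.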
By:  Zaman, Asad 
Abstract:  Econometric methodology is based on logical positivist principles. Since logical positivism has collapsed, it is necessary to rethink these foundations. We show that positivist methodology has led econometricians to a meaningless search for patterns in the data. We propose an alternative methodology that relates observed patterns to real causal structures. 
Keywords:  Econometric Methodology; logical positivism; realism; causality; VAR models; Forecasting; surprise; goodness of fit 
JEL:  B16 C19 
Date:  2012–08–30 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:41032&r=ecm 
By:  Cecília Hornok 
Abstract:  Trade economists often estimate gravity equations of international trade with fixed effects. Anderson and van Wincoop (2003, American Economic Review 93, 170–192) have shown the importance of controlling for multilateral trade resistances when estimating a gravity equation. This can be done by including exporter-time and importer-time fixed effects in a panel, or exporter and importer fixed effects in a cross-section estimation. I argue that this approach limits the identifiability of policy parameters that capture the effect of certain “club memberships” (EU, NAFTA, euro area, WTO, etc.) on trade flows. I show that, in the baseline case, only one effect can be identified, which precludes, for example, the estimation of separate effects on the exporter and the importer side. The magnitude, and even the sign, of the estimated club effect are very sensitive to the precise identification assumptions, which are often left unspecified in empirical studies. The underlying problem is that club membership provides some, but very little, bilateral variation. When heterogeneous club effects are to be identified, the membership dummies can become perfectly collinear with the fixed effects. Empirical researchers may not be aware of the lack of identification, because standard estimation techniques often permit them to run perfectly collinear regressions. I illustrate the findings by estimating the effect of the 2004 EU enlargement on the trade flows of new and old members. Finally, I discuss potential solutions. 
Date:  2012–05–20 
URL:  http://d.repec.org/n?u=RePEc:ceu:econwp:2012_11&r=ecm 
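A small numerical illustration of the identification problem described above, with made-up panel dimensions: an exporter-side club dummy (exporter is a member at time t) is a deterministic function of the exporter-time cell, so adding it to a design that already contains exporter-time fixed effects cannot raise the column rank.

```python
import numpy as np

exporters, importers, periods = 3, 3, 2
member = {(0, 1), (1, 1)}    # exporters 0 and 1 are club members in period 1

rows, club = [], []
for t in range(periods):
    for e in range(exporters):
        for i in range(importers):
            if e == i:
                continue                      # no self-trade pairs
            d_et = np.zeros(exporters * periods)
            d_et[e * periods + t] = 1.0       # exporter-time dummy
            rows.append(d_et)
            club.append(1.0 if (e, t) in member else 0.0)

D = np.array(rows)                            # exporter-time fixed effects
c = np.array(club)[:, None]                   # exporter-side club dummy
rank_fe = np.linalg.matrix_rank(D)
rank_aug = np.linalg.matrix_rank(np.hstack([D, c]))
# rank_aug == rank_fe: the club dummy lies in the span of the fixed effects,
# so an exporter-side club effect is not separately identified.
```

This is the mechanical reason standard software can still "run" such regressions: the solver silently picks one of the infinitely many coefficient vectors consistent with the collinear design.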
By:  Xisong Jin; Francisco Nadal De Simone 
Abstract:  The estimation of banks' marginal probabilities of default using structural credit risk models can be enriched by incorporating macro-financial variables readily available to economic agents. By combining Delianedis and Geske's model with a Generalized Dynamic Factor Model into a dynamic t-copula as a mechanism for obtaining banks' dependence, this paper develops a framework that generates an early warning indicator and robust out-of-sample forecasts of banks' probabilities of default. The database comprises both a set of Luxembourg banks and the European banking groups to which they belong. The main results of this study are, first, that the common component of the forward probability of banks' defaulting on their long-term debt, conditional on not defaulting on their short-term debt, contains a significant early warning feature of interest for an operational macroprudential framework driven by economic activity, credit and interbank activity. Second, incorporating the common and the idiosyncratic components of macro-financial variables improves the analytical features and the out-of-sample forecasting performance of the framework proposed. 
Keywords:  financial stability, macroprudential policy, credit risk, early warning indicators, default probability, Generalized Dynamic Factor Model, dynamic copulas, GARCH 
JEL:  C30 E44 G1 
Date:  2012–07 
URL:  http://d.repec.org/n?u=RePEc:bcl:bclwop:bclwp075&r=ecm 
By:  Cem Çakmakli (Department of Quantitative Economics, University of Amsterdam, The Netherlands) 
Abstract:  This paper proposes the Bayesian semiparametric dynamic Nelson-Siegel model, where the density of the yield curve factors, and thereby the density of the yields, is estimated along with the other model parameters. This is accomplished by modeling the error distributions of the factors according to a Dirichlet process mixture. An efficient and computationally tractable algorithm is implemented to obtain Bayesian inference. The semiparametric structure of the factors enables us to capture various forms of non-normality, including fat tails, skewness and nonlinear dependence between factors, using a unified approach. The potential of the proposed framework is examined using US bond yield data. The results show that the model can identify two different periods with distinct characteristics. While the relatively stable years of the late 1980s and 1990s comprise the first period, the second period captures the years of severe recessions, including the recessions of the 1970s and 1980s and the recent recession of 2007–9, together with the highly volatile period of the Federal Reserve's monetary policy experiments in the first half of the 1980s. Interestingly, the results point to a nonlinear dependence structure between the factors, in contrast to existing evidence. 
Keywords:  Dynamic factor model, Yield curve, Nelson-Siegel model, Dirichlet process mixture, Bayesian inference 
JEL:  C14 C33 C38 G12 
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:rim:rimwps:59_12&r=ecm 
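A sketch of the Nelson-Siegel factor structure underlying the model above: for a fixed decay parameter lam, yields are linear in the three factors (level, slope, curvature), so factor estimates can be extracted by cross-sectional least squares at each date. The value lam = 0.0609 (maturities in months) is the conventional Diebold-Li choice, used here purely for illustration.

```python
import numpy as np

def ns_loadings(tau, lam=0.0609):
    """Nelson-Siegel loadings [level, slope, curvature] at maturities tau."""
    g = (1 - np.exp(-lam * tau)) / (lam * tau)
    return np.column_stack([np.ones_like(tau), g, g - np.exp(-lam * tau)])

def extract_factors(yields, tau, lam=0.0609):
    """yields: T x len(tau) panel; returns T x 3 least-squares factor estimates."""
    B = ns_loadings(np.asarray(tau, float), lam)
    return np.linalg.lstsq(B, np.asarray(yields, float).T, rcond=None)[0].T
```

The paper's contribution sits on top of this skeleton: instead of Gaussian factor innovations, the error density of the extracted factor dynamics is modeled as a Dirichlet process mixture.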