nep-ecm New Economics Papers
on Econometrics
Issue of 2008‒10‒07
28 papers chosen by
Sune Karlsson
Orebro University

  1. Inference for Parameters Defined by Moment Inequalities: A Recommended Moment Selection Procedure By Donald W.K. Andrews; Panle Jia
  2. Efficient Estimation of Semiparametric Conditional Moment Models with Possibly Nonsmooth Residuals By Xiaohong Chen; Demian Pouzo
  3. Fitting vast dimensional time-varying covariance models By Robert Engle; Neil Shephard; Kevin Sheppard
  4. Direct and iterated multistep AR methods for difference stationary processes By Proietti, Tommaso
  6. The Maximum Lq-Likelihood Method: an Application to Extreme Quantile Estimation in Finance By Davide Ferrari; Sandra Paterlini
  7. GARCH-based identification and estimation of triangular systems By Todd Prono
  8. Discrete-Time Stochastic Volatility Models and MCMC-Based Statistical Inference By Nikolaus Hautsch; Yangguoyi Ou
  9. Parametric density estimation by minimizing nonextensive entropy By Davide Ferrari
  10. Structural vector autoregressions: theory of identification and algorithms for inference By Juan F. Rubio-Ramírez; Daniel F. Waggoner; Tao Zha
  11. New Eurocoin: Tracking Economic Growth in Real Time By Mario Forni; Filippo Altissimo; Riccardo Cristadoro; Marco Lippi; Giovanni Veronese
  12. Evaluating Value-at-Risk models via Quantile regressions By Piazza Gaglianone, Wagner; Linton, Oliver; Renato Lima, Luiz
  13. Global Identification In Nonlinear Semiparametric Models By Ivana Komunjer
  15. Global Identification of the Semiparametric Box-Cox Model By Ivana Komunjer
  16. Efficient Estimation of Missing Data Models Using Moment Conditions and Semiparametric Restrictions By Bryan S. Graham
  17. Estimating Matching Games with Transfers By Jeremy T. Fox
  18. Forecasting Financial Crises and Contagion in Asia using Dynamic Factor Analysis By Andrea Cipollini; George Kapetanios
  19. The Impact of Piped Water Provision on Infant Mortality in Brazil: A Quantile Panel Data Approach By Shanti Gamper-Rabindran; Shakeeb Khan; Christopher Timmins
  20. DSGE model-based forecasting of non-modelled variables By Frank Schorfheide; Keith Sill; Maxym Kryshko
  21. Symmetry and Time Changed Brownian Motions By José Fajardo; Ernesto Mordecki
  22. Combining Canadian Interest-Rate Forecasts By David Jamieson Bolder; Yuliya Romanyuk
  23. Are They Really Rational? Assessing Professional Macro-Economic Forecasts from the G7-Countries By Jonas Dovern; Johannes Weisser
  24. Real-time measurement of business conditions By S. Boragan Aruoba; Francis X. Diebold; Chiara Scotti
  25. Taste Heterogeneity, IIA, and the Similarity Critique By Thomas J. Steenburgh; Andrew Ainslie
  26. Comparing Input- and Output-Oriented Measures of Technical Efficiency to Determine Local Returns to Scale in DEA Models By Subhash C. Ray
  27. Nonlinear Modeling of Target Leverage with Latent Determinant Variables – New Evidence on the Trade-off Theory By Ralf Sabiwalsky
  28. Constructive data mining: modeling Argentine broad money demand By Neil R. Ericsson; Steven B. Kamin

  1. By: Donald W.K. Andrews (Cowles Foundation, Yale University); Panle Jia (MIT)
    Abstract: This paper is concerned with tests and confidence intervals for partially-identified parameters that are defined by moment inequalities and equalities. In the literature, different test statistics, critical value methods, and implementation methods (i.e., asymptotic distribution versus the bootstrap) have been proposed. In this paper, we compare a wide variety of these methods. We provide a recommended test statistic, moment selection critical value method, and implementation method. In addition, we provide a data-dependent procedure for choosing the key moment selection tuning parameter and a data-dependent size-correction factor.
    Keywords: Asymptotic size, Asymptotic power, Confidence set, Exact size, Generalized moment selection, Moment inequalities, Partial identification, Refined moment selection, Test
    JEL: C12 C15
    Date: 2008–09
  2. By: Xiaohong Chen (Cowles Foundation, Yale University); Demian Pouzo (Dept. of Economics, New York University)
    Abstract: This paper greatly extends the results of Ai and Chen (2003) on efficient estimation of semiparametric conditional moment models containing unknown parametric components (theta) and unknown functions of endogenous variables (h). We show that (1) the penalized sieve minimum distance (PSMD) estimator (hat{theta},hat{h}) can simultaneously achieve root-n asymptotic normality of hat{theta} and nonparametric optimal convergence rate of hat{h}, allowing for models with possibly nonsmooth residuals and noncompact infinite dimensional parameter spaces; (2) a simple weighted bootstrap procedure consistently estimates the limiting distribution of the PSMD hat{theta}; (3) the semiparametric efficiency bound formula of Ai and Chen (2003) remains valid for conditional models with nonsmooth residuals, and the optimally weighted PSMD estimator achieves the bound; (4) the profiled optimally weighted PSMD criterion is asymptotically chi-square distributed. We illustrate our general theories using a partially linear quantile instrumental variables regression, a Monte Carlo study, and an empirical estimation of the shape-invariant quantile Engel curves with endogenous total expenditure.
    Keywords: Penalized sieve minimum distance, Nonsmooth generalized residuals, Nonlinear nonparametric endogeneity, Weighted bootstrap, Semiparametric efficiency, Confidence region, Partially linear quantile IV regression, Shape-invariant quantile Engel curves
    JEL: C14 C22
    Date: 2008–02
  3. By: Robert Engle; Neil Shephard; Kevin Sheppard
    Abstract: Building models for high dimensional portfolios is important in risk management and asset allocation. Here we propose a novel and fast way of estimating models of time-varying covariances that overcomes an undiagnosed incidental parameter problem which has troubled existing methods when applied to hundreds or even thousands of assets. Indeed, we can handle the case where the cross-sectional dimension is larger than the time series one. The theory of this new strategy is developed in some detail, allowing formal hypothesis testing to be carried out on these models. Simulations are used to explore the performance of this inference strategy, while empirical examples show the strength of this method. The out-of-sample hedging performance of various models estimated using this method is compared.
    Keywords: ARCH models; composite likelihood; dynamic conditional correlations; incidental parameters; quasi-likelihood; time-varying covariances.
    JEL: C01 C14 C32
    Date: 2008
  4. By: Proietti, Tommaso
    Abstract: The paper focuses on the comparison of the direct and iterated AR predictors when Xt is a difference stationary process. In particular, it provides some useful results for comparing the efficiency of the two predictors and for extracting the trend from macroeconomic time series using the two methods. The main results are based on an encompassing representation for the two predictors which makes it possible to derive their properties quite easily under a maintained model. The paper provides an analytic expression for the mean square forecast error of the two predictors and derives useful recursive formulae for computing the direct and iterated coefficients. From the empirical standpoint, we propose estimators of the AR coefficients based on the tapered Yule-Walker estimates; we also provide a test of equal forecast accuracy which is very simple to implement and whose critical values can be obtained with the bootstrap method. Since multistep prediction is tightly bound up with the estimation of the long run component in a time series, we turn to the role of the direct method for trend estimation and derive the corresponding multistep Beveridge-Nelson decomposition.
    Keywords: Beveridge-Nelson decomposition; Multistep estimation; Tapered Yule-Walker estimates; Forecast combination.
    JEL: C51 E32 C53 E31 C22
    Date: 2008–10–01
  5. By: Ted Juhl (Department of Economics, The University of Kansas); Zhijie Xiao (Department of Economics, Boston College)
    Abstract: Several widely used tests for a changing mean exhibit nonmonotonic power in finite samples due to "incorrect" estimation of nuisance parameters under the alternative. In this paper, we study the issue of nonmonotonic power in testing for a changing mean. We investigate the asymptotic power properties of the tests using a new framework where alternatives are characterized as having "large" changes. The asymptotic analysis provides a theoretical explanation of the power problem. Modified tests that have monotonic power against a wide range of structural-change alternatives are proposed. Instead of estimating the nuisance parameters from ordinary least squares residuals, the proposed tests use modified estimators based on nonparametric regression residuals. It is shown that tests based on the modified long-run variance estimator achieve an improved rate of divergence under the alternative of a change in mean. Tests for structural breaks based on such an estimator remain consistent while still retaining the same asymptotic distribution under the null hypothesis of a constant mean.
    Date: 2008–09
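    The modified long-run variance estimator described in the abstract can be illustrated with a toy sketch (ours, not the authors' code): a Bartlett-kernel long-run variance computed from rolling-window mean residuals, a crude stand-in for the paper's nonparametric regression residuals. Under a large mean shift, residuals from the full-sample mean inflate the long-run variance (the source of nonmonotonic power), while locally demeaned residuals do not.

```python
def bartlett_lrv(resid, bandwidth):
    # Newey-West long-run variance with Bartlett kernel weights
    n = len(resid)
    gamma0 = sum(e * e for e in resid) / n
    lrv = gamma0
    for lag in range(1, bandwidth + 1):
        w = 1.0 - lag / (bandwidth + 1.0)
        cov = sum(resid[t] * resid[t - lag] for t in range(lag, n)) / n
        lrv += 2.0 * w * cov
    return lrv

def local_mean_residuals(y, window):
    # residuals from a rolling-window (nonparametric) estimate of the mean,
    # a stand-in for the kernel-regression residuals the paper uses
    n = len(y)
    resid = []
    for t in range(n):
        lo, hi = max(0, t - window), min(n, t + window + 1)
        resid.append(y[t] - sum(y[lo:hi]) / (hi - lo))
    return resid
```

    For a series with a large level shift, `bartlett_lrv` on full-sample-mean residuals is many times larger than on `local_mean_residuals`, which is the intuition behind the improved divergence rate.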
  6. By: Davide Ferrari; Sandra Paterlini
    Abstract: Estimating financial risk is a critical issue for banks and insurance companies. Recently, quantile estimation based on Extreme Value Theory (EVT) has found a successful domain of application in such a context, outperforming other approaches. Given a parametric model provided by EVT, a natural approach is Maximum Likelihood estimation. Although the resulting estimator is asymptotically efficient, the number of observations available to estimate the parameters of the EVT models is often too small to make the large sample property trustworthy. In this paper, we study a new estimator of the parameters, the Maximum Lq-Likelihood estimator (MLqE), introduced by Ferrari and Yang (2007). We show that the MLqE can outperform the standard MLE when estimating tail probabilities and quantiles of the Generalized Extreme Value (GEV) and the Generalized Pareto (GP) distributions. First, we assess the relative efficiency of the MLqE and the MLE for various sample sizes, using Monte Carlo simulations. Second, we analyze the performance of the MLqE for extreme quantile estimation using real-world financial data. The MLqE is characterized by a distortion parameter q and extends the traditional log-likelihood maximization procedure. When q→1, the new estimator approaches the traditional Maximum Likelihood Estimator (MLE), recovering its desirable asymptotic properties; when q ≠ 1 and the sample size is moderate or small, the MLqE successfully trades bias for variance, resulting in an overall gain in terms of accuracy (Mean Squared Error).
    Keywords: Maximum Likelihood, Extreme Value Theory, q-Entropy, Tail-related Risk Measures
    Date: 2007–06
  7. By: Todd Prono
    Abstract: Diagonal GARCH is shown to support identification of the triangular system and is argued to be a higher-moment analog of the traditional exclusion restrictions used for determining suitable instruments. The estimator for this result is ML when a distribution for the GARCH process is known, and GMM otherwise. For the GMM estimator, an alternative weighting matrix is proposed.
    Keywords: Time-series analysis
    Date: 2008
  8. By: Nikolaus Hautsch; Yangguoyi Ou
    Abstract: In this paper, we review the most common specifications of discrete-time stochastic volatility (SV) models and illustrate the major principles of corresponding Markov Chain Monte Carlo (MCMC) based statistical inference. We provide a hands-on approach which is easily implemented in empirical applications and financial practice and can be straightforwardly extended in various directions. We illustrate empirical results based on different SV specifications using returns on stock indices and foreign exchange rates.
    Keywords: Stochastic Volatility, Markov Chain Monte Carlo, Metropolis-Hastings Algorithm, Jump Processes
    JEL: C15 C22 G12
    Date: 2008–09
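    As a point of reference for the model class surveyed, the canonical discrete-time SV specification can be simulated in a few lines. This sketch (ours) only shows the data-generating process that MCMC samplers target; the Gibbs/Metropolis updates for the latent log-volatilities are not reproduced here, and the parameter values are illustrative.

```python
import math
import random

def simulate_sv(n, mu=-1.0, phi=0.95, sigma=0.2, seed=42):
    # canonical discrete-time SV model:
    #   h_t = mu + phi*(h_{t-1} - mu) + sigma*eta_t   (latent log-volatility)
    #   y_t = exp(h_t / 2) * eps_t                    (observed return)
    # assumes |phi| < 1 so that h_t is stationary
    rng = random.Random(seed)
    h = mu  # start log-volatility at its stationary mean
    ys, hs = [], []
    for _ in range(n):
        h = mu + phi * (h - mu) + sigma * rng.gauss(0, 1)
        hs.append(h)
        ys.append(math.exp(h / 2.0) * rng.gauss(0, 1))
    return ys, hs
```

    Returns y_t are conditionally Gaussian given the latent h_t; it is the unobserved h path (and the parameters) that the MCMC schemes in the paper sample.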
  9. By: Davide Ferrari
    Abstract: In this paper, we consider parametric density estimation based on minimizing the Havrda-Charvat-Tsallis nonextensive entropy. The resulting estimator, called the Maximum Lq-Likelihood estimator (MLqE), is indexed by a single distortion parameter q, which controls the trade-off between bias and variance. The method has two notable special cases. If q tends to 1, the MLqE is the Maximum Likelihood Estimator (MLE). When q = 1/2, the MLqE is a minimum Hellinger distance type of estimator with the perk of avoiding nonparametric techniques and the difficulties of bandwidth selection. The MLqE is studied using asymptotic analysis, simulations and real-world data, showing that it reconciles two apparently contrasting needs: efficiency and robustness, conditional on a proper choice of q. When the sample size is small or moderate, the MLqE trades bias for variance, resulting in a reduced mean squared error compared to the MLE. At the same time, the MLqE exhibits strong robustness, at the expense of a slightly reduced efficiency, in the presence of observations discordant with the assumed model. To compute the MLq estimates, a fast and easy-to-implement algorithm based on a reweighting strategy is also supplied.
    Keywords: Maximum Likelihood, q-Entropy, Robust Estimation
    Date: 2008–05
  10. By: Juan F. Rubio-Ramírez; Daniel F. Waggoner; Tao Zha
    Abstract: Structural vector autoregressions (SVARs) are widely used for policy analysis and to provide stylized facts for dynamic general equilibrium models. Yet there have been no workable rank conditions to ascertain whether an SVAR is globally identified. When identifying restrictions such as long-run restrictions are imposed on impulse responses, there have been no efficient algorithms for small-sample estimation and inference. To fill these important gaps in the literature, this paper makes four contributions. First, we establish general rank conditions for global identification of both overidentified and exactly identified models. Second, we show that these conditions can be checked as a simple matrix-filling exercise and that they apply to a wide class of identifying restrictions, including linear and certain nonlinear restrictions. Third, we establish a very simple rank condition for exactly identified models that amounts to a straightforward counting exercise. Fourth, we develop a number of efficient algorithms for small-sample estimation and inference.
    Date: 2008
  11. By: Mario Forni; Filippo Altissimo; Riccardo Cristadoro; Marco Lippi; Giovanni Veronese
    Abstract: A band-pass filter can remove the short-run dynamics of a stationary time series and isolate its medium to long-run component. However, band-pass filters are infinite moving averages and can therefore deteriorate at the end of the sample. This is a well-known result in the literature on isolating the business cycle in integrated series. We show that the same problem arises in our application to stationary time series. In this paper we develop a method to smooth a stationary time series using only contemporaneous values of a large dataset, so that no end-of-sample deterioration occurs. Our construction is based on a special version of Generalized Principal Components, which is designed to use leading variables in the dataset as proxies for missing future values of the variable of interest. Our method is applied to the construction of New Eurocoin, an indicator of economic activity for the euro area. New Eurocoin is a real-time estimate of the medium to long-run component of euro area GDP growth, which performs equally well within and at the end of the sample. As our dataset is monthly and most of the series are updated with a short delay, we are able to produce a monthly, real-time indicator. An assessment of its performance as an approximation of medium to long-run GDP growth, both in terms of fitting and turning-point signaling, is provided.
    Keywords: Coincident Indicator, Band-pass Filter, Large-dataset Factor Models, Generalized Principal Components
    JEL: C51 E32 O30
    Date: 2008–05
  12. By: Piazza Gaglianone, Wagner; Linton, Oliver; Renato Lima, Luiz
    Abstract: This paper is concerned with evaluating value-at-risk estimates. It is well known that using only binary variables to do this sacrifices too much information. However, most of the specification tests (also called backtests) available in the literature, such as Christoffersen (1998) and Engle and Manganelli (2004), are based on such variables. In this paper we propose a new backtest that does not rely solely on binary variables. It is shown that the new backtest provides a sufficient condition for assessing the performance of a quantile model, whereas the existing ones do not. The proposed methodology allows us to identify periods of increased risk exposure based on a quantile regression model (Koenker & Xiao, 2002). Our theoretical findings are corroborated through a Monte Carlo simulation and an empirical exercise with the daily S&P500 time series.
    Date: 2008–09
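    For contrast with the quantile-regression backtest proposed in the paper, the binary-variable approach it criticizes is easy to state: form the hit sequence of VaR violations and test whether its mean equals the nominal coverage. A minimal sketch (ours; the paper's own test, which works with the full quantile regression rather than the binary hits, is not reproduced here):

```python
import math

def var_hits(returns, var_forecasts):
    # hit sequence: 1 when the return falls below the (negative) VaR forecast
    return [1 if r < v else 0 for r, v in zip(returns, var_forecasts)]

def coverage_zstat(hits, alpha):
    # z-statistic for H0: violation probability equals alpha
    # (the binary unconditional-coverage test the paper argues is insufficient,
    # since it discards everything about the magnitude of exceedances)
    n = len(hits)
    p_hat = sum(hits) / n
    se = math.sqrt(alpha * (1.0 - alpha) / n)
    return (p_hat - alpha) / se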
  13. By: Ivana Komunjer (University of California - San Diego)
    Abstract: This paper derives primitive conditions for global identification in nonlinear simultaneous equations systems. Identification is semiparametric in the sense that it is based on a set of unconditional moment restrictions. Our contribution to the literature is twofold. First, we derive a set of unconditional moment restrictions on the observables that are the starting point for identification in nonlinear structural systems, even in the presence of multiple equilibria. Second, we provide primitive conditions under which a parameter value that solves those restrictions is unique. We apply our results to a nonlinear IV model with multiple equilibria and give sufficient conditions for identifiability of its parameters.
    Keywords: identification, structural systems, multiple equilibria, correspondences, semiparametric models, proper mappings, global homeomorphisms
    Date: 2008–03–01
  14. By: Caterina Conigliani
    Abstract: We consider the problem of assessing new and existing technologies for their cost-effectiveness in the case where data on both costs and effects are available from a clinical trial, and we address it by means of the cost-effectiveness acceptability curve. The main difficulty in these analyses is that cost data usually exhibit highly skewed and heavy-tailed distributions, so that it can be extremely difficult to produce realistic probabilistic models for the underlying population distribution, and in particular to model accurately the tail of the distribution, which is highly influential in estimating the population mean. Here, in order to integrate the uncertainty about the model into the analysis of cost data and into cost-effectiveness analyses, we consider an approach based on Bayesian model averaging in the particular case of weak prior information about the unknown parameters of the different models involved in the procedure. The main consequence of this assumption is that the marginal densities required by Bayesian model averaging are undetermined. However, in accordance with the theory of partial Bayes factors, and in particular of fractional Bayes factors, we suggest replacing each marginal density with a ratio of integrals that can be efficiently computed via Path Sampling. The results in terms of cost-effectiveness are compared with those obtained with a semi-parametric approach that does not require any assumption about the distribution of costs.
    Keywords: Bayesian model averaging, Cost data, Health economics, MCMC, Non-informative priors
    JEL: C11 C15
    Date: 2008–07
  15. By: Ivana Komunjer (University of California - San Diego)
    Abstract: This paper establishes the identifiability of the parameters of the Box-Cox model under restrictions that do not require the disturbance in the model to be independent of the explanatory variables. The proposed restrictions are semiparametric in nature: they restrict the support of the conditional distribution of the disturbance but do not require the latter to be known.
    Keywords: identification, Box-Cox regression, structure
    Date: 2008–04–01
  16. By: Bryan S. Graham
    Abstract: This paper shows that the semiparametric efficiency bound for a parameter identified by an unconditional moment restriction with data missing at random (MAR) coincides with that of a particular augmented moment condition problem. The augmented system consists of the inverse probability weighted (IPW) original moment restriction and an additional conditional moment restriction which exhausts all other implications of the MAR assumption. The paper also investigates the value of additional semiparametric restrictions on the conditional expectation function (CEF) of the original moment function given always-observed covariates. In the missing outcome context, for example, such restrictions are implied by a semiparametric model for the outcome CEF given always-observed covariates. The efficiency bound associated with this model is shown to also coincide with that of a particular moment condition problem. Some implications of these results for estimation are briefly discussed.
    JEL: C1 C14 C21
    Date: 2008–10
  17. By: Jeremy T. Fox
    Abstract: Economists wish to use data on matches to learn about the structural primitives that govern sorting. I show how to use equilibrium data on who matches with whom for nonparametric identification and semiparametric estimation of match production functions in many-to-many, two-sided matching games with transferable utility. Inequalities derived from equilibrium necessary conditions underlie a maximum score estimator of match production functions. The inequalities do not require data on transfers, quotas, or production levels. The estimator does not suffer from a computational or data curse of dimensionality in the number of agents in a matching market, as the estimator avoids solving for an equilibrium and estimating first-stage match probabilities. I present an empirical application to automotive suppliers and assemblers.
    JEL: C1 C14 C71 D85 L22 L62
    Date: 2008–10
  18. By: Andrea Cipollini; George Kapetanios
    Abstract: In this paper we use principal components analysis to obtain vulnerability indicators able to predict financial turmoil. Probit modelling through principal components and also stochastic simulation of a Dynamic Factor model are used to produce the corresponding probability forecasts regarding the currency crisis events affecting a number of East Asian countries during the 1997-1998 period. The principal components model improves upon a number of competing models in terms of out-of-sample forecasting performance.
    Keywords: Financial Contagion, Dynamic Factor Model
    JEL: C32 C51 F34
    Date: 2008–03
  19. By: Shanti Gamper-Rabindran; Shakeeb Khan; Christopher Timmins
    Abstract: We examine the impact of piped water on the under-1 infant mortality rate (IMR) in Brazil using a novel econometric procedure for the estimation of quantile treatment effects with panel data. The provision of piped water in Brazil is highly correlated with other observable and unobservable determinants of IMR -- the latter leading to an important source of bias. Instruments for piped water provision are not readily available, and fixed effects to control for time invariant correlated unobservables are invalid in the simple quantile regression framework. Using the quantile panel data procedure in Chen and Khan (2007), our estimates indicate that the provision of piped water reduces infant mortality by significantly more at the higher conditional quantiles of the IMR distribution than at the lower conditional quantiles (except for cases of extreme underdevelopment). These results imply that targeting piped water intervention toward areas in the upper quantiles of the conditional IMR distribution, when accompanied by other basic public health inputs, can achieve significantly greater reductions in infant mortality.
    JEL: H41 I18 Q53 Q56 Q58
    Date: 2008–09
  20. By: Frank Schorfheide; Keith Sill; Maxym Kryshko
    Abstract: This paper develops and illustrates a simple method to generate a DSGE model-based forecast for variables that do not explicitly appear in the model (non-core variables). The authors use auxiliary regressions that resemble measurement equations in a dynamic factor model to link the non-core variables to the state variables of the DSGE model. Predictions for the non-core variables are obtained by applying their measurement equations to DSGE model-generated forecasts of the state variables. Using a medium-scale New Keynesian DSGE model, the authors apply their approach to generate and evaluate recursive forecasts for PCE inflation, core PCE inflation, and the unemployment rate along with predictions for the seven variables that have been used to estimate the DSGE model.
    Date: 2008
  21. By: José Fajardo (IBMEC Business School - Rio de Janeiro); Ernesto Mordecki (Central Bank of Brazil)
    Abstract: In this paper we examine which Brownian subordination with drift exhibits the symmetry property introduced by Fajardo and Mordecki (2006b). We find that when the subordination results in a Lévy process, a necessary and sufficient condition for the symmetry to hold is that the drift equals -1/2.
    Keywords: Time Changed, Subordination, Symmetry
    Date: 2008–09–26
  22. By: David Jamieson Bolder; Yuliya Romanyuk
    Abstract: Model risk is a constant danger for financial economists using interest-rate forecasts for the purposes of monetary policy analysis, portfolio allocations, or risk-management decisions. Use of multiple models does not necessarily solve the problem as it greatly increases the work required and still leaves the question "which model forecast should one use?" Simply put, structural shifts or regime changes (not to mention possible model misspecifications) make it difficult for any single model to capture all trends in the data and to dominate all alternative approaches. To address this issue, we examine various techniques for combining or averaging alternative models in the context of forecasting the Canadian term structure of interest rates using both yield and macroeconomic data. Following Bolder and Liu (2007), we study alternative implementations of four empirical term structure models: this includes the Diebold and Li (2003) approach and three associated generalizations. The analysis is performed using more than 400 months of data ranging from January 1973 to July 2007. We examine a number of model-averaging schemes in both frequentist and Bayesian settings, both following the literature in this field (such as de Pooter, Ravazzolo and van Dijk (2007)) in addition to introducing some new combination approaches. The forecasts from individual models and combination schemes are evaluated in a number of ways; preliminary results show that model averaging generally assists in mitigating model risk, and that simple combination schemes tend to outperform their more complex counterparts. Such findings carry significant implications for central-banking analysis: a unified approach towards accounting for model uncertainty can lead to improved forecasts and, consequently, better decisions.
    Keywords: Interest rates; Econometric and statistical methods
    JEL: C11 E43 E47
    Date: 2008
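    The simple combination schemes the abstract finds hardest to beat amount to little more than a weighted average of the individual model forecasts, with weights either equal or tied to past performance. A hedged sketch of two such schemes (ours; the paper's Bayesian averaging schemes are more involved):

```python
def combine_forecasts(forecast_sets, weights=None):
    # weighted average of several models' forecasts, horizon by horizon;
    # with no weights supplied this is the equal-weight combination
    k = len(forecast_sets)
    weights = weights or [1.0 / k] * k
    return [sum(w * f[t] for w, f in zip(weights, forecast_sets))
            for t in range(len(forecast_sets[0]))]

def inverse_mse_weights(errors_by_model):
    # weights proportional to 1/MSE of each model's past forecast errors,
    # so historically more accurate models receive more weight
    inv = [len(e) / sum(x * x for x in e) for e in errors_by_model]
    s = sum(inv)
    return [v / s for v in inv]
```

    Equal weights need no estimation at all, which is one reason simple combinations are hard to beat out of sample.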
  23. By: Jonas Dovern; Johannes Weisser
    Abstract: In this paper, we use survey data to analyze the rationality of professional macroeconomic forecasts. We analyze both individual and average forecasts. We provide evidence on the properties of forecasts for all the G7 countries and four different macroeconomic variables. Furthermore, we present a modification of the structural model commonly used in the literature to model the forecast errors of fixed-event forecasts. Our results confirm that average forecasts should be used with caution: even if all individual forecasts are rational, the hypothesis of rationality is often rejected for the aggregate forecasts. We find large differences in the performance of forecasters not only across countries but also across macroeconomic variables; in general, forecasts tend to be biased in situations where forecasters have to learn about large structural shocks or gradual changes in the trend of a variable.
    Keywords: Evaluating Forecasts, Macroeconomic Forecasting, Rationality, Survey Data, Fixed-Event Forecasts
    JEL: C25 E32 E37
    Date: 2008–09
  24. By: S. Boragan Aruoba; Francis X. Diebold; Chiara Scotti
    Abstract: We construct a framework for measuring economic activity at high frequency, potentially in real time. We use a variety of stock and flow data observed at mixed frequencies (including very high frequencies), and we use a dynamic factor model that permits exact filtering. We illustrate the framework in a prototype empirical example and a simulation study calibrated to the example.
    Keywords: Business conditions
    Date: 2008
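    The phrase "a dynamic factor model that permits exact filtering" refers to running the Kalman filter on a state-space form in which series observed at lower frequencies simply appear as missing on most dates; the filter skips the update step whenever no observation arrives. A scalar sketch of that mechanism (ours, with illustrative parameters and a single AR(1) factor; the paper's model tracks several indicators at once):

```python
def kalman_filter(ys, phi=0.9, q=1.0, r=0.5):
    # exact filter for  x_t = phi*x_{t-1} + w_t,  y_t = x_t + v_t,
    # with Var(w_t) = q and Var(v_t) = r; assumes |phi| < 1.
    # a None observation (e.g. a low-frequency series between releases)
    # triggers only the prediction step, which is how exact filtering
    # accommodates mixed-frequency data
    x, p = 0.0, q / (1.0 - phi * phi)   # start at the stationary distribution
    out = []
    for y in ys:
        # predict
        x, p = phi * x, phi * phi * p + q
        if y is not None:
            # update
            k = p / (p + r)
            x, p = x + k * (y - x), (1.0 - k) * p
        out.append(x)
    return out
```

    Between releases the filtered state simply decays toward its mean at rate phi, so the business-conditions index remains defined at every date even though most series are observed only occasionally.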
  25. By: Thomas J. Steenburgh (Harvard Business School, Marketing Unit); Andrew Ainslie (UCLA Anderson, School of Management)
    Abstract: The purpose of this paper is to show that allowing for taste heterogeneity does not address the similarity critique of discrete-choice models. Although IIA may technically be broken in aggregate, the mixed logit model allows neither a given individual nor the population as a whole to behave with perfect substitution when facing perfect substitutes. Thus, the mixed logit model implies that individuals behave inconsistently across choice sets. Estimating the mixed logit on data in which individuals do behave consistently can result in biased parameter estimates, with the individuals' tastes for desirable attributes being systematically undervalued.
    Keywords: Heterogeneity, Mixed Logit, Independence from Irrelevant Alternatives, IIA, Similarity Critique, Ecological Fallacy
    Date: 2008–09
  26. By: Subhash C. Ray (University of Connecticut)
    Abstract: This paper shows how one can infer the nature of local returns to scale at the input- or output-oriented efficient projection of a technically inefficient input-output bundle, when the input- and output-oriented measures of efficiency differ.
    Keywords: Most Productive Scale Size, Convex Technologies, Nonparametric Efficiency Analysis
    JEL: D2 C6
    Date: 2008–09
  27. By: Ralf Sabiwalsky
    Abstract: The trade-off theory on capital structure is tested by modelling the capital structure target as the solution to a maximization problem. This solution maps asset volatility and loss given default to optimal leverage. By applying nonlinear structural equation modelling, these unobservable variables are estimated based on observable indicator variables, and simultaneously, the speed of adjustment towards this leverage target is estimated. Linear specifications of the leverage target suffer from overlap between the predictions of various theories on capital structure about the sign and significance of determinants. In contrast, the framework applied here allows for a direct test: results confirm the trade-off theory for small and medium-sized firms, but not for large firms.
    Keywords: Capital Structure, Nonlinear, Latent Variables, Trade-off Theory
    JEL: G32 G33 C61
    Date: 2008–08
  28. By: Neil R. Ericsson; Steven B. Kamin
    Abstract: This paper assesses the empirical merits of PcGets and Autometrics--two recent algorithms for computer-automated model selection--using them to improve upon Kamin and Ericsson's (1993) model of Argentine broad money demand. The selected model is an economically sensible and statistically satisfactory error correction model, in which cointegration between money, inflation, the interest rate, and exchange rate depreciation depends on the inclusion of a "ratchet" variable that captures irreversible effects of inflation. Short-run dynamics differ markedly from the long run. Algorithmically based model selection complements opportunities for the researcher to contribute value added in the empirical analysis.
    Date: 2008

This nep-ecm issue is ©2008 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at <>. For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.