nep-ecm New Economics Papers
on Econometrics
Issue of 2025–11–24
27 papers chosen by
Sune Karlsson, Örebro universitet


  1. Bayesian Semiparametric Causal Inference: Targeted Doubly Robust Estimation of Treatment Effects By Gözde Sert; Abhishek Chakrabortty; Anirban Bhattacharya
  2. Specification tests for regression models with measurement errors By Xiaojun Song; Jichao Yuan
  3. Unbiased Regression-Adjusted Estimation of Average Treatment Effects in Randomized Controlled Trials By Alberto Abadie; Mehrdad Ghadiri; Ali Jadbabaie; Mahyar JafariNodeh
  4. Possibilistic Instrumental Variable Regression By Gregor Steiner; Jeremie Houssineau; Mark F. J. Steel
  5. The moment is here: a generalised class of estimators for fuzzy regression discontinuity designs By Stuart Lane
  6. Leniency Designs: An Operator's Manual By Paul Goldsmith-Pinkham; Peter Hull; Michal Kolesár
  7. Robust Cauchy-Based Methods for Predictive Regressions By Rustam Ibragimov; Jihyun Kim; Anton Skrobotov
  8. Multivariate Ordered Discrete Response Models with Lattice Structures By Tatiana Komarova; William Matcham
  9. Testing Inequalities Linear in Nuisance Parameters By Gregory Fletcher Cox; Xiaoxia Shi; Yuya Shimizu
  10. Why Applied Macroeconomists Should Not Use Bayesian Estimation of DSGE Models By Meenagh, David; Minford, Patrick; Xu, Yongdeng
  11. Semi-Supervised Treatment Effect Estimation with Unlabeled Covariates via Generalized Riesz Regression By Masahiro Kato
  12. Using spatial modeling to address covariate measurement error By Susanne M. Schennach; Vincent Starck
  13. Quantile Selection in the Gender Pay Gap By Egshiglen Batbayar; Christoph Breunig; Peter Haan; Boryana Ilieva
  14. Probability Weighting Meets Heavy Tails: An Econometric Framework for Behavioral Asset Pricing By Akash Deep; Svetlozar T. Rachev; Frank J. Fabozzi
  15. Confidence Sets for the Emergence, Collapse, and Recovery Dates of a Bubble By Eiji Kurozumi; Anton Skrobotov
  16. Large Bayesian Tensor Autoregressions By Yaling Qi
  17. Riesz Regression As Direct Density Ratio Estimation By Masahiro Kato
  18. Modelos Empíricos de Pós-Dupla Seleção por LASSO: Discussões para Estudos do Transporte Aéreo By Alessandro V. M. Oliveira
  19. Anonymous voting in a heterogeneous society By Konan Hara; Yuki Ito; Paul S. Koh
  20. Inferential Theory for Pricing Errors with Latent Factors and Firm Characteristics By Jungjun Choi; Ming Yuan
  21. Towards Causal Market Simulators By Dennis Thumm; Luis Ontaneda Mijares
  22. Causal Regime Detection in Energy Markets With Augmented Time Series Structural Causal Models By Dennis Thumm
  23. Heterogeneity in peer effects for binary outcomes By Mathieu Lambotte
  24. Dynamic Spatial Treatment Effect Boundaries: A Continuous Functional Framework from Navier-Stokes By Kikuchi, Tatsuru
  25. A Meta-Analysis of SBC Contingent Valuation: Willingness to Pay Estimates, Determinants of Reliability and Replication of Split-Sample Hypothesis Tests By John C. Whitehead; Tim Haas; Lynne Lewis; Leslie Richardson; Pete Schuhmann
  26. Financial Information Theory By Miquel Noguer i Alonso
  27. Monetary Policy Shocks and Narrative Restrictions: Rules Matter By Efrem Castelnuovo; Giovanni Pellegrino; Laust L. Sarkjar

  1. By: G\"ozde Sert; Abhishek Chakrabortty; Anirban Bhattacharya
    Abstract: We propose a semiparametric Bayesian methodology for estimating the average treatment effect (ATE) within the potential outcomes framework using observational data with high-dimensional nuisance parameters. Our method introduces a Bayesian debiasing procedure that corrects for bias arising from nuisance estimation and employs a targeted modeling strategy based on summary statistics rather than the full data. These summary statistics are identified in a debiased manner, enabling the estimation of nuisance bias via weighted observables and facilitating hierarchical learning of the ATE. By combining debiasing with sample splitting, our approach separates nuisance estimation from inference on the target parameter, reducing sensitivity to nuisance model specification. We establish that, under mild conditions, the marginal posterior for the ATE satisfies a Bernstein-von Mises theorem when both nuisance models are correctly specified and remains consistent and robust when only one is correct, achieving Bayesian double robustness. This ensures asymptotic efficiency and frequentist validity. Extensive simulations confirm the theoretical results, demonstrating accurate point estimation and credible intervals with nominal coverage, even in high-dimensional settings. The proposed framework can also be extended to other causal estimands, and its key principles offer a general foundation for advancing Bayesian semiparametric inference more broadly.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.15904
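    Sketch: The abstract targets the ATE with a doubly robust construction. Below is a minimal Python sketch of the classical (frequentist) AIPW doubly robust estimator that the Bayesian procedure parallels; the simulated design and the scikit-learn nuisance models are illustrative assumptions, not the authors' implementation.
      # AIPW score: consistent if either the propensity or the outcome
      # model is correctly specified (the double robustness in the abstract).
      import numpy as np
      from sklearn.linear_model import LogisticRegression, LinearRegression

      rng = np.random.default_rng(0)
      n, p = 2000, 5
      X = rng.normal(size=(n, p))
      e = 1 / (1 + np.exp(-X[:, 0]))               # true propensity
      D = rng.binomial(1, e)                       # treatment indicator
      Y = X[:, 0] + 2.0 * D + rng.normal(size=n)   # true ATE = 2

      ps = LogisticRegression().fit(X, D).predict_proba(X)[:, 1]
      mu1 = LinearRegression().fit(X[D == 1], Y[D == 1]).predict(X)
      mu0 = LinearRegression().fit(X[D == 0], Y[D == 0]).predict(X)

      psi = mu1 - mu0 + D * (Y - mu1) / ps - (1 - D) * (Y - mu0) / (1 - ps)
      print(f"AIPW ATE estimate: {psi.mean():.3f}")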
  2. By: Xiaojun Song; Jichao Yuan
    Abstract: In this paper, we propose new specification tests for regression models with measurement errors in the explanatory variables. Inspired by the integrated conditional moment (ICM) approach, we use a deconvoluted residual-marked empirical process and construct ICM-type test statistics based on it. The issue of measurement errors is addressed by applying a deconvolution kernel estimator in constructing the residuals. We demonstrate that employing an orthogonal projection onto the tangent space of nuisance parameters not only eliminates the parameter estimation effect but also facilitates the simulation of critical values via a computationally simple multiplier bootstrap procedure. This is the first time a multiplier bootstrap has been proposed in the literature on specification testing with measurement errors. We also develop specification tests and the multiplier bootstrap procedure when the measurement error distribution is unknown. The finite-sample performance of the proposed tests for both known and unknown measurement error distributions is evaluated through Monte Carlo simulations, which demonstrate their efficacy.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.04127
  3. By: Alberto Abadie; Mehrdad Ghadiri; Ali Jadbabaie; Mahyar JafariNodeh
    Abstract: This article introduces a leave-one-out regression adjustment estimator (LOORA) for estimating average treatment effects in randomized controlled trials. The method removes the finite-sample bias of conventional regression adjustment and provides exact variance expressions for LOORA versions of the Horvitz-Thompson and difference-in-means estimators under simple and complete random assignment. Ridge regularization limits the influence of high-leverage observations, improving stability and precision in small samples. In large samples, LOORA attains the asymptotic efficiency of the regression-adjusted estimator characterized by Lin (2013, Annals of Applied Statistics), while remaining exactly unbiased. To construct confidence intervals, we rely on asymptotic variance estimates that treat the estimator as a two-step procedure, accounting for both the regression adjustment and the random assignment stages. Two within-subject experimental applications that provide realistic joint distributions of potential outcomes as ground truth show that LOORA eliminates substantial biases and achieves close-to-nominal confidence interval coverage.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.03236
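    Sketch: A stylized Python illustration of the leave-one-out idea: refit the outcome regressions without unit i before forming unit i's adjusted contribution. The ridge regularization and exact variance formulas in the paper are omitted, and the simulated design is hypothetical.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 400
      X = rng.normal(size=(n, 3))
      D = rng.binomial(1, 0.5, size=n)
      Y = X @ np.array([1.0, -0.5, 0.25]) + 1.5 * D + rng.normal(size=n)  # ATE = 1.5

      def fit_mu(Xs, Ys):
          """OLS fit; returns a prediction function."""
          Z = np.column_stack([np.ones(len(Xs)), Xs])
          beta, *_ = np.linalg.lstsq(Z, Ys, rcond=None)
          return lambda Xq: np.column_stack([np.ones(len(Xq)), Xq]) @ beta

      tau = np.empty(n)
      for i in range(n):
          keep = np.arange(n) != i            # drop unit i before fitting
          mu1 = fit_mu(X[keep & (D == 1)], Y[keep & (D == 1)])
          mu0 = fit_mu(X[keep & (D == 0)], Y[keep & (D == 0)])
          m1, m0 = mu1(X[i:i + 1])[0], mu0(X[i:i + 1])[0]
          if D[i] == 1:                       # adjusted contribution of unit i
              tau[i] = (Y[i] - m1) / D.mean() + (m1 - m0)
          else:
              tau[i] = -(Y[i] - m0) / (1 - D.mean()) + (m1 - m0)
      print(f"Leave-one-out adjusted ATE: {tau.mean():.3f}")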
  4. By: Gregor Steiner; Jeremie Houssineau; Mark F. J. Steel
    Abstract: Instrumental variable regression is a common approach for causal inference in the presence of unobserved confounding. However, identifying valid instruments is often difficult in practice. In this paper, we propose a novel method based on possibility theory that performs posterior inference on the treatment effect, conditional on a user-specified set of potential violations of the exogeneity assumption. Our method can provide informative results even when only a single, potentially invalid, instrument is available, offering a natural and principled framework for sensitivity analysis. Simulation experiments and a real-data application indicate strong performance of the proposed approach.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.16029
  5. By: Stuart Lane
    Abstract: The standard fuzzy regression discontinuity (FRD) estimator is a ratio of differences of local polynomial estimators. I show that this estimator does not have finite moments of any order in finite samples, regardless of the choice of kernel function, bandwidth, or order of polynomial. This leads to an imprecise estimator with a heavy-tailed sampling distribution, and inaccurate inference with small sample sizes or when the discontinuity in the probability of treatment assignment at the cutoff is small. I present a generalised class of computationally simple FRD estimators, which contains a continuum of estimators with finite moments of all orders in finite samples, and nests both the standard FRD and sharp (SRD) estimators. The class is indexed by a single tuning parameter, and I provide simple values that lead to substantial improvements in median bias, median absolute deviation and root mean squared error. These new estimators remain very stable in small samples, or when the discontinuity in the probability of treatment assignment at the cutoff is small. Simple confidence intervals that have strong coverage and length properties in small samples are also developed. The improvements are seen across a wide range of models and using common bandwidth selection algorithms in extensive Monte Carlo simulations. The improved stability and performance of the estimators and confidence intervals is also demonstrated using data on class size effects on educational attainment.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.03424
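    Sketch: The ratio form of the standard fuzzy RD estimator that the abstract shows has no finite moments: a local linear jump in the outcome divided by a local linear jump in the treatment probability. The triangular kernel, bandwidth, and simulated design are illustrative choices.
      import numpy as np

      rng = np.random.default_rng(2)
      n, h, c = 1000, 0.5, 0.0
      x = rng.uniform(-1, 1, n)                        # running variable
      d = rng.binomial(1, 0.3 + 0.4 * (x >= c))        # fuzzy assignment
      y = 1.0 * d + x + rng.normal(scale=0.5, size=n)  # true effect = 1

      def local_linear_jump(x, v, c, h):
          """Local linear boundary estimates: E[v|x=c+] minus E[v|x=c-]."""
          jumps = []
          for side in (x >= c, x < c):
              u = x[side] - c
              w = np.maximum(1 - np.abs(u) / h, 0)     # triangular kernel
              sw = np.sqrt(w)
              Z = np.column_stack([np.ones(u.size), u])
              beta = np.linalg.lstsq(sw[:, None] * Z, sw * v[side], rcond=None)[0]
              jumps.append(beta[0])                    # intercept = boundary value
          return jumps[0] - jumps[1]

      tau = local_linear_jump(x, y, c, h) / local_linear_jump(x, d.astype(float), c, h)
      print(f"Fuzzy RD estimate (ratio form): {tau:.3f}")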
  6. By: Paul Goldsmith-Pinkham; Peter Hull; Michal Kolesár
    Abstract: We develop a step-by-step guide to leniency (a.k.a. judge or examiner instrument) designs, drawing on recent econometric literatures. The unbiased jackknife instrumental variables estimator (UJIVE) is purpose-built for leveraging exogenous leniency variation, avoiding subtle biases even in the presence of many decision-makers or controls. We show how UJIVE can also be used to assess key assumptions underlying leniency designs, including quasi-random assignment and average first-stage monotonicity, and to probe the external validity of treatment effect estimates. We further discuss statistical inference, arguing that non-clustered standard errors are often appropriate. A reanalysis of Farre-Mensa et al. (2020), using quasi-random examiner assignment to estimate the value of patents to startups, illustrates our checklist.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.03572
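    Sketch: A simplified JIVE-style leave-own-observation construction related to (but simpler than) UJIVE: the own observation is removed from the first-stage fit to avoid many-instrument bias. Examiner counts, the data-generating process, and all names are hypothetical.
      import numpy as np

      rng = np.random.default_rng(12)
      n, k = 2000, 20                       # many examiners -> many dummies
      judge = rng.integers(0, k, size=n)
      Z = np.eye(k)[judge]                  # examiner (leniency) dummies
      u = rng.normal(size=n)                # unobserved confounder
      d = (Z @ rng.normal(0.5, 0.2, k) + u + rng.normal(size=n) > 1).astype(float)
      y = 1.0 * d + u + rng.normal(size=n)  # endogenous via u; true effect = 1

      # Leave-one-out fitted instrument: drop each unit from its own
      # first-stage prediction.
      P = Z @ np.linalg.solve(Z.T @ Z, Z.T)
      hdiag = np.diag(P)
      d_loo = (P @ d - hdiag * d) / (1 - hdiag)

      tau = (d_loo @ y) / (d_loo @ d)       # IV with leave-one-out instrument
      print(f"JIVE-style estimate: {tau:.3f}")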
  7. By: Rustam Ibragimov; Jihyun Kim; Anton Skrobotov
    Abstract: This paper develops robust inference methods for predictive regressions that address key challenges posed by endogenously persistent or heavy-tailed regressors, as well as persistent volatility in errors. Building on the Cauchy estimation framework, we propose two novel tests: one based on $t$-statistic group inference and the other employing a hybrid approach that combines Cauchy and OLS estimation. These methods effectively mitigate size distortions that commonly arise in standard inference procedures under endogeneity, near nonstationarity, heavy tails, and persistent volatility. The proposed tests are simple to implement and applicable to both continuous- and discrete-time models. Extensive simulation experiments demonstrate favorable finite-sample performance across a range of realistic settings. An empirical application examines the predictability of excess stock returns using the dividend-price and earnings-price ratios as predictors. The results suggest that the dividend-price ratio possesses predictive power, whereas the earnings-price ratio does not significantly forecast returns.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.09249
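    Sketch: The t-statistic group inference component in miniature: split the sample into q blocks, estimate the predictive slope within each, and run a t-test on the q block estimates. The block count and the simulated heavy-tailed design are assumptions; the paper's Cauchy and hybrid Cauchy-OLS tests are not reproduced here.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      T, q = 1200, 8
      xlag = np.cumsum(rng.standard_t(df=3, size=T))  # persistent, heavy-tailed predictor
      ret = rng.standard_t(df=3, size=T)              # H0: no predictability

      betas = []
      for block in np.array_split(np.arange(T), q):
          xb = xlag[block] - xlag[block].mean()
          betas.append((xb @ ret[block]) / (xb @ xb)) # OLS slope within block
      betas = np.array(betas)

      tstat = np.sqrt(q) * betas.mean() / betas.std(ddof=1)
      pval = 2 * stats.t.sf(abs(tstat), df=q - 1)     # t reference with q-1 df
      print(f"group t-stat = {tstat:.3f}, p-value = {pval:.3f}")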
  8. By: Tatiana Komarova; William Matcham
    Abstract: We analyze multivariate ordered discrete response models with a lattice structure, modeling decision makers who narrowly bracket choices across multiple dimensions. These models map latent continuous processes into discrete responses using functionally independent decision thresholds. In a semiparametric framework, we model latent processes as sums of covariate indices and unobserved errors, deriving conditions for identifying parameters, thresholds, and the joint cumulative distribution function of errors. For the parametric bivariate probit case, we separately derive identification of regression parameters and thresholds, and the correlation parameter, with the latter requiring additional covariate conditions. We outline estimation approaches for semiparametric and parametric models and present simulations illustrating the performance of estimators for lattice models.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.03418
  9. By: Gregory Fletcher Cox; Xiaoxia Shi; Yuya Shimizu
    Abstract: This paper proposes a new test for inequalities that are linear in possibly partially identified nuisance parameters. This type of hypothesis arises in a broad set of problems, including subvector inference for linear unconditional moment (in)equality models, specification testing of such models, and inference for parameters bounded by linear programs. The new test uses a two-step test statistic and a chi-squared critical value with data-dependent degrees of freedom that can be calculated by an elementary formula. Its simple structure and tuning-parameter-free implementation make it attractive for practical use. We establish uniform asymptotic validity of the test, demonstrate its finite-sample size and power in simulations, and illustrate its use in an empirical application that analyzes women's labor supply in response to a welfare policy reform.
    Date: 2025–10
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2510.27633
  10. By: Meenagh, David (Cardiff Business School); Minford, Patrick (Cardiff Business School, Cardiff University); Xu, Yongdeng (Cardiff Business School, Cardiff University)
    Abstract: This paper examines how Bayesian estimation performs in applied macroeconomic DSGE models when prior beliefs are misspecified. Using controlled Monte Carlo experiments on a standard Real Business Cycle model and a New Keynesian model, the authors show that Bayesian procedures can deliver severely biased and misleading parameter estimates, with posteriors pulled toward the researcher’s prior rather than the true data-generating process. In contrast, a classical simulation-based method, Indirect Inference, remains largely unbiased and robust even under substantial model uncertainty. The results imply that heavy reliance on Bayesian estimation can entrench false conclusions about key structural features, such as the degree of nominal rigidity, and thereby mislead policy analysis. The paper argues for greater use of robust estimation and model-validation techniques, such as Indirect Inference, to ensure that DSGE-based policy advice rests on credible empirical evidence.
    Keywords: Bayesian Estimation; DSGE Models; Indirect Inference; Monte Carlo Simulation; Model Misspecification
    JEL: C11 C15 C52 E32
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:cdf:wpaper:2025/22
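    Sketch: A toy Python illustration of Indirect Inference, the classical method the paper advocates: choose the structural parameter whose simulated data best reproduce an auxiliary statistic computed from the observed data. The AR(1) model, grid, and simulation counts are hypothetical stand-ins for a DSGE model and its auxiliary model.
      import numpy as np

      rng = np.random.default_rng(4)

      def simulate_ar1(rho, T, rng):
          y = np.zeros(T)
          for t in range(1, T):
              y[t] = rho * y[t - 1] + rng.normal()
          return y

      def aux_stat(y):
          """Auxiliary model: OLS AR(1) coefficient."""
          return (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])

      T, rho_true = 500, 0.8
      b_data = aux_stat(simulate_ar1(rho_true, T, rng))

      # Indirect Inference: minimize the distance between data and simulated
      # auxiliary statistics over a parameter grid (S simulations per point).
      grid, S = np.linspace(0.5, 0.95, 46), 20
      dist = [
          (b_data - np.mean([aux_stat(simulate_ar1(r, T, rng)) for _ in range(S)])) ** 2
          for r in grid
      ]
      print(f"II estimate of rho: {grid[int(np.argmin(dist))]:.2f}")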
  11. By: Masahiro Kato
    Abstract: This study investigates treatment effect estimation in the semi-supervised setting, where we can use not only the standard triple of covariates, treatment indicator, and outcome, but also unlabeled auxiliary covariates. For this problem, we develop efficiency bounds and efficient estimators whose asymptotic variance aligns with the efficiency bound. In the analysis, we introduce two different data-generating processes: the one-sample setting and the two-sample setting. The one-sample setting considers the case where we can observe treatment indicators and outcomes for a part of the dataset, which is also called the censoring setting. In contrast, the two-sample setting considers two independent datasets with labeled and unlabeled data, which is also called the case-control setting or the stratified setting. In both settings, we find that by incorporating auxiliary covariates, we can lower the efficiency bound and obtain an estimator with an asymptotic variance smaller than that without such auxiliary covariates.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.08303
  12. By: Susanne M. Schennach; Vincent Starck
    Abstract: We propose a new estimation methodology to address the presence of covariate measurement error by exploiting the availability of spatial data. The approach uses neighboring observations as repeated measurements, after suitably controlling for the random distance between the observations in a way that allows the use of operator diagonalization methods to establish identification. The method is applicable to general nonlinear models with potentially nonclassical errors and does not rely on a priori distributional assumptions regarding any of the variables. The method's implementation combines a sieve semiparametric maximum likelihood estimator with a first-step kernel estimator and simulation methods. The method's effectiveness is illustrated through both controlled simulations and an application to the assessment of the effect of pre-colonial political structure on current economic development in Africa.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.03306
  13. By: Egshiglen Batbayar; Christoph Breunig; Peter Haan; Boryana Ilieva
    Abstract: We propose a new approach to estimate selection-corrected quantiles of the gender wage gap. Our method employs instrumental variables that explain variation in the latent variable but, conditional on the latent process, do not directly affect selection. We provide semiparametric identification of the quantile parameters without imposing parametric restrictions on the selection probability, derive the asymptotic distribution of the proposed estimator based on constrained selection probability weighting, and demonstrate how the approach applies to the Roy model of labor supply. Using German administrative data, we analyze the distribution of the gender gap in full-time earnings. We find pronounced positive selection among women at the lower end, especially those with less education, which widens the gender gap in this segment, and strong positive selection among highly educated men at the top, which narrows the gender wage gap at upper quantiles.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.16187
  14. By: Akash Deep; Svetlozar T. Rachev; Frank J. Fabozzi
    Abstract: We develop an econometric framework integrating heavy-tailed Student's $t$ distributions with behavioral probability weighting while preserving infinite divisibility. Using 432,752 observations across 86 assets (2004–2024), we demonstrate Student's $t$ specifications outperform Gaussian models in 88.4% of cases. Bounded probability-weighting transformations preserve mathematical properties required for dynamic pricing. Gaussian models underestimate 99% Value-at-Risk by 19.7% versus 3.2% for our specification. Joint estimation procedures identify tail and behavioral parameters with established asymptotic properties. Results provide robust inference for asset-pricing applications where heavy tails and behavioral distortions coexist.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.16563
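    Sketch: The Value-at-Risk comparison in the abstract, in miniature: fit Gaussian and Student's t distributions to the same heavy-tailed returns and compare the implied 99% VaR. The synthetic returns and parameter values are illustrative, not estimates from the paper's data.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      returns = rng.standard_t(df=4, size=5000) * 0.01   # heavy-tailed returns

      mu, sigma = returns.mean(), returns.std(ddof=1)    # Gaussian fit
      df_t, loc_t, scale_t = stats.t.fit(returns)        # Student-t MLE fit

      var_gauss = -stats.norm.ppf(0.01, loc=mu, scale=sigma)
      var_t = -stats.t.ppf(0.01, df_t, loc=loc_t, scale=scale_t)
      print(f"99% VaR, Gaussian fit : {var_gauss:.4f}")
      print(f"99% VaR, Student-t fit: {var_t:.4f}")      # typically larger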
  15. By: Eiji Kurozumi; Anton Skrobotov
    Abstract: We propose constructing confidence sets for the emergence, collapse, and recovery dates of a bubble by inverting tests for the location of the break date. We examine both likelihood ratio-type tests and Elliott and Müller (2007)-type tests for detecting break locations. The limiting distributions of these tests are derived under the null hypothesis, and their asymptotic consistency under the alternative is established. Finite-sample properties are evaluated through Monte Carlo simulations. The results indicate that combining different types of tests effectively controls the empirical coverage rate while maintaining a reasonably small length of the confidence set.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.16172
  16. By: Yaling Qi
    Abstract: The availability of multidimensional economic datasets has grown significantly in recent years. An example is bilateral trade values across goods among countries, comprising three dimensions -- importing countries, exporting countries, and goods -- forming a third-order tensor time series. This paper introduces a general Bayesian tensor autoregressive framework to analyze the dynamics of large, multidimensional time series with a particular focus on international trade across different countries and sectors. Departing from the standard homoscedastic assumption in this literature, we incorporate flexible stochastic volatility into the tensor autoregressive models. The proposed models can capture time-varying volatility due to the COVID-19 pandemic and recent outbreaks of war. To address computational challenges and mitigate overfitting, we develop an efficient sampling method based on low-rank Tucker decomposition and hierarchical shrinkage priors. Additionally, we provide a factor interpretation of the model showing how the Tucker decomposition projects large-dimensional disaggregated trade flows onto global factors.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.03097
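    Sketch: A numpy-only higher-order SVD version of the Tucker decomposition that underlies the paper's dimension reduction, applied to a toy third-order tensor (importer x exporter x good). The ranks and dimensions are illustrative; the paper's Bayesian sampler and hierarchical shrinkage priors are not shown.
      import numpy as np

      def unfold(T, mode):
          """Mode-n matricization of a tensor."""
          return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

      rng = np.random.default_rng(6)
      X = rng.normal(size=(10, 10, 8))      # toy importer x exporter x good array
      ranks = (3, 3, 2)

      # Factor matrices: leading left singular vectors of each unfolding.
      U = [np.linalg.svd(unfold(X, m), full_matrices=False)[0][:, :r]
           for m, r in enumerate(ranks)]

      # Core tensor: project X onto the factor subspaces mode by mode.
      G = X.copy()
      for m in range(3):
          G = np.moveaxis(np.tensordot(U[m].T, np.moveaxis(G, m, 0), axes=1), 0, m)

      print("core shape:", G.shape)         # (3, 3, 2)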
  17. By: Masahiro Kato
    Abstract: Riesz regression has garnered attention as a tool in debiased machine learning for causal and structural parameter estimation (Chernozhukov et al., 2021). This study shows that Riesz regression is closely related to direct density-ratio estimation (DRE) in important cases, including average treatment effect (ATE) estimation. Specifically, the idea and objective of Riesz regression coincide with those of least-squares importance fitting (LSIF; Kanamori et al., 2009) in direct density-ratio estimation. While Riesz regression is general in the sense that it can be applied to Riesz representer estimation in a wide class of problems, the equivalence with DRE allows us to directly import existing results in specific cases, including convergence-rate analyses, the selection of loss functions via Bregman-divergence minimization, and regularization techniques for flexible models, such as neural networks. Conversely, insights about the Riesz representer in debiased machine learning broaden the applications of direct density-ratio estimation methods. This paper consolidates our prior results in Kato (2025a) and Kato (2025b).
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.04568
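    Sketch: The least-squares importance fitting (LSIF) objective the paper links to Riesz regression: estimate the density ratio r(x) = p_nu(x)/p_de(x) by minimizing E_de[r(x)^2]/2 - E_nu[r(x)], which has a closed form for linear-in-features models. The Gaussian features and ridge penalty are standard LSIF choices, assumed here for illustration.
      import numpy as np

      rng = np.random.default_rng(7)
      x_de = rng.normal(0.0, 1.0, size=(1000, 1))      # denominator sample
      x_nu = rng.normal(0.5, 1.0, size=(1000, 1))      # numerator sample
      centers = np.linspace(-2, 3, 10)[None, :]

      def phi(x, width=0.7):
          """Gaussian kernel features."""
          return np.exp(-((x - centers) ** 2) / (2 * width**2))

      P_de, P_nu = phi(x_de), phi(x_nu)
      lam = 1e-3                                        # ridge regularization
      H = P_de.T @ P_de / len(x_de)                     # E_de[phi phi']
      h = P_nu.mean(axis=0)                             # E_nu[phi]
      theta = np.linalg.solve(H + lam * np.eye(H.shape[1]), h)

      r_hat = phi(np.array([[0.5]])) @ theta            # estimated ratio at x=0.5
      true_r = 1.0 / np.exp(-0.5**2 / 2)                # N(0.5,1)/N(0,1) at 0.5
      print(f"r_hat(0.5) = {r_hat[0]:.2f}, true = {true_r:.2f}")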
  18. By: Alessandro V. M. Oliveira
    Abstract: This paper presents and discusses forms of estimation by regularized regression and model selection using the LASSO method - Least Absolute Shrinkage and Selection Operator. LASSO is recognized as one of the main supervised learning methods applied to high-dimensional econometrics, allowing work with large volumes of data and multiple correlated controls. Conceptual issues related to the consequences of high dimensionality in modern econometrics and the principle of sparsity, which underpins regularization procedures, are addressed. The study examines the main post-double selection and post-regularization models, including variations applied to instrumental variable models. A brief description of the lassopack routine package, its syntaxes, and examples of HD, HDS (High-Dimension Sparse), and IV-HDS models, with combinations involving fixed effects estimators, is also presented. Finally, the potential application of the approach in research focused on air transport is discussed, with emphasis on an empirical study on the operational efficiency of airlines and aircraft fuel consumption.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.09767
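    Sketch: Post-double selection in brief: select controls with LASSO in the outcome equation and in the treatment equation, then run OLS of the outcome on the treatment plus the union of selected controls. The Python/scikit-learn implementation is an assumption for illustration; the paper itself discusses Stata's lassopack.
      import numpy as np
      from sklearn.linear_model import LassoCV
      import statsmodels.api as sm

      rng = np.random.default_rng(8)
      n, p = 500, 100
      X = rng.normal(size=(n, p))
      d = X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n)      # treatment equation
      y = 1.0 * d + X[:, 0] - X[:, 2] + rng.normal(size=n)  # true effect = 1

      sel_y = np.flatnonzero(LassoCV(cv=5).fit(X, y).coef_)  # step 1: y on X
      sel_d = np.flatnonzero(LassoCV(cv=5).fit(X, d).coef_)  # step 2: d on X
      union = np.union1d(sel_y, sel_d)                       # union of controls

      Z = sm.add_constant(np.column_stack([d, X[:, union]]))
      fit = sm.OLS(y, Z).fit()                               # post-selection OLS
      print(f"treatment effect: {fit.params[1]:.3f} (se {fit.bse[1]:.3f})")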
  19. By: Konan Hara (Michigan State University); Yuki Ito (Indiana University Bloomington); Paul S. Koh (Yonsei University)
    Abstract: We develop an empirical framework for analyzing dynamic games when the underlying information structure is unknown to the analyst. We introduce Markov correlated equilibrium, a dynamic analog of Bayes correlated equilibrium, and show that its predictions coincide with the Markov perfect equilibrium predictions attainable when players observe richer signals than the analyst assumes. We provide tractable methods for informationally robust estimation, inference, and counterfactual analysis. We illustrate the framework with a dynamic entry game between Starbucks and Dunkin’ in the US and study the role of informational assumptions.
    Keywords: Dynamic games, Markov correlated equilibrium, information, partial identification
    Date: 2025–10
    URL: https://d.repec.org/n?u=RePEc:yon:wpaper:2025rwp-267
  20. By: Jungjun Choi; Ming Yuan
    Abstract: We study factor models that combine latent factors with firm characteristics and propose a new framework for modeling, estimating, and inferring pricing errors. Following Zhang (2024), our approach decomposes mispricing into two distinct components: inside alpha, explained by firm characteristics but orthogonal to factor exposures, and outside alpha, orthogonal to both factors and characteristics. Our model generalizes those developed recently such as Kelly et al. (2019) and Zhang (2024), resolving issues of orthogonality, basis dependence, and unit sensitivity. Methodologically, we develop estimators grounded in low-rank methods with explicit debiasing, providing closed-form solutions and a rigorous inferential theory that accommodates a growing number of characteristics and relaxes standard assumptions on sample dimensions. Empirically, using U.S. stock returns from 2000-2019, we document strong evidence of both inside and outside alphas, with the former showing industry-level co-movements and the latter reflecting idiosyncratic shocks beyond firm fundamentals. Our framework thus unifies statistical and characteristic-based approaches to factor modeling, offering both theoretical advances and new insights into the structure of pricing errors.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.03076
  21. By: Dennis Thumm; Luis Ontaneda Mijares
    Abstract: Market generators using deep generative models have shown promise for synthetic financial data generation, but existing approaches lack causal reasoning capabilities essential for counterfactual analysis and risk assessment. We propose a Time-series Neural Causal Model VAE (TNCM-VAE) that combines variational autoencoders with structural causal models to generate counterfactual financial time series while preserving both temporal dependencies and causal relationships. Our approach enforces causal constraints through directed acyclic graphs in the decoder architecture and employs the causal Wasserstein distance for training. We validate our method on synthetic autoregressive models inspired by the Ornstein-Uhlenbeck process, demonstrating superior performance in counterfactual probability estimation with L1 distances as low as 0.03-0.10 compared to ground truth. The model enables financial stress testing, scenario analysis, and enhanced backtesting by generating plausible counterfactual market trajectories that respect underlying causal mechanisms.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.04469
  22. By: Dennis Thumm
    Abstract: Energy markets exhibit complex causal relationships between weather patterns, generation technologies, and price formation, with regime changes occurring continuously rather than at discrete break points. Current approaches model electricity prices without explicit causal interpretation or counterfactual reasoning capabilities. We introduce Augmented Time Series Causal Models (ATSCM) for energy markets, extending counterfactual reasoning frameworks to multivariate temporal data with learned causal structure. Our approach models energy systems through interpretable factors (weather, generation mix, demand patterns), rich grid dynamics, and observable market variables. We integrate neural causal discovery to learn time-varying causal graphs without requiring ground truth DAGs. Applied to real-world electricity price data, ATSCM enables novel counterfactual queries such as "What would prices be under different renewable generation scenarios?".
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.04361
  23. By: Mathieu Lambotte
    Abstract: I introduce heterogeneity into the analysis of peer effects that arise from conformity, allowing the strength of the taste for conformity to vary across agents' actions. Using a structural model based on a simultaneous network game with incomplete information, I derive conditions for equilibrium uniqueness and for the identification of heterogeneous peer-effect parameters. I also propose specification tests to determine whether the conformity model or the spillover model is consistent with the observed data in the presence of heterogeneous peer effects. Applying the model to data on smoking and alcohol consumption among secondary school students, I show that assuming a homogeneous preference for conformity leads to biased estimates.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.15891
  24. By: Kikuchi, Tatsuru
    Abstract: I develop a comprehensive theoretical framework for dynamic spatial treatment effect boundaries using continuous functional definitions grounded in Navier-Stokes partial differential equations. Rather than discrete treatment effect estimators, the framework characterizes treatment intensity as a continuous function $\tau(\mathbf{x}, t)$ over space-time, enabling rigorous analysis of propagation dynamics, boundary evolution, and cumulative exposure patterns. Building on exact self-similar solutions expressible through Kummer confluent hypergeometric and modified Bessel functions, I establish that treatment effects follow scaling laws $\tau(d, t) = t^{-\alpha} f(d/t^\beta)$ where exponents characterize diffusion mechanisms. The continuous functional approach yields natural definitions of spatial boundaries $d^*(t)$, boundary velocities $v(t) = \partial d^*/\partial t$, treatment effect gradients $\nabla_d \tau$, and integrated exposure functionals $\int_0^T \tau \, dt$. Empirical validation using 42 million TROPOMI satellite observations of NO$_2$ pollution from U.S. coal-fired power plants demonstrates strong exponential spatial decay ($\kappa_s = 0.004028$ per km, $R^2 = 0.35$) with detectable boundaries at $d^* = 572$ km from major facilities. Monte Carlo simulations confirm superior performance over discrete parametric methods in boundary detection and false positive avoidance (94% correct rejection rate versus 27% for parametric methods). The framework successfully diagnoses regional heterogeneity: positive decay parameters within 100 km of coal plants validate the theory, while negative decay parameters beyond 100 km correctly signal when alternative pollution sources dominate. This sign reversal demonstrates the framework's diagnostic capability: it identifies when underlying physical assumptions hold versus when alternative mechanisms dominate. Applications span environmental economics (pollution dispersion fields), banking (spatial credit access functions), and healthcare (hospital accessibility). The continuous functional perspective unifies spatial econometrics with mathematical physics, connecting to recent advances in spatial correlation robust inference (Müller and Watson, 2022) and addressing spurious spatial regression concerns (Müller and Watson, 2024).
    Keywords: Dynamic treatment effects, continuous functionals, Navier-Stokes equations, self-similar solutions, spatial boundaries, functional calculus, special functions, satellite remote sensing, spatial econometrics
    JEL: C14 C21 C31 C65 D04
    Date: 2025
    URL: https://d.repec.org/n?u=RePEc:pra:mprapa:126718
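    Sketch: The exponential spatial decay regression at the core of the empirical validation, tau(d) = A exp(-kappa_s d), fitted by log-linear OLS on synthetic data. The decay parameter and noise level are made up; only the functional form follows the paper.
      import numpy as np

      rng = np.random.default_rng(9)
      d_km = rng.uniform(1, 800, size=2000)             # distance to facility
      tau = 5.0 * np.exp(-0.004 * d_km) * np.exp(rng.normal(0, 0.3, d_km.size))

      # log tau = log A - kappa_s * d + noise
      Z = np.column_stack([np.ones_like(d_km), d_km])
      beta, *_ = np.linalg.lstsq(Z, np.log(tau), rcond=None)
      print(f"kappa_s = {-beta[1]:.6f} per km (truth 0.004000)")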
  25. By: John C. Whitehead; Tim Haas; Lynne Lewis; Leslie Richardson; Pete Schuhmann
    Abstract: The single binary choice (SBC) question format, commonly used in contingent valuation studies and modeled as a hypothetical referendum, is considered incentive compatible when paired with a coercive payment vehicle and a consequential survey. Despite its dominance in the field, the SBC format yields limited information, which can result in imprecise and unreliable estimates of willingness to pay (WTP). This chapter explores the limitations of SBC using a meta-analysis dataset originally compiled by Lewis, Richardson, and Whitehead (2024) for nonparametric WTP estimation. We extend their work by analyzing parametric WTP estimates and comparing them with nonparametric Turnbull and adjusted Kriström estimates. Our results show that parametric WTP can differ significantly from the Turnbull nonparametric estimate, and that confidence intervals derived from parametric models are often wider than those from nonparametric WTP estimates. In a meta-regression, we find that the inefficiency of SBC decreases with data quality. We illustrate the importance of these issues with a replication of directional split-sample tests from the meta-data. Compared to parametric WTP estimates, tests using Turnbull and adjusted Kriström estimates are more likely to detect statistically significant differences in WTP, underscoring the importance of robustness tests with alternative WTP estimates.
    Date: 2025
    URL: https://d.repec.org/n?u=RePEc:apl:wpaper:25-11
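    Sketch: A Turnbull lower-bound WTP estimator of the kind compared in the meta-analysis: monotonize the "no" shares across bids with pooled adjacent violators, then assign each probability mass to the lower end of its bid interval. The bid levels and response counts are hypothetical.
      import numpy as np

      def pava(y, w):
          """Weighted pooled-adjacent-violators fit, non-decreasing."""
          out = []
          for v, wt in zip(map(float, y), map(float, w)):
              out.append([v, wt, 1])                   # value, weight, count
              while len(out) > 1 and out[-2][0] > out[-1][0]:
                  v2, w2, c2 = out.pop()
                  v1, w1, c1 = out.pop()
                  out.append([(v1 * w1 + v2 * w2) / (w1 + w2), w1 + w2, c1 + c2])
          return np.concatenate([[b[0]] * b[2] for b in out])

      bids = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
      n = np.array([100, 100, 100, 100, 100])          # respondents per bid
      no = np.array([20, 38, 35, 70, 85])              # "no" responses per bid

      F = pava(no / n, n)                              # monotone CDF at each bid
      # Mass below the first bid gets value 0, mass in [b_{j-1}, b_j) gets
      # b_{j-1}, and mass above the top bid gets b_J (the lower bound).
      lower_ends = np.concatenate([[0.0], bids])
      mass = np.diff(np.concatenate([[0.0], F, [1.0]]))
      print(f"Turnbull lower-bound WTP: {float(lower_ends @ mass):.2f}")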
  26. By: Miquel Noguer i Alonso
    Abstract: This paper introduces a comprehensive framework for Financial Information Theory by applying information-theoretic concepts such as entropy, Kullback-Leibler divergence, mutual information, normalized mutual information, and transfer entropy to financial time series. We systematically derive these measures with complete mathematical proofs, establish their theoretical properties, and propose practical algorithms for estimation. Using S&P 500 data from 2000 to 2025, we demonstrate empirical usefulness for regime detection, market efficiency testing, and portfolio construction. We show that normalized mutual information (NMI) behaves as a powerful, bounded, and interpretable measure of temporal dependence, highlighting periods of structural change such as the 2008 financial crisis and the COVID-19 shock. Our entropy-adjusted Value at Risk, information-theoretic diversification criterion, and NMI-based market efficiency test provide actionable tools for risk management and asset allocation. We interpret NMI as a quantitative diagnostic of the Efficient Market Hypothesis and demonstrate that information-theoretic methods offer superior regime detection compared to traditional autocorrelation- or volatility-based approaches. All theoretical results include rigorous proofs, and empirical findings are validated across multiple market regimes spanning 25 years of daily returns.
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2511.16339
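    Sketch: A plug-in, histogram-based normalized mutual information estimate between returns and their one-day lag, the dependence measure the paper highlights. Normalizing by sqrt(H(X)H(Y)) is one common convention; the bin count and synthetic returns are assumptions.
      import numpy as np

      def nmi(x, y, bins=20):
          """NMI(X;Y) = I(X;Y) / sqrt(H(X) * H(Y)), plug-in estimate."""
          pxy, _, _ = np.histogram2d(x, y, bins=bins)
          pxy = pxy / pxy.sum()
          px, py = pxy.sum(axis=1), pxy.sum(axis=0)
          nz = pxy > 0
          mi = np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz]))
          hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
          hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
          return mi / np.sqrt(hx * hy)

      rng = np.random.default_rng(10)
      r = rng.standard_t(df=5, size=2500) * 0.01       # stand-in for daily returns
      print(f"NMI(r_t, r_(t-1)) = {nmi(r[1:], r[:-1]):.4f}")   # near 0 if iid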
  27. By: Efrem Castelnuovo; Giovanni Pellegrino; Laust L. Sarkjar
    Abstract: Imposing restrictions on policy rule coefficients in vector autoregressive (VAR) models enhances the identification of monetary policy shocks obtained with sign and narrative restrictions. Monte Carlo simulations and empirical analyses for the United States and the Euro area support this result. For the U.S., adding policy coefficient restrictions yields a larger and more precise short-run output response and more stable Phillips multiplier estimates. Heterogeneity in output responses reflects variation in systematic policy reactions to output. In the Euro area, policy coefficient restrictions sharpen the identification of corporate bond spread responses to monetary policy shocks.
    Keywords: monetary policy shocks, narrative restrictions, policy coefficient restrictions, vector autoregressive models, Monte Carlo simulations, DSGE models
    JEL: C32 E32 E52
    Date: 2025–11
    URL: https://d.repec.org/n?u=RePEc:een:camaaa:2025-62
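    Sketch: The standard sign-restriction step that the paper augments with policy-rule coefficient restrictions: draw a random rotation Q, form the impact matrix chol(Sigma)Q, and keep draws whose impact responses carry the assumed signs. The three-variable covariance matrix and sign pattern are illustrative.
      import numpy as np

      rng = np.random.default_rng(11)
      Sigma = np.array([[1.0, 0.3, 0.2],
                        [0.3, 1.0, 0.4],
                        [0.2, 0.4, 1.0]])          # reduced-form residual cov
      C = np.linalg.cholesky(Sigma)

      accepted = []
      for _ in range(5000):
          Q, R = np.linalg.qr(rng.normal(size=(3, 3)))
          Q = Q @ np.diag(np.sign(np.diag(R)))     # sign-normalize the rotation
          s = (C @ Q)[:, 0]                        # candidate policy shock column
          # Contractionary shock: output down, inflation down, rate up.
          if s[0] < 0 and s[1] < 0 and s[2] > 0:
              accepted.append(s)

      print(f"accepted {len(accepted)} of 5000 draws")
      print("mean impact response:", np.round(np.mean(accepted, axis=0), 3))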

This nep-ecm issue is ©2025 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.