Posterior Mode Guided Dimension Reduction for Bayesian Model Averaging in Heavy-Tailed Linear Regression
Preprint   Open access

Shamriddha De and Joyee Ghosh
ArXiv.org
Cornell University
February 24, 2026
DOI: 10.48550/arxiv.2602.20448
https://doi.org/10.48550/arxiv.2602.20448
Preprint (Author's original). This preprint has not been evaluated by subject experts through peer review; it may undergo extensive changes and/or become a peer-reviewed journal article. Open Access

Abstract

For large model spaces, the potential entrapment of Markov chain Monte Carlo (MCMC) methods with spike-and-slab priors poses significant challenges for posterior computation in regression models. On the other hand, maximum a posteriori (MAP) estimation, a more computationally viable alternative, fails to provide uncertainty quantification. To address both problems efficiently, this paper proposes a hybrid method that blends MAP estimation with MCMC-based stochastic search within a heavy-tailed error framework. Under hyperbolic errors, the current work develops a two-step expectation conditional maximization (ECM) guided MCMC algorithm. In the first step, we conduct ECM-based posterior maximization and perform variable selection, thereby identifying a reduced model space in a high posterior probability region. In the second step, we run a Gibbs sampler on the reduced model space for posterior computation. This method is expected to improve the efficiency of posterior computation and enhance its inferential richness. Through simulation studies and real-life benchmark examples, the proposed method is shown to exhibit several advantages in variable selection and uncertainty quantification over various state-of-the-art methods.
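The two-step structure described in the abstract (a MAP-guided screening step that defines a reduced model space, followed by a Gibbs sampler over that space) can be illustrated with a minimal toy sketch. This is not the authors' algorithm: a ridge-penalized MAP fit stands in for the ECM-based posterior maximization, and a conjugate normal g-prior approximation stands in for the hyperbolic-error model; all variable names and tuning constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 50 predictors, only the first 3 active, heavy-tailed t(3) errors.
n, p = 200, 50
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + rng.standard_t(df=3, size=n)

# Step 1 (stand-in for the ECM-based MAP search): a ridge-penalized MAP fit,
# keeping the coordinates with largest |beta_hat| as the reduced model space.
beta_map = np.linalg.solve(X.T @ X + 1.0 * np.eye(p), X.T @ y)
keep = np.argsort(-np.abs(beta_map))[:10]  # reduced set of candidate predictors

# Step 2: Gibbs sampler over inclusion indicators on the reduced space,
# using a g-prior-style log marginal likelihood (normal approximation).
def log_marginal(S):
    """Approximate log marginal likelihood of the model with predictors S."""
    if len(S) == 0:
        return -0.5 * n * np.log(y @ y)
    Xs = X[:, S]
    g = n
    rss = y @ y - g / (g + 1) * y @ Xs @ np.linalg.solve(Xs.T @ Xs, Xs.T @ y)
    return -0.5 * len(S) * np.log(1 + g) - 0.5 * n * np.log(rss)

gamma = {j: False for j in keep}       # inclusion indicators
inc_counts = {j: 0 for j in keep}
T = 500
for t in range(T):
    for j in keep:
        S0 = [k for k in keep if gamma[k] and k != j]
        l0, l1 = log_marginal(S0), log_marginal(S0 + [j])
        prob = 1.0 / (1.0 + np.exp(l0 - l1))  # P(gamma_j = 1 | rest)
        gamma[j] = rng.random() < prob
        if gamma[j]:
            inc_counts[j] += 1

# Posterior inclusion probabilities, estimated over the reduced space only.
pip = {j: inc_counts[j] / T for j in keep}
```

Because the Gibbs sampler visits only the 10 screened predictors instead of all 2^50 models, each sweep is cheap, which is the efficiency gain the two-step design targets; the cost is that predictors discarded in step 1 receive inclusion probability zero.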
Statistics - Computation; Statistics - Methodology
