Regression models with Gaussian scale mixtures under Bayesian regularization
Date
2024-09-09
Publisher
Pontificia Universidad Católica del Perú
Abstract
This thesis studies the properties, estimation, and application to two real data sets of several Bayesian regularization techniques for a multiple linear regression model with a Gaussian scale mixture, a model class that includes logistic regression. These Bayesian penalized regression techniques specify prior distributions that carry out the penalization, introducing the notion of sparsity: only a small number of variables have non-zero regression coefficients, so the remaining coefficients are effectively shrunk to zero, which in turn yields more manageable and interpretable models. Of particular interest in this work is the comparison between penalty-based regularization techniques and those obtained by placing the Horseshoe and Horseshoe+ priors on the regression coefficients. The thesis shows explicitly how to perform Gibbs sampling to estimate these models, detailing not only the required full conditional distributions but also how some of these sampling schemes can be made more efficient with the bayesreg package in R.
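For context, the Horseshoe and Horseshoe+ priors mentioned above can be written, in one common parametrization (a standard textbook sketch, not necessarily the exact hierarchy used in the thesis), as

    \beta_j \mid \lambda_j, \tau \sim \mathcal{N}(0, \lambda_j^2 \tau^2), \qquad \lambda_j \sim C^{+}(0, 1), \qquad \tau \sim C^{+}(0, 1)   (Horseshoe)
    \lambda_j \mid \eta_j \sim C^{+}(0, \eta_j), \qquad \eta_j \sim C^{+}(0, 1)   (Horseshoe+, which replaces the half-Cauchy prior on \lambda_j)

where C^{+}(0, s) denotes a half-Cauchy distribution with scale s.

The following is a minimal sketch of how such models can be fitted with the bayesreg package in R, as mentioned in the abstract. The data frame df and the responses y (continuous) and z (binary) are placeholders, and the argument names and option strings (model, prior, n.samples, burnin) follow the package documentation as recalled here, so they should be checked against the installed version.

    ## Fit the same linear model under the horseshoe ("hs") and horseshoe+ ("hs+") priors.
    library(bayesreg)

    fit_hs  <- bayesreg(y ~ ., data = df, model = "normal", prior = "hs",
                        n.samples = 10000, burnin = 5000)
    fit_hsp <- bayesreg(y ~ ., data = df, model = "normal", prior = "hs+",
                        n.samples = 10000, burnin = 5000)

    ## Posterior summaries of the regression coefficients under each prior.
    summary(fit_hs)
    summary(fit_hsp)

    ## The logistic case mentioned in the abstract uses a binary response.
    fit_logit <- bayesreg(z ~ ., data = df, model = "logistic", prior = "hs",
                          n.samples = 10000, burnin = 5000)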
Keywords
Regression analysis, Bayesian statistics
Creative Commons license
Except where otherwise noted, this item's license is described as info:eu-repo/semantics/openAccess