glmbayes

GitHub release (latest by date) License: GPL-3 GitHub Workflow Status

glmbayes provides independent and identically distributed (iid) samples for Bayesian Generalized Linear Models (GLMs). Its primary interface, glmb(), serves as a Bayesian analogue to R’s glm() function, supporting Gaussian, Poisson, Binomial, and Gamma families under log-concave likelihoods. Sampling for most models is performed using accept-reject methods based on likelihood subgradients (Nygren and Nygren, 2006). For Gaussian models, the package also includes lmb(), a Bayesian counterpart to R’s lm().

The package includes a rich set of supporting tools for prior specification, model diagnostics, and method functions that mirror those for lm() and glm(). Most functions are extensively documented, and a comprehensive set of vignettes is available to guide users through the package’s capabilities.

The package is currently available on R-Universe, with plans for a future CRAN submission. For recent updates and planned enhancements, see https://github.com/knygren/glmbayes/blob/main/NEWS.md

Installation

To install the current development version (excluding OpenCL functionality):

install.packages("glmbayes", repos = c("https://cloud.r-project.org", "https://knygren.r-universe.dev"))

To install a version suitable for large models with GPU acceleration, follow the instructions in:

Chapter 12 - Large Models: GPU Acceleration using OpenCL https://knygren.r-universe.dev/articles/glmbayes/Chapter-12.html

Minimal Working Example

library(glmbayes)

# Dobson (1990), p. 93: Randomized Controlled Trial
counts <- c(18,17,15,20,10,20,25,13,12)
outcome <- gl(3,1,9)
treatment <- gl(3,3)
print(d.AD <- data.frame(treatment, outcome, counts))

## Classical glm
glm.D93 <- glm(counts ~ outcome + treatment,
               family = poisson())

## Bayesian glmb
# Step 1: Set up prior
ps <- Prior_Setup(counts ~ outcome + treatment, family = poisson())
mu <- ps$mu
V  <- ps$Sigma

# Step 2: Fit using glmb
glmb.D93 <- glmb(counts ~ outcome + treatment,
                 family = poisson(),
                 pfamily = dNormal(mu = mu, Sigma = V))

summary(glmb.D93)

As with glm(), models are defined by a formula for the linear predictor and a family() describing the likelihood and link. In addition, glmb() requires a pfamily object specifying the prior.

The supported likelihood families, link functions, and compatible pfamilies are:

| Likelihood family | Link functions | Compatible pfamilies |
|---|---|---|
| Gaussian | identity | dNormal, dGamma, dNormal_Gamma, dIndependent_Normal_Gamma |
| Poisson / Quasi-Poisson | log | dNormal |
| Binomial / Quasi-Binomial | logit, probit, cloglog | dNormal |
| Gamma | log | dNormal, dGamma |
Prior_Setup

For a default, data-aligned prior using the same formula and family as glm(), call Prior_Setup(formula, family, data = ..., ...). The returned list includes default settings such as the prior mean (mu) and prior covariance (Sigma).

Optional arguments adjust prior weight, centering, and related settings (see the function help and vignette Chapter 03).

Typical Prior_Setup wiring

Assuming ps <- Prior_Setup(...), the returned prior mean ps$mu and covariance ps$Sigma can be passed directly to the pfamily argument of glmb().

The default priors are constructed so that, as they are made weak, the posterior estimates approach the corresponding classical estimates (see the documentation and vignettes for details).

All supported models have log‑concave likelihoods, enabling efficient iid sampling via enveloping functions and subgradient‑based accept–reject algorithms, especially for models lacking standard iid samplers.
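The accept–reject principle behind this can be illustrated with a minimal, package-independent sketch in base R. This is not the package's subgradient-envelope algorithm; it uses a classical fixed exponential envelope for a half-normal target, but shows the same mechanism of proposing from an envelope and accepting with a bounded ratio:

```r
## Accept-reject illustration: sample a half-normal target
## f(x) = sqrt(2/pi) * exp(-x^2/2), x >= 0,
## using an Exp(1) envelope. The acceptance probability is
## exp(-(x - 1)^2 / 2), which is <= 1 for all x.
set.seed(42)
rhalfnorm_ar <- function(n) {
  out <- numeric(n)
  for (i in seq_len(n)) {
    repeat {
      x <- rexp(1)                           # draw from the envelope
      if (runif(1) <= exp(-(x - 1)^2 / 2)) { # accept-reject step
        out[i] <- x
        break
      }
    }
  }
  out
}

z <- rhalfnorm_ar(10000)
mean(z)  # close to the half-normal mean sqrt(2/pi), about 0.80
```

The package's envelopes play the role of Exp(1) here, but are built from likelihood subgradients so that the acceptance rate stays bounded even in many dimensions.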

Examples and Demos

Use example() and demo() to explore built-in examples and demos for supported families and links:

## Bayesian linear regression
example("lmb")

## Bayesian generalized linear models
example("glmb")

## Predictions for fitted glmb objects (newdata, type, etc.)
example("predict.glmb")

## Deviance residuals and simulate() for posterior predictive checks (menarche)
example("residuals.glmb")

## Two-block Gibbs sampler compared with iid sampling (linear model)
example("rlmb")

## Default prior specification using Prior_Setup
example("Prior_Setup")

## Matrix-input GLM example with an informative prior
example("rglmb")

## Two-step Boston example: estimates and summarizes models with unknown
## dispersion using dGamma priors via rGamma_reg, rglmb, rlmb, glmb, and lmb
example("summary.rGamma_reg")

## High-dimensional Gaussian model (14 predictors) with GPU acceleration (requires OpenCL)
example("Boston_centered")

## High-dimensional binomial model (14 predictors) with GPU acceleration (requires OpenCL)
example("Cleveland")

## Hierarchical linear model (Rubin/Gelman 8-schools) via rlmb
demo("Ex_07_Schools")

## Hierarchical generalized linear model (Poisson BikeSharing) via rglmb
demo("Ex_09_BikeSharingPoisson")

## Detailed simulation pipeline for rNormalGLM models (JASA 2006; Vignette Chapter A05)
example("rNormalGLM_std")

## Detailed simulation pipeline for rIndepNormalGammaReg models (Vignette Chapter A07)
example("rIndepNormalGammaReg_std")

Methodology

For generalized linear models where well-known sampling methods are unavailable, sampling follows the framework of Nygren and Nygren (2006), using likelihood subgradients to construct enveloping functions for the posterior distribution. When the posterior is approximately normal, the expected number of draws per acceptance is bounded, as shown in that paper and discussed in our vignettes. Dispersion can be sampled via rGamma_reg() (standalone) or jointly with the coefficients via rNormalGamma_reg() and rindepNormalGamma_reg().

GPU Acceleration Using OpenCL

The implemented algorithms tend to perform acceptably on CPUs for models up to roughly 10-14 dimensions. For larger models, envelope construction is embarrassingly parallel, so the package provides optional GPU acceleration using OpenCL. This requires a GPU-enabled machine and an OpenCL installation. These features are discussed in more detail in two of our vignettes.

Vignettes

The glmbayes package includes a comprehensive set of vignettes organized into five major parts. These vignettes guide users from introductory material through applied modeling, advanced topics, and the underlying simulation methods that support the package.

Part 1: An Introduction

Overview of the package, its design philosophy, and the basic workflow for fitting Bayesian linear and generalized linear models. It introduces the core functions, model objects, and the structure of the modeling interface.

Part 2: Estimating Bayesian Linear Models

These chapters focus on Bayesian linear regression using the Gaussian family. Topics include model fitting, prior construction, posterior summaries, predictions, and deviance residuals. This part establishes the foundation for understanding the Bayesian GLM framework used throughout the package.

Part 3: Generalized Linear Models

This part presents Bayesian GLMs across the major likelihood families, including binomial, quasi-binomial, Poisson, quasi-Poisson, and Gamma models. It covers model specification, link functions, log-concavity, diagnostics, and interpretation of posterior results.

Part 4: Advanced Topics

These chapters explore more complex modeling scenarios and computational strategies, such as informative priors, two-block Gibbs sampling, hierarchical linear and generalized linear models, models with unknown dispersion parameters, and large-scale model fitting with GPU acceleration via OpenCL.

Part 5: Simulation Methods and Technical Implementation

This part documents the mathematical and algorithmic foundations of the package. Topics include estimation procedures, likelihood subgradient densities, envelope construction, accept-reject sampling, and technical reports on sampler design, including implementation aspects of GPU acceleration with OpenCL.

Together, these vignettes form a comprehensive reference that supports users at all levels, from first-time Bayesian GLM users to researchers interested in the mathematical and computational details behind the samplers.

Feature Highlights

Limitations

Future Plans