This vignette introduces glmbayes, a package for fitting Bayesian generalized linear models via efficient envelope-based sampling. The vignette series is organized into five main parts and a set of technical appendices. You will move from basic installation and first models, through linear and generalized linear models, to advanced prior structures, dispersion modeling, and GPU-accelerated computation. The appendices document the underlying simulation methods and implementation details. The envelope sampling methodology builds on the likelihood subgradient framework of Nygren and Nygren (2006).
These chapters provide a high-level overview of the package, its design philosophy, and the basic workflow for fitting Bayesian linear and generalized linear models.
Chapter 00 - Introduction
Overview of the vignette structure, major modeling capabilities, and how
the different parts fit together.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-00.html
Chapter 01 - Getting Started with glmbayes
Install and load glmbayes, fit your first Bayesian GLM
using glmb(), and interpret posterior summaries (means,
credible intervals) via an interface that mirrors base
glm().
https://knygren.r-universe.dev/articles/glmbayes/Chapter-01.html
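As a taste of the workflow Chapter 01 develops, a minimal sketch is shown below. The prior arguments (`mu`, `Sigma`) and the draw-count argument `n` are assumed names used for illustration only; the chapter documents the exact interface.

```r
# Minimal sketch of a first Bayesian logistic regression with glmb().
# The interface mirrors base glm(); the prior arguments mu/Sigma and the
# number-of-draws argument n are assumed names, not verified API.
library(glmbayes)

set.seed(1)
x <- rnorm(100)
y <- rbinom(100, size = 1, prob = plogis(-0.5 + x))

mu    <- c(0, 0)          # prior means for (Intercept, x)
Sigma <- diag(10, 2)      # diffuse prior covariance

fit <- glmb(y ~ x, family = binomial(link = "logit"),
            mu = mu, Sigma = Sigma, n = 1000)

summary(fit)              # posterior means and credible intervals
```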
This part focuses on Bayesian linear regression under the Gaussian family and identity link. It establishes the foundational ideas in a setting where exact multivariate normal posteriors are available.
Chapter 02 - Estimating Bayesian Linear Models
Work with the Gaussian identity-link case using lmb(). Draw
from the multivariate normal posterior, compare Bayesian estimates to
classical least squares, and explore shrinkage behavior.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-02.html
Chapter 03 - Tailoring Priors - Leveraging the Prior_Setup Function
Construct multivariate normal priors via Prior_Setup().
Specify prior mean vectors and covariance matrices, and study how these
hyperparameters influence posterior inference and regularization.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-03.html
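The prior-construction workflow described in Chapter 03 might look like the following sketch. The list components `mu` and `Sigma` and the way they are passed back to the fitting function are assumptions for illustration; the chapter documents the actual return structure of Prior_Setup().

```r
# Sketch: build a default multivariate normal prior with Prior_Setup(),
# then tailor its hyperparameters. The component names mu/Sigma and the
# lmb() prior arguments are assumed, not verified against the package.
library(glmbayes)

prior <- Prior_Setup(mpg ~ wt + hp, data = mtcars)

prior$mu[1]       <- 20   # center the intercept on a domain-specific value
diag(prior$Sigma) <- 25   # tighten the variable-specific prior scales

fit <- lmb(mpg ~ wt + hp, data = mtcars,
           mu = prior$mu, Sigma = prior$Sigma)
summary(fit)
```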
Chapter 04 - Reviewing Model Predictions, Deviance Residuals and Model Statistics
Generate fitted values and posterior predictive draws, compute deviance
residuals, and review model-fit summaries such as deviance-based
measures and related diagnostics.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-04.html
This part presents Bayesian GLMs across the major likelihood families, including Binomial, quasi-Binomial, Poisson, quasi-Poisson, and Gamma models. It emphasizes link functions, log-concavity, and practical posterior interpretation.
Chapter 05 - Foundations of GLMs - Families, Links, and Log-Concave Likelihoods
Review exponential-family GLMs, canonical and non-canonical link
functions, and the log-concave likelihood property that enables
envelope-based accept-reject sampling.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-05.html
Chapter 06 - Estimating Bayesian Generalized Linear Models
Move beyond Gaussian models and fit Binomial, Poisson, and Gamma GLMs
with glmb(). See how the envelope engine adapts to each
family and compare posterior summaries under different links.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-06.html
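The family-by-family pattern Chapter 06 compares can be sketched as follows; apart from the family and link, the call is unchanged. The prior argument names (`mu`, `Sigma`) are again assumptions, and the data here are simulated purely for illustration.

```r
# Sketch: the same glmb() call pattern across likelihood families.
# Only the family (and link) changes; the mu/Sigma prior argument names
# are assumed for illustration -- see Chapters 06-09 for exact usage.
library(glmbayes)

set.seed(2)
x      <- rnorm(200)
counts <- rpois(200, lambda = exp(0.3 + 0.5 * x))          # Poisson outcome
cost   <- rgamma(200, shape = 2, rate = 2 / exp(1 + 0.4 * x))  # Gamma outcome

mu    <- c(0, 0)
Sigma <- diag(10, 2)

fit_pois  <- glmb(counts ~ x, family = poisson(link = "log"),
                  mu = mu, Sigma = Sigma)
fit_gamma <- glmb(cost ~ x, family = Gamma(link = "log"),
                  mu = mu, Sigma = Sigma)
```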
Chapter 07 - Models for the Binomial Family
Work with logistic and probit regressions. Handle binomial outcomes,
specify informative priors, and interpret posterior distributions for
classification and proportion-type data.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-07.html
Chapter 08 - Models for the Poisson Family
Fit count models with a log link. Explore overdispersion diagnostics,
zero-inflation checks, and the impact of prior choice on rate
parameters.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-08.html
Chapter 09 - Models for the Gamma Family
Model positive continuous outcomes using Gamma regression. Combine
regression and dispersion modeling, and interpret overdispersion in
applications such as insurance claims or reaction-time data.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-09.html
These chapters explore more complex modeling scenarios and computational strategies, including informative priors, unknown dispersion parameters, hierarchical (random effects) models, and GPU-accelerated envelope construction.
Chapter 10 - Informative Priors: Centering and Differential Prior Weights
Construct more flexible priors by centering on domain-specific values
and assigning variable-specific scales. Examine how differentiated prior
weights influence shrinkage and interpretability.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-10.html
Chapter 11 - Estimating Models with Unknown Dispersion Parameters
Extend envelope-based methods to models with unknown dispersion (e.g.,
Gamma and quasi-families). Use dedicated dispersion samplers to obtain
joint posterior draws and quantify overdispersion uncertainty.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-11.html
Chapter 12 - Large Models: GPU Acceleration using OpenCL
Scale Bayesian GLMs to higher-dimensional settings by offloading key
computations to the GPU. Configure OpenCL, tune envelope construction
for large models, and benchmark performance gains.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-12.html
Chapter 13 - Hierarchical Linear Models
Fit hierarchical (random effects) linear models using block Gibbs
sampling with rlmb(). Cover dispersion-and-coefficients sampling (e.g.,
the Dobson plant-weight data) and the Eight Schools example with
conjugate and non-conjugate priors.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-13.html
Chapter 14 - Hierarchical Generalized Linear Models
Extend hierarchical modeling to non-Gaussian families. Implement a
two-block Gibbs sampler for Poisson regression with observation-level
random effects using the BikeSharing dataset and rglmb().
https://knygren.r-universe.dev/articles/glmbayes/Chapter-14.html
The appendices document the mathematical and algorithmic foundations of the samplers used in glmbayes, including likelihood subgradient methods, envelope construction, and accept-reject schemes for both regression and dispersion parameters.
Chapter A01 - A Detailed Overview of the glmbayes Package
Present the mathematical foundations behind each sampler, including
derivations of the posterior, the structure of enveloping functions, and
bounds on expected draws per acceptance.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-A01.html
Chapter A02 - Overview of Estimation Procedures
Survey the estimation procedures implemented in the package, from
likelihood subgradient methods and envelope construction to
accept-reject schemes for regression and dispersion parameters.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-A02.html
Chapter A03 - Methods Available in glmbayes
Summarize the key functions, samplers, and diagnostics implemented in
the package, with a focus on how they relate to the underlying
estimation framework.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-A03.html
Chapter A04 - Directional Tail Diagnostics for Prior-Posterior Disagreement
Document the directional tail diagnostic, its theoretical interpretation
as a Bayesian analogue to t- and F-style evidence, scalar and
multivariate decompositions, and its use in summary output.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-A04.html
Chapter A05 - Simulation Methods: Likelihood Subgradient Densities
Detail the likelihood-subgradient approach for non-Gaussian families.
Show how subgradients define tangent envelopes and explain why this
yields valid accept-reject sampling for log-concave likelihoods.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-A05.html
Chapter A06 - Accept-Reject Sampling for Dispersion in Gamma Regression
Describe the specialized accept-reject scheme for dispersion parameters
in Gamma regression, including envelope design, proposal choices, and
efficiency considerations.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-A06.html
Chapter A07 - Accept-Reject Sampling for Gaussian Regression Models with Independent Normal-Gamma Priors
Detail the accept-reject-based approach for Gaussian regression with
independent normal-gamma priors, including the structure of the joint
prior, conditional distributions, and sampler efficiency.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-A07.html
Chapter A08 - Overview of Envelope-Related Functions
Provide a central overview of the envelope-related functions.
Consolidate the theoretical foundations, function map, and workflow for
users and developers.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-A08.html
Chapter A09 - Parallel Sampling Implementation using RcppParallel
Describe the parallel sampling implementation, pilot logic, and
interactive safeguards.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-A09.html
Chapter A10 - Accelerated EnvelopeBuild Implementation using OpenCL
Document the OpenCL implementation for accelerating envelope
construction on the GPU.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-A10.html
Chapter A11 - Implementation Companion for Independent Normal-Gamma
Document the implementation workflow for independent Normal-Gamma
sampling, with a deep dive into rIndepNormalGammaReg,
EnvelopeOrchestrator, EnvelopeDispersionBuild,
and the standardized accept-reject sampler.
https://knygren.r-universe.dev/articles/glmbayes/Chapter-A11.html
Chapter A12 - Technical Derivations for Priors Returned by Prior_Setup()
Present full mathematical derivations for Gaussian calibration, the
Normal-Gamma and independent Normal-Gamma prior pieces, and the objects
returned by Prior_Setup() (including compute_gaussian_prior).
https://knygren.r-universe.dev/articles/glmbayes/Chapter-A12.html
Together, these chapters and appendices form a coherent progression: from basic usage and model specification, through applied Bayesian GLMs, to the mathematical and computational details that underlie the envelope-based samplers and GPU-accelerated implementations in glmbayes.