# Sampler Hierarchy

The sampler classes in `src/mcmc/samplers/` are header-only wrappers that own adaptation state and delegate to the stateless algorithm functions in `src/mcmc/algorithms/`. The chain runner interacts with samplers through the `SamplerBase` interface.

## Class hierarchy

```mermaid
classDiagram
    class SamplerBase {
        <<abstract>>
        +step(model, iteration) StepResult
        +initialize(model)
        +has_nuts_diagnostics() bool
    }
    class MetropolisSampler {
        element-wise MH
    }
    class GradientSamplerBase {
        <<abstract>>
        owns HMCAdaptationController
        #do_gradient_step(model) StepResult
    }
    class HMCSampler {
        fixed-length leapfrog
        deprecated
    }
    class NUTSSampler {
        adaptive tree depth
        -do_unconstrained_step()
        -do_constrained_step()
    }
    SamplerBase <|-- MetropolisSampler
    SamplerBase <|-- GradientSamplerBase
    GradientSamplerBase <|-- HMCSampler
    GradientSamplerBase <|-- NUTSSampler
```

The mixed MRF combines a NUTS step for the main parameter block (thresholds, discrete interactions, means, cross-type interactions) with an element-wise Metropolis step for the continuous precision elements. The chain runner orchestrates this; there is no combined sampler class.
## SamplerBase interface

```cpp
class SamplerBase {
public:
    virtual StepResult step(BaseModel& model, int iteration) = 0;
    virtual void initialize(BaseModel& model) {}
    virtual bool has_nuts_diagnostics() const { return false; }
};
```

`step()` performs one MCMC iteration and returns a `StepResult` containing the accepted parameter vector, acceptance probability, and optional NUTS diagnostics.
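A hypothetical shape for `StepResult`, inferred from the fields listed above; the names and layout are assumptions for illustration, not the codebase's actual definition:

```cpp
#include <optional>
#include <vector>

// Hypothetical StepResult layout inferred from the description above;
// field names are assumptions, not the actual definition in the codebase.
struct NutsDiagnostics {
    int tree_depth;
    bool divergent;
    double energy;
};

struct StepResult {
    std::vector<double> parameters;       // accepted parameter vector
    double accept_prob;                   // acceptance probability of the step
    std::optional<NutsDiagnostics> nuts;  // populated only by NUTS-style samplers
};
```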
`initialize()` is called once before the first iteration. For gradient-based samplers, it runs the step-size heuristic that finds a reasonable initial step size by iteratively doubling or halving until the acceptance probability is near the target (default 0.8).
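A minimal sketch of that doubling/halving search. The `accept_prob` callback here is an assumption standing in for a real one-step leapfrog trial against the model:

```cpp
#include <cmath>
#include <functional>

// Sketch of the initial step-size search described above: double or halve
// eps until the acceptance probability crosses the 0.8 target.
inline double find_reasonable_step_size(
        const std::function<double(double)>& accept_prob,
        double eps = 1.0, double target = 0.8) {
    bool doubling = accept_prob(eps) > target;   // too sticky? take bigger steps
    for (int i = 0; i < 50; ++i) {               // cap to avoid runaway loops
        eps *= doubling ? 2.0 : 0.5;
        double a = accept_prob(eps);
        if (doubling ? (a < target) : (a > target)) break;  // crossed the target
    }
    return eps;
}
```

With a smoothly decaying acceptance curve, the loop halves until acceptance first exceeds the target and returns that step size.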
## MetropolisSampler

The simplest sampler. Each call to `step()` invokes `model.do_one_metropolis_step(iteration)`, which performs a full sweep of element-wise proposals, and returns the result. Edge selection is handled by the chain runner before the sampler step, not by `MetropolisSampler` itself.

Proposal standard deviations are tuned during warmup via Robbins–Monro updates (see Warmup Schedule).
## GradientSamplerBase

Shared base class for HMC and NUTS. It owns an `HMCAdaptationController` that manages:

- Dual averaging for step-size adaptation
- Diagonal mass matrix accumulation via Welford's online algorithm
The `step()` method in `GradientSamplerBase`:

- Calls the concrete `do_gradient_step()` override (HMC or NUTS)
- Passes the acceptance probability to the adaptation controller
- If the mass matrix was just updated (end of a warmup window), reinitializes the step size with the heuristic
After warmup, the adaptation controller freezes and returns the smoothed step size from dual averaging.
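The dual-averaging scheme can be sketched as follows. The constants (`gamma`, `t0`, `kappa`) are the common NUTS defaults and are assumptions here, not values read from the codebase:

```cpp
#include <cmath>

// Nesterov-style dual averaging on the log step size, per the description
// above. After warmup, the smoothed value log_eps_bar is frozen and used.
struct DualAveraging {
    double mu, log_eps, log_eps_bar = 0.0, h_bar = 0.0;
    double gamma = 0.05, t0 = 10.0, kappa = 0.75, target = 0.8;
    long t = 0;

    explicit DualAveraging(double eps0)
        : mu(std::log(10.0 * eps0)), log_eps(std::log(eps0)) {}

    void update(double accept_prob) {
        ++t;
        double eta = 1.0 / (t + t0);
        h_bar = (1.0 - eta) * h_bar + eta * (target - accept_prob);   // running accept deficit
        log_eps = mu - std::sqrt(double(t)) / gamma * h_bar;          // shrink toward mu
        double w = std::pow(double(t), -kappa);
        log_eps_bar = w * log_eps + (1.0 - w) * log_eps_bar;          // polyak-style smoothing
    }

    double current_step_size() const { return std::exp(log_eps); }
    double frozen_step_size() const { return std::exp(log_eps_bar); }
};
```

Steps that accept more often than the target push the step size up; persistent rejection pushes it down.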
## HMCSampler (deprecated)

`HMCSampler` is deprecated. The R-level `update_method = "hamiltonian-mc"` option still dispatches to this class, but it will be removed in a future release. Use NUTS instead.

Performs fixed-length leapfrog integration. The number of leapfrog steps is set at construction from `config.num_leapfrogs`. Each call to `do_gradient_step()` invokes `hmc_step()` from `mcmc/algorithms/hmc.h`. For constrained models, RATTLE integration is used (see Constrained Leapfrog (RATTLE)).
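A 1-D sketch of the fixed-length leapfrog integrator, using a standard-normal target so the potential is U(q) = q²/2; the real `hmc_step()` works on full parameter vectors and the adapted mass matrix:

```cpp
// Fixed-length leapfrog on a standard-normal target: half momentum step,
// alternating full position/momentum steps, closing half momentum step.
struct PhaseState { double q, p; };

inline double grad_potential(double q) { return q; }   // U(q) = q^2/2

inline PhaseState leapfrog(PhaseState s, double eps, int num_leapfrogs) {
    s.p -= 0.5 * eps * grad_potential(s.q);
    for (int i = 0; i < num_leapfrogs; ++i) {
        s.q += eps * s.p;
        if (i + 1 < num_leapfrogs) s.p -= eps * grad_potential(s.q);
    }
    s.p -= 0.5 * eps * grad_potential(s.q);
    return s;
}

// Hamiltonian with unit mass. Leapfrog conserves it up to O(eps^2) error,
// which is what keeps HMC acceptance probabilities high.
inline double hamiltonian(PhaseState s) {
    return 0.5 * s.q * s.q + 0.5 * s.p * s.p;
}
```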
## NUTSSampler

Uses the No-U-Turn criterion to adapt trajectory length automatically. The maximum tree depth is set from `config.max_tree_depth` (default 10). Returns NUTS diagnostics (tree depth, divergence flag, energy) with each step. See NUTS Algorithm for the tree-building details.
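A sketch of the No-U-Turn termination check applied to a trajectory's endpoints: growth stops once further integration would move the ends back toward each other. The function name and signature are illustrative; the tree-building details live in NUTS Algorithm:

```cpp
#include <vector>

// Returns true when the trajectory spanned by (q_minus, q_plus) has begun
// to double back: the end-to-end displacement projected onto either
// endpoint's momentum is negative.
inline bool u_turn(const std::vector<double>& q_minus,
                   const std::vector<double>& q_plus,
                   const std::vector<double>& p_minus,
                   const std::vector<double>& p_plus) {
    double dot_minus = 0.0, dot_plus = 0.0;
    for (std::size_t i = 0; i < q_minus.size(); ++i) {
        double dq = q_plus[i] - q_minus[i];
        dot_minus += dq * p_minus[i];
        dot_plus  += dq * p_plus[i];
    }
    return dot_minus < 0.0 || dot_plus < 0.0;
}
```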
## Factory dispatch

`create_sampler()` in `chain_runner.cpp` constructs the right sampler based on the model's capabilities and the user's choice:

```cpp
unique_ptr<SamplerBase> create_sampler(const SamplerConfig& config,
                                       WarmupSchedule& schedule) {
    if (config.sampler_type == "nuts") {
        return make_unique<NUTSSampler>(config, schedule);
    } else if (config.sampler_type == "hmc" ||
               config.sampler_type == "hamiltonian-mc") {
        return make_unique<HMCSampler>(config, schedule);
    } else if (config.sampler_type == "mh" ||
               config.sampler_type == "adaptive-metropolis") {
        return make_unique<MetropolisSampler>(config, schedule);
    }
    // Unreachable after R-level validation; avoids falling off the end.
    throw std::invalid_argument("unknown sampler_type: " + config.sampler_type);
}
```

The R layer (`validate_sampler.R`) resolves strings like `"hamiltonian-mc"` and `"adaptive-metropolis"` before they reach C++, so in practice the factory receives `"nuts"`, `"hmc"`, or `"mh"`.
All three model types (GGM, OMRF, mixed MRF) default to NUTS. The user can select Metropolis via `update_method = "adaptive-metropolis"`.
## Step-size and mass-matrix adaptation

`GradientSamplerBase` owns an `HMCAdaptationController` that coordinates dual averaging for \(\epsilon\) and Welford variance estimation for the diagonal mass matrix \(M\). The full adaptation schedule — stage boundaries, doubling windows, blending formula, and dual-averaging parameters — is documented in Warmup Schedule.

`MetropolisSampler` uses a separate `MetropolisAdaptationController` that tunes proposal standard deviations via Robbins–Monro updates targeting an acceptance rate of 0.44.
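The Robbins–Monro update can be sketched on the log scale. The gain sequence used here (t^(-0.75)) is illustrative; the actual schedule is specified in Warmup Schedule:

```cpp
#include <cmath>

// One Robbins–Monro step for a proposal standard deviation: nudge log(sd)
// up when observed acceptance exceeds the 0.44 target, down otherwise.
// The decaying gain makes the adaptation settle over warmup.
inline double robbins_monro_log_sd(double log_sd, double accept_prob,
                                   long iteration, double target = 0.44) {
    double gain = std::pow(double(iteration), -0.75);   // illustrative gain sequence
    return log_sd + gain * (accept_prob - target);
}
```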
## Constrained integration dispatch

When `model.has_constraints()` returns true, `NUTSSampler` switches from the unconstrained leapfrog to RATTLE integration:

- **Unconstrained path** — Uses `get_vectorized_parameters()` and `logp_and_gradient()` on the active parameter subset. Used by the OMRF, which handles edge selection by adjusting its active parameter dimension instead.
- **Constrained path** — Uses `get_full_position()` and `logp_and_gradient_full()` on the complete parameter vector. The sampler wraps the model's `project_position` and `project_momentum` methods into lambda callbacks passed to `nuts_step()`.
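A sketch of the lambda-wrapping on the constrained path. The `StubModel` type, its non-negativity constraint, and the callback signature are all assumptions for illustration, not the codebase's actual types:

```cpp
#include <functional>
#include <vector>

// Stub model with a projection method, standing in for the real model's
// project_position. The constraint enforced here is illustrative.
struct StubModel {
    void project_position(std::vector<double>& q) const {
        for (double& x : q)
            if (x < 0.0) x = 0.0;   // clamp to the feasible region
    }
};

using ProjectionFn = std::function<void(std::vector<double>&)>;

// Wrap the member function into a type-erased callback, the way a sampler
// can hand projections to the algorithm layer without exposing the model.
inline ProjectionFn make_position_projection(const StubModel& model) {
    return [&model](std::vector<double>& q) { model.project_position(q); };
}
```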
The RATTLE algorithm itself is documented in Constrained Leapfrog (RATTLE). The model-side projection functions are described in Model Classes — Constraint projection.