
The bgm function estimates the pseudoposterior distribution of the parameters of a Markov Random Field (MRF) for binary, ordinal, continuous, or mixed (discrete and continuous) variables. Depending on the variable types, the model is an ordinal MRF, a Gaussian graphical model (GGM), or a mixed MRF. Optionally, it performs Bayesian edge selection using spike-and-slab priors to infer the network structure.

Usage

bgm(
  x,
  variable_type = "ordinal",
  baseline_category,
  iter = 2000,
  warmup = 2000,
  interaction_prior = cauchy_prior(scale = 1),
  threshold_prior = beta_prime_prior(alpha = 0.5, beta = 0.5),
  means_prior = normal_prior(scale = 1),
  precision_scale_prior = gamma_prior(shape = 1, rate = 1),
  edge_selection = TRUE,
  edge_prior = bernoulli_prior(0.5),
  na_action = c("listwise", "impute"),
  update_method = c("nuts", "adaptive-metropolis"),
  target_accept,
  nuts_max_depth = 10,
  learn_mass_matrix = TRUE,
  chains = 4,
  cores = parallel::detectCores(),
  display_progress = c("per-chain", "total", "none"),
  seed = NULL,
  standardize = FALSE,
  verbose = getOption("bgms.verbose", TRUE),
  progress_callback = NULL,
  pairwise_scale,
  main_alpha,
  main_beta,
  inclusion_probability,
  beta_bernoulli_alpha,
  beta_bernoulli_beta,
  beta_bernoulli_alpha_between,
  beta_bernoulli_beta_between,
  dirichlet_alpha,
  lambda,
  interaction_scale,
  burnin,
  save,
  threshold_alpha,
  threshold_beta
)

Arguments

x

A data frame or matrix with n rows and p columns. Columns may contain binary, ordinal, or continuous variables (see variable_type). Discrete variables are automatically recoded to non-negative integers (0, 1, ..., m); for regular ordinal variables, unobserved categories are collapsed, while Blume–Capel variables retain all categories. Continuous variables are column-centered internally so that the GGM likelihood is formulated with a zero-mean assumption.

variable_type

Character or character vector. Specifies the type of each variable in x. Allowed values: "ordinal", "blume-capel", or "continuous". A single string applies to all variables. A per-variable vector that mixes discrete ("ordinal" / "blume-capel") and "continuous" types fits a mixed MRF. Binary variables are automatically treated as "ordinal". Default: "ordinal".

baseline_category

Integer or vector. Baseline category used in Blume–Capel variables. Can be a single integer (applied to all) or a vector of length p. Required if at least one variable is of type "blume-capel".
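For example, a mixed MRF might be specified as follows (a sketch only; the data frame dat and its column make-up are hypothetical):

```r
library(bgms)

# dat: three columns, in order an ordinal item, a Blume-Capel item
# scored 0..4, and a continuous measurement
fit <- bgm(
  x = dat,
  variable_type = c("ordinal", "blume-capel", "continuous"),
  baseline_category = 2  # baseline for the Blume-Capel variable
)
```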

iter

Integer. Number of post-warmup sampling iterations per chain. Default: 2000.

warmup

Integer. Number of warmup iterations before collecting samples. Short warmups trigger progressive warnings (NUTS only); see validate_sampler() for the thresholds. Default: 2000.

interaction_prior

A prior specification object for pairwise interaction parameters, created by one of the prior constructor functions (e.g., cauchy_prior()).

Default: cauchy_prior(scale = 1).

threshold_prior

A prior specification object for threshold (main effect) parameters, created by one of the prior constructor functions (e.g., beta_prime_prior()).

Default: beta_prime_prior(alpha = 0.5, beta = 0.5).

means_prior

A prior specification object for continuous variable means (mixed MRF models only), created by one of the prior constructor functions (e.g., normal_prior()).

Only used when the model includes continuous variables. Ignored for pure ordinal or pure continuous (GGM) models. Default: normal_prior(scale = 1).

precision_scale_prior

A prior specification object for the diagonal elements of the precision matrix, created by a prior constructor function such as gamma_prior().

Only used for models with continuous variables (GGM and mixed MRF). Ignored for pure ordinal models. Default: gamma_prior(shape = 1, rate = 1).

edge_selection

Logical. Whether to perform Bayesian edge selection. If FALSE, the model estimates all edges. Default: TRUE.

edge_prior

An edge prior specification object, or a character string (deprecated). Specifies the prior for edge inclusion. Preferred: pass an object created by one of bernoulli_prior(), beta_bernoulli_prior(), or sbm_prior().

Legacy character strings "Bernoulli", "Beta-Bernoulli", and "Stochastic-Block" are still accepted but deprecated. Default: bernoulli_prior(0.5).
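The three constructor-based specifications can be sketched as follows (the data frame dat is hypothetical; the constructor argument names follow the deprecated-argument mappings below):

```r
# Fixed prior inclusion probability of 0.25 per edge
fit_b   <- bgm(x = dat, edge_prior = bernoulli_prior(0.25))

# Beta-Bernoulli prior on the overall edge inclusion probability
fit_bb  <- bgm(x = dat, edge_prior = beta_bernoulli_prior(alpha = 1, beta = 1))

# Stochastic block prior for clustered network structure
fit_sbm <- bgm(x = dat, edge_prior = sbm_prior(dirichlet_alpha = 1, lambda = 1))
```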

na_action

Character. Specifies missing data handling. Either "listwise" (drop rows with missing values) or "impute" (perform single imputation during sampling). Default: "listwise".

update_method

Character. Specifies how the MCMC sampler updates the model parameters:

"adaptive-metropolis"

Componentwise adaptive Metropolis–Hastings with Robbins–Monro proposal adaptation.

"nuts"

The No-U-Turn Sampler; for Gaussian models with edge selection it uses RATTLE constrained integration.

Default: "nuts".

target_accept

Numeric between 0 and 1. Target acceptance rate for the sampler. Defaults are set automatically if not supplied: 0.44 for adaptive Metropolis and 0.80 for NUTS.
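For instance (a sketch assuming a hypothetical data set dat):

```r
# Componentwise adaptive Metropolis with its default target rate made explicit
fit_mh <- bgm(x = dat, update_method = "adaptive-metropolis",
              target_accept = 0.44)

# NUTS with a stricter target acceptance rate, which can help when
# divergences are reported in fit$nuts_diag
fit_nuts <- bgm(x = dat, update_method = "nuts", target_accept = 0.9)
```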

nuts_max_depth

Integer. Maximum tree depth in NUTS. Must be positive. Default: 10.

learn_mass_matrix

Logical. If TRUE, adapt a diagonal mass matrix during warmup (NUTS only). If FALSE, use the identity matrix. Default: TRUE.

chains

Integer. Number of parallel chains to run. Default: 4.

cores

Integer. Number of CPU cores for parallel execution. Default: parallel::detectCores().

display_progress

Character. Controls progress reporting during sampling. Options: "per-chain" (separate bar per chain), "total" (single combined bar), or "none" (no progress). Default: "per-chain".

seed

Optional integer. Random seed for reproducibility. Must be a single non-negative integer.

standardize

Logical. If TRUE, the prior scale for each pairwise interaction is adjusted based on the range of response scores. Variables with more response categories have larger score products \(x_i \cdot x_j\), which typically correspond to smaller interaction effects \(\sigma_{ij}\). Without standardization, a fixed prior scale is relatively wide for these smaller effects, resulting in less shrinkage for high-category pairs and more shrinkage for low-category pairs. Standardization scales the prior proportionally to the maximum score product, ensuring equivalent relative shrinkage across all pairs.

After internal recoding, regular ordinal variables have scores \(0, 1, \ldots, m\). The adjusted scale for the interaction between variables \(i\) and \(j\) is pairwise_scale * m_i * m_j, so that pairwise_scale itself applies to the unit-interval case (binary variables, where \(m_i = m_j = 1\)). For Blume–Capel variables with baseline category \(b\), scores are centered as \(-b, \ldots, m-b\), and the adjustment uses the maximum absolute product of the score endpoints. For mixed pairs, ordinal variables use raw score endpoints \((0, m)\) and Blume–Capel variables use centered score endpoints \((-b, m-b)\). Default: FALSE.
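The scale adjustment can be sketched in a few lines of base R (this mirrors the description above, not the package internals; the helper names are hypothetical):

```r
endpoints <- function(m, b = NULL) {
  # Ordinal scores run 0..m; Blume-Capel scores are centered as -b..(m - b)
  if (is.null(b)) c(0, m) else c(-b, m - b)
}

adjusted_scale <- function(pairwise_scale, ep_i, ep_j) {
  # The prior scale grows with the maximum absolute product of the
  # score endpoints of the two variables
  pairwise_scale * max(abs(outer(ep_i, ep_j)))
}

# Binary pair (m_i = m_j = 1): pairwise_scale applies unchanged
adjusted_scale(2.5, endpoints(1), endpoints(1))        # 2.5

# Ordinal pair with m_i = 4 and m_j = 2: scale = 2.5 * 4 * 2
adjusted_scale(2.5, endpoints(4), endpoints(2))        # 20

# Ordinal (m = 4) with Blume-Capel (m = 4, baseline b = 2, scores -2..2)
adjusted_scale(2.5, endpoints(4), endpoints(4, b = 2)) # 20
```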

verbose

Logical. If TRUE, prints informational messages during data processing (e.g., missing data handling, variable recoding). Defaults to getOption("bgms.verbose", TRUE). Set options(bgms.verbose = FALSE) to suppress messages globally.

progress_callback

An optional R function with signature function(completed, total) that is called at regular intervals during sampling, where completed is the number of iterations completed across all chains and total is the total number of iterations. Useful for external front-ends (e.g., JASP) that supply their own progress reporting. When NULL (the default), no callback is invoked.
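As an illustration, a minimal callback compatible with the documented signature (the function name and the data frame dat are hypothetical):

```r
# Print a running percentage; bgm() invokes this at regular intervals
# with the iterations completed across all chains and the total
report_progress <- function(completed, total) {
  cat(sprintf("\r%5.1f%% of iterations completed", 100 * completed / total))
}

fit <- bgm(
  x = dat,
  progress_callback = report_progress,
  display_progress = "none"  # let the callback handle all reporting
)
```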

pairwise_scale

[Deprecated] Double. Scale of the Cauchy prior for pairwise interaction parameters. Use interaction_prior instead. Default: 1.

main_alpha, main_beta

[Deprecated] Double. Shape parameters of the beta-prime prior for threshold parameters. Use threshold_prior instead. Defaults: main_alpha = 0.5 and main_beta = 0.5.

inclusion_probability

[Deprecated] Numeric scalar. Use edge_prior = bernoulli_prior(inclusion_probability) instead. Default: 0.5.

beta_bernoulli_alpha, beta_bernoulli_beta

[Deprecated] Double. Use edge_prior = beta_bernoulli_prior(alpha, beta) instead. Defaults: 1.

beta_bernoulli_alpha_between, beta_bernoulli_beta_between

[Deprecated] Double. Use edge_prior = sbm_prior(alpha_between, beta_between) instead. Defaults: 1.

dirichlet_alpha

[Deprecated] Double. Use edge_prior = sbm_prior(dirichlet_alpha = ...) instead. Default: 1.

lambda

[Deprecated] Double. Use edge_prior = sbm_prior(lambda = ...) instead. Default: 1.

interaction_scale, burnin, save, threshold_alpha, threshold_beta

[Deprecated] Arguments deprecated as of bgms 0.1.6.0. Use pairwise_scale (for interaction_scale), warmup (for burnin), main_alpha (for threshold_alpha), and main_beta (for threshold_beta) instead.

Value

A list of class "bgms" with posterior summaries, posterior mean matrices, and access to raw MCMC draws. The object can be passed to print(), summary(), and coef().

Main components include:

  • posterior_summary_main: Data frame with posterior summaries (mean, sd, MCSE, ESS, Rhat) for main-effect parameters. For OMRF models these are category thresholds; for mixed MRF models these are discrete thresholds and continuous means. NULL for GGM models (no main effects).

  • posterior_summary_quadratic: Data frame with posterior summaries for the residual variance parameters (GGM and mixed MRF). NULL for OMRF models.

  • posterior_summary_pairwise: Data frame with posterior summaries for partial association parameters.

  • posterior_summary_indicator: Data frame with posterior summaries for edge inclusion indicators (if edge_selection = TRUE).

  • posterior_mean_main: Posterior mean of main-effect parameters. NULL for GGM models. For OMRF: a matrix (p x max_categories) of category thresholds. For mixed MRF: a list with $discrete (threshold matrix) and $continuous (q x 1 matrix of means).

  • posterior_mean_pairwise: Symmetric matrix of posterior mean partial associations (zero diagonal). For continuous variables these are unstandardized partial correlations; for discrete variables these are half the log adjacent-category odds ratio. Use extract_precision(), extract_partial_correlations(), or extract_log_odds() to convert to interpretable scales.

  • posterior_mean_residual_variance: Named numeric vector of posterior mean residual variances \(1/\Theta_{ii}\). Present for GGM and mixed MRF models; NULL for OMRF.

  • posterior_mean_indicator: Symmetric matrix of posterior mean inclusion probabilities (if edge selection was enabled).

  • Additional summaries returned when the stochastic block prior is used (edge_prior = sbm_prior(...) or the legacy "Stochastic-Block" string). For more details about this prior, see Sekulovski et al. (2025).

    • posterior_summary_pairwise_allocations: Data frame with posterior summaries (mean, sd, MCSE, ESS, Rhat) for the pairwise cluster co-occurrence of the nodes. This indicates whether the estimated posterior allocations, co-clustering matrix, and posterior cluster probabilities (see below) have converged.

    • posterior_coclustering_matrix: A symmetric matrix giving, for each pair of variables, the proportion of posterior draws in which they were assigned to the same cluster. This matrix can be plotted to inspect the estimated number of clusters and to spot nodes that tend to switch clusters.

    • posterior_mean_allocations: A vector with the posterior mean of the cluster allocations of the nodes, calculated using the method proposed in Dahl (2009).

    • posterior_mode_allocations: A vector with the posterior mode of the cluster allocations of the nodes.

    • posterior_num_blocks: A data frame with the estimated posterior probabilities for each possible number of clusters.

  • raw_samples: A list of raw MCMC draws per chain:

    main

    List of main effect samples.

    pairwise

    List of pairwise effect samples.

    indicator

    List of indicator samples (if edge selection enabled).

    allocations

    List of cluster allocations (if SBM prior used).

    nchains

    Number of chains.

    niter

    Number of post–warmup iterations per chain.

    parameter_names

    Named lists of parameter labels.

  • arguments: A list of function call arguments and metadata (e.g., number of variables, warmup, sampler settings, package version).

The summary() method prints formatted posterior summaries, and coef() extracts posterior mean matrices.

NUTS diagnostics (tree depth, divergences, energy, E-BFMI) are included in fit$nuts_diag if update_method = "nuts".
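A brief sketch of working with a fitted object, assuming fit is the result of a bgm() call (accessor and extractor names as documented above):

```r
# Formatted posterior summaries and posterior mean matrices
summary(fit)
coef(fit)

# Raw per-chain draws, e.g. pairwise-effect samples from chain 1
str(fit$raw_samples$pairwise[[1]])

# NUTS diagnostics, available when update_method = "nuts"
fit$nuts_diag

# For models with continuous variables, convert pairwise effects to
# interpretable scales:
# extract_precision(fit)
# extract_partial_correlations(fit)
# extract_log_odds(fit)
```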

Details

Depending on the variable types, the model is an ordinal MRF, a Gaussian graphical model (GGM), or a mixed MRF. Both regular ordinal variables and Blume–Capel ordinal variables (with a baseline category) are supported.

Edge selection uses spike-and-slab priors with Bernoulli, Beta-Bernoulli, or Stochastic-Block inclusion priors. Parameters are sampled with NUTS (default) or adaptive Metropolis–Hastings, with a multi-stage warmup schedule. Missing data can be handled via listwise deletion or Gibbs imputation.

For full details on model specification, prior choices, warmup, and output interpretation, see the package website at https://bayesian-graphical-modelling-lab.github.io/bgms-docs/.

References

Dahl DB (2009). “Modal clustering in a class of product partition models.” Bayesian Analysis, 4(2), 243–264. doi:10.1214/09-BA409.

Sekulovski N, Arena G, Haslbeck JMB, Huth KBS, Friel N, Marsman M (2025). “A Stochastic Block Prior for Clustering in Graphical Models.” OSF preprint. Retrieved from https://osf.io/preprints/psyarxiv/29p3m_v1.

See also

vignette("intro", package = "bgms") for a worked example.

Other model-fitting: bgmCompare()

Examples

# \donttest{
# Run bgm on subset of the Wenchuan dataset
fit <- bgm(x = Wenchuan[, 1:5], chains = 2)
#> 7 rows with missing values excluded (n = 355 remaining).
#> To impute missing values instead, use na_action = "impute".
#> Chain 1 (Warmup): ⦗╺━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━⦘ 50/4000 (1.2%)
#> Chain 2 (Warmup): ⦗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━⦘ 56/4000 (1.4%)
#> Total   (Warmup): ⦗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━⦘ 106/8000 (1.3%)
#> Elapsed: 1s | ETA: 1m 14s
#> [... intermediate warmup and sampling progress updates omitted ...]
#> Chain 1 (Sampling): ⦗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━⦘ 4000/4000 (100.0%)
#> Chain 2 (Sampling): ⦗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━⦘ 4000/4000 (100.0%)
#> Total   (Sampling): ⦗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━⦘ 8000/8000 (100.0%)
#> Elapsed: 14s | ETA: 0s

# Posterior inclusion probabilities
summary(fit)$indicator
#>                      mean       mcse        sd n0->0 n0->1 n1->0 n1->1
#> intrusion-dreams  1.00000         NA 0.0000000     0     0     0  3999
#> intrusion-flash   1.00000         NA 0.0000000     0     0     0  3999
#> intrusion-upset   0.97150 0.01731244 0.1663964   109     5     5  3880
#> intrusion-physior 0.95400 0.03085320 0.2094851   180     4     4  3811
#> dreams-flash      1.00000         NA 0.0000000     0     0     0  3999
#> dreams-upset      0.98675 0.01294881 0.1143435    51     2     2  3944
#> dreams-physior    0.10675 0.01613358 0.3087951  3508    64    64   363
#> flash-upset       0.07625 0.01330206 0.2653977  3643    51    51   254
#> flash-physior     1.00000         NA 0.0000000     0     0     0  3999
#> upset-physior     1.00000         NA 0.0000000     0     0     0  3999
#>                   n_eff_mixt     Rhat
#> intrusion-dreams          NA       NA
#> intrusion-flash           NA       NA
#> intrusion-upset     92.37859 1.320139
#> intrusion-physior   46.10051 1.011732
#> dreams-flash              NA       NA
#> dreams-upset        77.97640 1.303399
#> dreams-physior     366.33564 1.009784
#> flash-upset        398.06752 1.000224
#> flash-physior             NA       NA
#> upset-physior             NA       NA

# Posterior pairwise effects
summary(fit)$pairwise
#>                          mean         mcse         sd     n_eff
#> intrusion-dreams  0.314520792 0.0004834441 0.03270320 4576.0210
#> intrusion-flash   0.168901015 0.0004961538 0.03100379 3904.7830
#> intrusion-upset   0.100397176 0.0020365486 0.03305215  904.7888
#> intrusion-physior 0.096206948 0.0031767595 0.03381719  146.3007
#> dreams-flash      0.249475681 0.0004395738 0.02906248 4371.2096
#> dreams-upset      0.111424355 0.0016126144 0.02952098 1984.7708
#> dreams-physior    0.005482598 0.0009176757 0.01614813  206.4917
#> flash-upset       0.003648929 0.0006440414 0.01286358  328.3563
#> flash-physior     0.153574629 0.0005263108 0.02647693 2530.7585
#> upset-physior     0.354630589 0.0004743545 0.02999977 3999.7220
#>                   n_eff_mixt      Rhat
#> intrusion-dreams          NA 1.0008108
#> intrusion-flash           NA 1.0002999
#> intrusion-upset     263.3965 1.0428876
#> intrusion-physior   113.3199 1.0051827
#> dreams-flash              NA 1.0001651
#> dreams-upset        335.1202 1.0253329
#> dreams-physior      309.6463 1.0385755
#> flash-upset         398.9294 0.9998464
#> flash-physior             NA 1.0003318
#> upset-physior             NA 0.9999041
# }