
bnlearn - an R package for Bayesian network learning and inference

bnlearn is an R package for learning the graphical structure of Bayesian
networks, estimating their parameters and performing some useful inference.
First released in 2007, it has been under continuous development for more than
10 years (and is still going strong). To get started, install the latest
development snapshot by typing

install.packages("https://www.bnlearn.com/releases/bnlearn_latest.tar.gz", repos = NULL, type = "source")


in your R console. (More detailed installation instructions below.)

Downloads:

 * current release on CRAN: 4.7 [ link ]
 * latest snapshot + bugfixes: 4.8-20211225 [ link ]
bnlearn implements the following constraint-based structure learning algorithms:

 * PC (the stable version);
 * Grow-Shrink (GS);
 * Incremental Association Markov Blanket (IAMB);
 * Fast Incremental Association (Fast-IAMB);
 * Interleaved Incremental Association (Inter-IAMB);
 * Incremental Association with FDR Correction (IAMB-FDR);
 * Max-Min Parents & Children (MMPC);
 * Semi-Interleaved Hiton-PC (SI-HITON-PC);
 * Hybrid Parents & Children (HPC);

the following score-based structure learning algorithms:

 * Hill Climbing (HC);
 * Tabu Search (Tabu);

the following hybrid structure learning algorithms:

 * Max-Min Hill Climbing (MMHC);
 * Hybrid HPC (H2PC);
 * General 2-Phase Restricted Maximization (RSMAX2);

the following local discovery algorithms:

 * Chow-Liu;
 * ARACNE;

and the following Bayesian network classifiers:

 * naive Bayes;
 * Tree-Augmented naive Bayes (TAN).
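
For illustration, the learners above can all be called directly on a data
frame; a minimal sketch, using the discrete learning.test data set shipped
with the package:

library(bnlearn)
data(learning.test)

dag.hc <- hc(learning.test)          # score-based: hill climbing
dag.pc <- pc.stable(learning.test)   # constraint-based: stable PC
dag.mmhc <- mmhc(learning.test)      # hybrid: max-min hill climbing

modelstring(dag.hc)                  # the learned structure as a model string
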

Discrete (multinomial) and continuous (multivariate normal) data sets are
supported, both for structure and parameter learning. The latter can be
performed using either maximum likelihood or Bayesian estimators.
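
For example, parameters can be fitted on a learned structure with bn.fit(); a
minimal sketch, again using the bundled learning.test data:

library(bnlearn)
data(learning.test)

dag <- hc(learning.test)
fitted.mle <- bn.fit(dag, learning.test, method = "mle")
fitted.bayes <- bn.fit(dag, learning.test, method = "bayes", iss = 10)
# iss is the imaginary sample size of the Bayesian (posterior) estimator.
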
Each constraint-based algorithm can be used with several conditional
independence tests:

 * categorical data (multinomial distribution):
   * mutual information (parametric, semiparametric and permutation tests);
   * shrinkage-estimator for the mutual information;
   * Pearson's X2 (parametric, semiparametric and permutation tests);
 * ordinal data:
   * Jonckheere-Terpstra (parametric and permutation tests);
 * continuous data (multivariate normal distribution):
   * linear correlation (parametric, semiparametric and permutation tests);
   * Fisher's Z (parametric, semiparametric and permutation tests);
   * mutual information (parametric, semiparametric and permutation tests);
   * shrinkage-estimator for the mutual information;
 * mixed data (conditional Gaussian distribution):
   * mutual information (parametric, semiparametric);

and each score-based algorithm can be used with several score functions:

 * categorical data (multinomial distribution):
   * the multinomial log-likelihood;
   * the Akaike Information Criterion (AIC);
   * the Bayesian Information Criterion (BIC);
   * the multinomial predictive log-likelihood;
   * a score equivalent Dirichlet posterior density (BDe);
   * a sparse Dirichlet posterior density (BDs);
   * a Dirichlet posterior density based on Jeffrey's prior (BDJ);
   * a modified Bayesian Dirichlet for mixtures of interventional and
     observational data;
   * the locally averaged BDe score (BDla);
   * the K2 score;
   * the factorized normalized likelihood score (fNML);
   * the quotient normalized likelihood score (qNML);
 * continuous data (multivariate normal distribution):
   * the multivariate Gaussian log-likelihood;
   * the corresponding Akaike Information Criterion (AIC);
   * the corresponding Bayesian Information Criterion (BIC);
   * the corresponding predictive log-likelihood;
   * a score equivalent Gaussian posterior density (BGe);
 * mixed data (conditional Gaussian distribution):
   * the conditional Gaussian log-likelihood;
   * the corresponding Akaike Information Criterion (AIC);
   * the corresponding Bayesian Information Criterion (BIC);
   * the corresponding predictive log-likelihood.
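
Similarly, the network score is selected with the score argument of the
score-based learners, and score() evaluates a given network; for example:

library(bnlearn)
data(learning.test)

dag <- hc(learning.test, score = "bic")
dag.bde <- hc(learning.test, score = "bde", iss = 10)  # BDe, imaginary sample size 10
score(dag, learning.test, type = "bic")                # BIC of the learned network
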


PACKAGE INSTALLATION

bnlearn is available on CRAN and can be downloaded from its web page in the
Packages section (here). It can be installed with a simple:

install.packages("bnlearn")


Development snapshots, which include bugfixes that will be incorporated into
the next CRAN release as well as new features, can be downloaded from the
links above or installed with a simple:

install.packages("http://www.bnlearn.com/releases/bnlearn_latest.tar.gz", repos = NULL, type = "source")


The only suggested packages not hosted on CRAN are graph and Rgraphviz, which
can be installed from BioConductor:

if (!requireNamespace("BiocManager", quietly = TRUE))
  install.packages("BiocManager")
BiocManager::install()
BiocManager::install(c("graph", "Rgraphviz"))


following the instructions on the respective Bioconductor package pages.
Please also note that the gRain package, while on CRAN, depends on
Bioconductor packages both directly and through the gRbase package, which
depends on RBGL:

BiocManager::install()
BiocManager::install(c("graph", "Rgraphviz", "RBGL"))
install.packages("gRain")


© 2022 Marco Scutari. The contents of this page are licensed under the Creative
Commons Attribution-Share Alike License.