Naive Bayes
Naive Bayes models assume that, given class membership, observations follow some multivariate distribution in which the predictors (or features) composing each observation are conditionally independent of one another. This framework can accommodate a complete feature set such that an observation is a set of multinomial counts.
To train a naive Bayes model, use fitcnb at the command line. After training, predict labels or estimate posterior probabilities by passing the trained model and new predictor data to predict.
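As a minimal sketch of this workflow, the following trains a naive Bayes classifier on Fisher's iris data (a sample data set shipped with Statistics and Machine Learning Toolbox) and then predicts labels and class posterior probabilities; the variable names are illustrative.

```matlab
% Load Fisher's iris data: meas holds predictors, species holds class labels.
load fisheriris

% Train a naive Bayes classifier (Gaussian predictor distributions by default).
Mdl = fitcnb(meas, species);

% Predict labels and estimate class posterior probabilities for new data;
% here we reuse the first five training observations for illustration.
[labels, posteriors] = predict(Mdl, meas(1:5,:));
```

Each row of `posteriors` contains one posterior probability per class, summing to 1 across that row.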
Apps
| App | Description |
| --- | --- |
| Classification Learner | Train models to classify data using supervised machine learning |
Blocks
| Block | Description |
| --- | --- |
| ClassificationNaiveBayes Predict | Classify observations using a naive Bayes model (Since R2023b) |
Functions
Objects
| Object | Description |
| --- | --- |
| ClassificationPartitionedModel | Cross-validated classification model |
Classes
| Class | Description |
| --- | --- |
| ClassificationNaiveBayes | Naive Bayes classifier for multiclass classification |
| CompactClassificationNaiveBayes | Compact naive Bayes classifier for multiclass classification |
Topics
- Supervised Learning Workflow and Algorithms: Understand the steps for supervised learning and the characteristics of nonparametric classification and regression functions.
- Parametric Classification: Learn about parametric classification methods.
- Naive Bayes Classification: The naive Bayes classifier is designed for use when predictors are independent of one another within each class, but it appears to work well in practice even when that independence assumption is not valid.
- Plot Posterior Classification Probabilities: This example shows how to visualize classification probabilities for the naive Bayes classification algorithm.
- Classification: This example shows how to perform classification using discriminant analysis, naive Bayes classifiers, and decision trees.
- Visualize Decision Surfaces of Different Classifiers: This example shows how to visualize the decision surface for different classification algorithms.