# MeritBadges/BasicClassifiers

## Introduction

Classifiers are a widely used branch of machine learning, with myriad practical applications.

## Subject Matter Expert

## Requirements

- Discuss classifiers, including their inputs and outputs
- Describe the strengths and weaknesses of classifiers
- Demonstrate an understanding of Naive Bayes classifiers
    - Describe the idea of conditional probability
    - Demonstrate the derivation of Bayes's Theorem
    - Explain how Bayes's Theorem is applied to create a Bayesian classifier
    - Demonstrate the creation of data structures appropriate for Naive Bayesian classification, given a small sample dataset
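The data structures in the last sub-requirement amount to count tables: per-class counts for the prior, and per-feature, per-class value counts for the likelihoods. A minimal sketch, using a made-up toy weather dataset (the dataset and all names here are illustrative, not part of the badge):

```python
from collections import Counter, defaultdict

# Hypothetical toy dataset (illustrative only): rows of
# (outlook, temperature, label).
data = [
    ("sunny",    "hot",  "no"),
    ("sunny",    "mild", "no"),
    ("sunny",    "cool", "no"),
    ("rain",     "hot",  "no"),
    ("rain",     "mild", "yes"),
    ("rain",     "cool", "yes"),
    ("overcast", "hot",  "yes"),
    ("overcast", "cool", "yes"),
]

# The core Naive Bayes data structures: per-class counts (for the
# prior) and per-feature, per-class value counts (for the likelihoods).
class_counts = Counter(row[-1] for row in data)
feature_counts = defaultdict(lambda: defaultdict(Counter))
vocab = defaultdict(set)  # all values seen for each feature index
for *features, label in data:
    for i, value in enumerate(features):
        feature_counts[i][label][value] += 1
        vocab[i].add(value)

def predict(features):
    """Return the label maximising P(label) * prod_i P(feature_i | label),
    with add-one (Laplace) smoothing so unseen values never zero out
    the product."""
    total = sum(class_counts.values())
    best_label, best_score = None, -1.0
    for label, n in class_counts.items():
        score = n / total  # the prior P(label)
        for i, value in enumerate(features):
            # smoothed likelihood P(feature_i = value | label)
            score *= (feature_counts[i][label][value] + 1) / (n + len(vocab[i]))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(predict(("sunny", "hot")))      # no
print(predict(("overcast", "cool")))  # yes
```

The "naive" assumption is visible in the inner loop: each feature's likelihood is multiplied in independently, conditioned only on the class.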

- Demonstrate an understanding of Support Vector Machines
    - Discuss the idea of higher-dimensional feature spaces
    - Discuss separability and the challenges it poses
    - Discuss separating planes, both verbally and graphically
    - Explain the idea and motivation of a maximum margin hyperplane
    - Discuss the kernel trick
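The feature-space and kernel-trick ideas can be shown with a small sketch: 1-D data that no single threshold can separate becomes linearly separable after mapping each point into a higher-dimensional space. This is an illustrative example, not a full SVM; the data and the threshold below are chosen by hand rather than found by maximum-margin optimization:

```python
# 1-D points: class +1 near the origin, class -1 farther out.
# No single threshold on x separates the two classes.
xs     = [-3.0, -2.0, -0.5, 0.0, 0.5, 2.0, 3.0]
labels = [  -1,   -1,    1,   1,   1,  -1,  -1]

def phi(x):
    """Explicit feature map into 2-D: phi(x) = (x, x^2).
    A kernel such as k(a, b) = (a*b + 1)**2 computes inner products in
    a space like this implicitly -- that is the 'kernel trick'."""
    return (x, x * x)

# In the lifted space, the horizontal line x2 = 2 (a hyperplane with
# normal (0, 1)) separates the classes perfectly.
def classify(x, threshold=2.0):
    _, x2 = phi(x)
    return 1 if x2 < threshold else -1

print(all(classify(x) == y for x, y in zip(xs, labels)))  # True
```

A real SVM would choose the separating hyperplane to maximize the margin to the nearest points of each class, instead of using a hand-picked threshold.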

- Demonstrate practical knowledge. The student will provide a training set and a test set. Then, using one of the techniques above, and training only on the training set, they must achieve at least 80% accuracy on the test set.
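One way the practical requirement might look in code: train a linear SVM on a training set, then measure accuracy on a held-out test set. This sketch uses the Pegasos sub-gradient method (one possible way to train a linear SVM) on made-up synthetic data; the dataset, split, and all parameters are illustrative:

```python
import random

random.seed(0)

# Hypothetical synthetic data (illustrative only): two well-separated
# 2-D blobs, labelled -1 and +1.
def blob(cx, cy, label, n):
    return [((cx + random.gauss(0, 0.5), cy + random.gauss(0, 0.5)), label)
            for _ in range(n)]

points = blob(-2.0, -2.0, -1, 40) + blob(2.0, 2.0, 1, 40)
random.shuffle(points)
train, test = points[:60], points[60:]   # train on 60, hold out 20

def train_svm(rows, lam=0.01, epochs=200):
    """Train a linear SVM with the Pegasos sub-gradient method."""
    w, b, t = [0.0, 0.0], 0.0, 0
    for _ in range(epochs):
        for (x1, x2), y in rows:
            t += 1
            eta = 1.0 / (lam * t)                    # decaying step size
            w = [wi * (1 - eta * lam) for wi in w]   # regularisation shrink
            if y * (w[0] * x1 + w[1] * x2 + b) < 1:  # margin violated
                w[0] += eta * y * x1
                w[1] += eta * y * x2
                b += eta * y
    return w, b

w, b = train_svm(train)
correct = sum(1 for (x1, x2), y in test
              if (1 if w[0] * x1 + w[1] * x2 + b > 0 else -1) == y)
accuracy = correct / len(test)
print(accuracy)  # the requirement asks for at least 0.8
```

The key discipline the requirement tests is in the split: the model never sees the test points during training, so the reported accuracy estimates performance on unseen data.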

## Resources

### Bayes Resources

- Andrew Moore's Probability for Data Miners slides: these are great. They will give you all the probability you need for Naive Bayes, and are very clear and self-contained.
- Andrew Moore's Naive Bayes slides: a fantastic set of slides, but still missing some details you'd get in the classroom.

### SVM Resources

- Kernel trick at Wikipedia
- SVM - Support Vector Machines: a good collection of graphs showing the separating-plane concept.
- Andrew Moore's SVM slides: again, Andrew Moore has excellent slides which provide an overview, but they're sometimes hard to follow.