Experiment No. 10 ML
Objective:
To implement and compare SVM, Naïve Bayes, and Decision Tree classifiers.
To evaluate their performance using accuracy, precision, recall, and F1-score.
To visualize results using confusion matrices.
Theory:
Support Vector Machine (SVM): a supervised learning algorithm that finds the optimal decision boundary (maximum-margin hyperplane) separating the classes. It uses kernels such as linear, polynomial, and RBF so that data that is not linearly separable can still be classified.
Naïve Bayes: a probabilistic classifier based on Bayes' theorem, with the simplifying assumption that features are conditionally independent given the class. GaussianNB further assumes each feature is normally distributed within a class.
Decision Tree: a classifier that recursively splits the feature space on the attribute that best separates the classes (e.g., by Gini impurity or entropy), yielding an interpretable set of if/else rules.
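As a quick illustration (not part of the original steps), kernel choice in scikit-learn's SVC is a single constructor argument; the degree and gamma values below are arbitrary assumptions:
from sklearn.svm import SVC

# Each kernel shapes the decision boundary differently
svm_linear = SVC(kernel='linear')           # hyperplane boundary
svm_poly = SVC(kernel='poly', degree=3)     # polynomial boundary (degree assumed)
svm_rbf = SVC(kernel='rbf', gamma='scale')  # radial basis function (default gamma)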
Libraries Required:
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, classification_report, confusion_matrix
from sklearn.datasets import load_wine # Example dataset
Implementation Steps:
# Load a sample dataset (Wine dataset)
data = load_wine()
X, y = data.data, data.target
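The training steps below reference X_train and y_train, but the original sequence never creates them. A minimal sketch of the missing split-and-scale step, assuming an 80/20 split with random_state=42 (both arbitrary choices) and using the StandardScaler already imported:
# Split into training and test sets (80/20 split is an assumed choice)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Standardize features; SVM in particular is sensitive to feature scale
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)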
# Train SVM model
svm_model = SVC(kernel='linear', random_state=42)
svm_model.fit(X_train, y_train)
# Train Naïve Bayes model
nb_model = GaussianNB()
nb_model.fit(X_train, y_train)
# Train Decision Tree model
dt_model = DecisionTreeClassifier(random_state=42)
dt_model.fit(X_train, y_train)
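The objective calls for accuracy, precision, recall, and F1-score, yet the steps above stop after training. A minimal evaluation sketch using the metrics already imported (the models dictionary is a name introduced here for convenience):
# Collect the trained classifiers so they can be evaluated uniformly
models = {'SVM': svm_model, 'Naive Bayes': nb_model, 'Decision Tree': dt_model}

for name, model in models.items():
    y_pred = model.predict(X_test)
    print(f"--- {name} ---")
    print("Accuracy:", accuracy_score(y_test, y_pred))
    # classification_report prints per-class precision, recall, and F1-score
    print(classification_report(y_test, y_pred, target_names=data.target_names))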
# Plot confusion matrices for the three classifiers side by side
fig, axes = plt.subplots(1, 3, figsize=(18, 5))
for ax, (name, model) in zip(axes, models.items()):
    sns.heatmap(confusion_matrix(y_test, model.predict(X_test)),
                annot=True, fmt='d', ax=ax)
    ax.set_title(name)
plt.show()
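In each heatmap, diagonal cells count correct predictions and off-diagonal cells count misclassifications, so placing the three matrices side by side makes it easy to compare where each classifier goes wrong.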