5. K-Nearest Neighbors Classifiers 2025
The iris object that is returned by load_iris is a Bunch object, which is very similar
to a dictionary. It contains keys and values:
Keys of iris_dataset:
dict_keys(['target_names', 'feature_names', 'DESCR', 'data',
'target'])
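The steps above can be sketched as follows. This is a minimal example of loading the dataset and inspecting the Bunch object; the variable name `iris_dataset` follows the notes.

```python
from sklearn.datasets import load_iris

# load_iris returns a Bunch object, which behaves like a
# dictionary (and also allows attribute-style access)
iris_dataset = load_iris()

# List its keys, just as with a dictionary
print("Keys of iris_dataset:\n{}".format(iris_dataset.keys()))

# Bunch values can also be reached as attributes:
# iris_dataset.target_names is the same as iris_dataset['target_names']
print(iris_dataset.target_names)
```

Note that `iris_dataset['target_names']` and `iris_dataset.target_names` refer to the same array; the dictionary-style access is used throughout these notes.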
Pushparaj, Amrita Univ, Cbe
The value of the key DESCR is a short description of the dataset.
print(iris_dataset['DESCR'][:193] + "\n...")
The value of the key target_names is an array of strings, containing the species of flower.
print("Target names: {}".format(iris_dataset['target_names']))
Target names: ['setosa' 'versicolor' 'virginica']
print("Feature names: \n{}".format(iris_dataset['feature_names']))
Feature names:
['sepal length (cm)', 'sepal width (cm)', 'petal length (cm)',
'petal width (cm)']
print("Type of data: {}".format(type(iris_dataset['data'])))
Type of data: <class 'numpy.ndarray'>
print("Shape of data: {}".format(iris_dataset['data'].shape))
Shape of data: (150, 4)
print("First five rows of data:\n{}".format(iris_dataset['data'][:5]))
First five rows of data:
[[ 5.1  3.5  1.4  0.2]
 [ 4.9  3.   1.4  0.2]
 [ 4.7  3.2  1.3  0.2]
 [ 4.6  3.1  1.5  0.2]
 [ 5.   3.6  1.4  0.2]]
print("Target:\n{}".format(iris_dataset['target']))
Target:
[0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 2 2 2 2 2 2 2 2 2 2
 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
 2 2]
To build the model on the training set, we call the fit method of the knn object, which takes as arguments the NumPy array X_train containing the training data and the NumPy array y_train of the corresponding training labels.
knn.fit(X_train, y_train)
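Putting the pieces together, a self-contained sketch of building the classifier looks like the following. The train/test split and `n_neighbors=1` are the conventional choices for this example; the `random_state=0` seed is an assumption made here for reproducibility.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

iris_dataset = load_iris()

# Split the 150 samples into a training set and a test set
# (by default, 75% training / 25% test; seed is an assumption)
X_train, X_test, y_train, y_test = train_test_split(
    iris_dataset['data'], iris_dataset['target'], random_state=0)

# Instantiate the classifier; n_neighbors=1 means each prediction
# uses only the single closest training point
knn = KNeighborsClassifier(n_neighbors=1)

# Fit stores the training data so neighbors can be looked up later
knn.fit(X_train, y_train)
```

With the default split, X_train has shape (112, 4) and X_test has shape (38, 4).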
Making Predictions
Imagine we found an iris in the wild with a sepal length of 5 cm, a sepal width of 2.9
cm, a petal length of 1 cm, and a petal width of 0.2 cm. What species of iris would
this be?
X_new = np.array([[5, 2.9, 1, 0.2]])
print("X_new.shape: {}".format(X_new.shape))
X_new.shape: (1, 4)
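To make a prediction for this new flower, we call the predict method of the fitted knn object. A self-contained sketch (repeating the loading and fitting steps so it runs on its own; the `random_state=0` seed is an assumption):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

iris_dataset = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris_dataset['data'], iris_dataset['target'], random_state=0)
knn = KNeighborsClassifier(n_neighbors=1)
knn.fit(X_train, y_train)

# The new measurement must be a 2D array: one row per sample,
# one column per feature
X_new = np.array([[5, 2.9, 1, 0.2]])

# predict returns the class label of the nearest training point
prediction = knn.predict(X_new)
print("Prediction: {}".format(prediction))
print("Predicted target name: {}".format(
    iris_dataset['target_names'][prediction]))
```

Because the petal is so small (length 1 cm, width 0.2 cm), the nearest neighbor is a setosa, so the model predicts class 0 (setosa).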