This document describes building and evaluating a neural network model for breast cancer classification. It loads breast cancer data, splits it into training and test sets, scales the features, trains a multi-layer perceptron classifier on the training set, makes predictions on the test set, and evaluates the model's performance using classification metrics. Key steps include preprocessing the data, training the neural network model, making predictions, and calculating accuracy scores to evaluate the model.
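For reference, the same workflow can be condensed into a few lines with a scikit-learn Pipeline, so that the scaler is fit only on the training split. This is a sketch under mild assumptions: the random_state value and the use of make_pipeline are additions for illustration, not part of the notebook below.

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier

# Load the 569-sample, 30-feature dataset and hold out a test set
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)  # random_state assumed

# The pipeline standardizes with training statistics only, then trains the MLP
pipe = make_pipeline(StandardScaler(),
                     MLPClassifier(activation='logistic',
                                   hidden_layer_sizes=(30, 30, 30),
                                   learning_rate_init=0.01))
pipe.fit(X_train, y_train)
print(pipe.score(X_test, y_test))  # mean accuracy on the held-out test set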

Neural Networks

Data

In [1]: from sklearn.datasets import load_breast_cancer


cancer = load_breast_cancer()

In [2]: cancer.keys()

Out[2]: dict_keys(['data', 'target', 'target_names', 'DESCR', 'feature_names'])

In [3]: # Print full description by running:


# print(cancer['DESCR'])
# 569 data points with 30 features
cancer['data'].shape

Out[3]: (569, 30)

Train/Test Split

In [4]: X = cancer['data']
y = cancer['target']

In [5]: from sklearn.model_selection import train_test_split


X_train, X_test, y_train, y_test = train_test_split(X, y)
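By default train_test_split shuffles the rows and holds out 25% of them, so the exact split (and therefore the scores below) varies from run to run. A small sketch that makes the split reproducible and class-balanced; the random_state and stratify values are assumptions added here, not settings from the cell above:

# Reproducible, stratified 75/25 split (parameter values are illustrative)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)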

Data Preprocessing

In [6]: from sklearn.preprocessing import StandardScaler


scaler = StandardScaler()
# Fit only to the training data
scaler.fit(X_train)

Out[6]: StandardScaler(copy=True, with_mean=True, with_std=True)

In [7]: # Now apply the transformations to the data:


X_train = scaler.transform(X_train)
X_test = scaler.transform(X_test)
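Because the scaler was fit on the training data only, the transformed training columns have zero mean and unit variance exactly, while the test columns are only approximately standardized. A quick sanity check (the numpy import is added for illustration):

import numpy as np

print(np.allclose(X_train.mean(axis=0), 0.0))  # True: training stats define the scaling
print(np.allclose(X_train.std(axis=0), 1.0))   # True
print(X_test.mean(axis=0)[:3])                 # small but generally non-zero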

Training the model


In [8]: from sklearn.neural_network import MLPClassifier

In [17]: mlp = MLPClassifier(activation='logistic', hidden_layer_sizes=(30,30,30), learning_rate_init=0.01)

In [18]: mlp.fit(X_train,y_train)

Out[18]: MLPClassifier(activation='logistic', alpha=0.0001, batch_size='auto',


beta_1=0.9, beta_2=0.999, early_stopping=False, epsilon=1e-08,
hidden_layer_sizes=(30, 30, 30), learning_rate='constant',
learning_rate_init=0.01, max_iter=200, momentum=0.9,
nesterovs_momentum=True, power_t=0.5, random_state=None,
shuffle=True, solver='adam', tol=0.0001, validation_fraction=0.1,
verbose=False, warm_start=False)
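max_iter was left at its default of 200 and no convergence warning appears in the output, so the adam solver presumably stopped on the tol criterion. The fitted estimator records how long training ran and the final training loss, which is worth checking after fit:

print(mlp.n_iter_)  # number of iterations the solver actually ran
print(mlp.loss_)    # final training log-loss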

Prediction and Evaluation

In [19]: predictions = mlp.predict(X_test)

In [20]: from sklearn.metrics import classification_report,confusion_matrix


print(confusion_matrix(y_test,predictions))

[[57 2]
[ 1 83]]
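Reading the matrix: 57 true negatives, 2 false positives, 1 false negative and 83 true positives, so the test accuracy is (57 + 83) / 143 ≈ 0.979. The same figure can be computed directly:

from sklearn.metrics import accuracy_score

print(accuracy_score(y_test, predictions))  # ≈ 0.979 for this particular split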

In [21]: print(classification_report(y_test,predictions))

             precision    recall  f1-score   support

          0       0.98      0.97      0.97        59
          1       0.98      0.99      0.98        84

avg / total       0.98      0.98      0.98       143
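The per-class numbers follow from the confusion matrix: for class 1, precision = 83 / (83 + 2) ≈ 0.98 and recall = 83 / 84 ≈ 0.99; for class 0, precision = 57 / 58 ≈ 0.98 and recall = 57 / 59 ≈ 0.97. A short sketch deriving them from the matrix (the numpy import is added for illustration):

import numpy as np

cm = confusion_matrix(y_test, predictions)
precision = np.diag(cm) / cm.sum(axis=0)  # divide by predicted-class totals (columns)
recall = np.diag(cm) / cm.sum(axis=1)     # divide by true-class totals (rows)
print(precision, recall)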

In [27]: mlp.get_params

Out[27]: <bound method BaseEstimator.get_params of MLPClassifier(activation='logistic',


alpha=0.0001, batch_size='auto',
beta_1=0.9, beta_2=0.999, early_stopping=False, epsilon=1e-08,
hidden_layer_sizes=(30, 30, 30), learning_rate='constant',
learning_rate_init=0.01, max_iter=200, momentum=0.9,
nesterovs_momentum=True, power_t=0.5, random_state=None,
shuffle=True, solver='adam', tol=0.0001, validation_fraction=0.1,
verbose=False, warm_start=False)>
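Note that mlp.get_params (without parentheses) only returns the bound method, which is why the repr above is printed rather than the parameters themselves. Calling the method returns the hyperparameter dictionary:

params = mlp.get_params()            # dict mapping parameter name -> value
print(params['hidden_layer_sizes'])  # (30, 30, 30)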


In [34]: mlp.coefs_[3]

Out[34]: array([[ 0.41135806],


[-0.4673087 ],
[-0.4696764 ],
[-0.4225919 ],
[-0.57467449],
[ 0.30122197],
[ 0.32523784],
[-0.4452042 ],
[-0.21312412],
[-0.26147128],
[-0.23204318],
[-0.30328138],
[ 0.46206363],
[-0.29834238],
[ 0.42559201],
[ 0.23302195],
[ 0.36175589],
[-0.34698752],
[-0.55674996],
[ 0.34739028],
[ 0.26510333],
[-0.3867968 ],
[-0.15564411],
[-0.4628818 ],
[ 0.29297521],
[ 0.35064964],
[ 0.28237494],
[-0.24916079],
[-0.44629753],
[ 0.28090377]])

In [24]: len(mlp.intercepts_[0])

Out[24]: 30
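coefs_ and intercepts_ hold one entry per connection between layers. With 30 input features, three hidden layers of 30 units and a single output unit, the weight matrices have shapes (30, 30), (30, 30), (30, 30) and (30, 1), which is why coefs_[3] above is a 30x1 column vector and intercepts_[0] has length 30. A quick way to list the whole architecture:

print([w.shape for w in mlp.coefs_])       # [(30, 30), (30, 30), (30, 30), (30, 1)]
print([b.shape for b in mlp.intercepts_])  # [(30,), (30,), (30,), (1,)]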

