Final DNN Hands-On
In this hands-on exercise you will build a deep neural network using TensorFlow for binary classification.
The dataset has two features ('feature1' and 'feature2') and one target variable.
The target variable (named 'class') maps each record to either 0 or 1.
Some of the packages required for reading the file and visualizing the data have been imported for you.
In [1]:
In [2]:
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.colors
Using pandas, read the CSV file and assign the resulting dataframe to the variable 'data'.
In [5]:
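The cell body isn't preserved in this export; a minimal sketch, assuming the data file is named 'data.csv' (the actual filename isn't given above):

data = pd.read_csv('data.csv')  # filename assumed; use the CSV provided with the exercise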
The following code extracts the features and the target variable and assigns them to the variables X and y respectively.
In [6]:
X = data[['feature1', 'feature2']].values
y = data['class'].values
Run the piece of code below to visualize the data in the x-y plane. The green and blue dots correspond to
class 0 and class 1 respectively.
You can see that the data is not linearly separable, i.e. you cannot draw a single linear boundary that
separates the two classes.
In [7]:
colors = ['green', 'blue']
cmap = matplotlib.colors.ListedColormap(colors)
# Plot the figure
plt.figure()
plt.title('Non-linearly separable classes')
plt.scatter(X[:, 0], X[:, 1], c=y, marker='o', s=50, cmap=cmap, alpha=0.5)
plt.show()
Before diving into a deep neural network, let's first try to classify the data using simple logistic regression.
The code for logistic regression has been written for you.
Run the cell below to build a simple logistic regression model.
In [8]:
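The logistic regression cell isn't preserved in this export; a minimal sketch using scikit-learn, assuming the fitted model is named lr_model (it is referenced when plotting the decision boundary below):

from sklearn.linear_model import LogisticRegression

# Fit a plain logistic regression model on the two features
lr_model = LogisticRegression()
lr_model.fit(X, y)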
Run the cell below to define the method that plots the decision boundary. The code for the visualization has been
written for you.
In [9]:
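The visualization cell isn't preserved either; a sketch of a decision-boundary plotter consistent with how it is called below (the data is passed with shape (2, m) and the model as a prediction lambda over (N, 2) mesh points):

def plot_decision_boundary(X, y, model):
    # Build a fine mesh covering the feature plane
    x_min, x_max = X[0, :].min() - 1, X[0, :].max() + 1
    y_min, y_max = X[1, :].min() - 1, X[1, :].max() + 1
    xx, yy = np.meshgrid(np.arange(x_min, x_max, 0.01),
                         np.arange(y_min, y_max, 0.01))
    # Predict the class for every mesh point and colour the regions
    Z = model(np.c_[xx.ravel(), yy.ravel()])
    Z = np.array(Z).reshape(xx.shape)
    plt.contourf(xx, yy, Z, cmap=cmap, alpha=0.3)
    plt.scatter(X[0, :], X[1, :], c=y, cmap=cmap, s=50, alpha=0.5)
    plt.show()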
Run the cell below to plot the decision boundary predicted by the logistic regression model.
In [10]:
plot_decision_boundary(X.T, y, lambda x: lr_model.predict(x))
From the plot above you can see that simple logistic regression performs poorly on this data,
since its linear decision boundary cannot effectively separate the two classes.
Now build a deep neural network to classify the same data.
In [11]:
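This cell's body isn't shown; presumably it imports TensorFlow, since the TensorFlow 1.x API (tf.placeholder, tf.Variable) is used below:

import tensorflow as tf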
In [12]:
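# The top of this cell is missing from the export; presumably it reshapes the arrays so
# that examples lie along columns, matching the shapes printed below (assumed code):
X_data = X.T                 # shape (2, 1000): features x examples
y_data = y.reshape(1, -1)    # shape (1, 1000)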
print(X_data.shape)
print(y_data.shape)
(2, 1000)
(1, 1000)
Define the layer dimensions as an array called 'layer_dims' with one input layer whose size equals the number of
features, two hidden layers with nine nodes each, and one final output layer with one node.
In [13]:
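The cell body isn't shown; per the instructions above (and consistent with the "L : 4" printed during initialization later), a sketch:

layer_dims = [2, 9, 9, 1]  # input layer (2 features), two hidden layers of 9 nodes, 1 output node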
In [14]:
Define a function named placeholders to return two placeholders: one for the input data as A_0 and one for the
output data as Y.
In [15]:
# def placeholders(num_features):
#     ##Start code here
#     A_0 =
#     Y =
#     return A_0, Y
#     ##End code
def placeholders(num_features):
    A_0 = tf.placeholder(dtype=tf.float64, shape=[num_features, None])
    Y = tf.placeholder(dtype=tf.float64, shape=[1, None])
    return A_0, Y
In [16]:
# def initialize_parameters_deep(layer_dims):
#     L = len(layer_dims)
#     parameters = {}
#     for l in range(1, L):
#         parameters['W' + str(l)] =
#         parameters['b' + str(l)] =
#     return parameters
#     ##End code
def initialize_parameters_deep(layer_dims):
    L = len(layer_dims)
    parameters = {}
    print("Initializing parameters \n L : {}".format(L))
    for l in range(1, L):
        print("l : {}".format(l))
        # the tails of the next two lines were cut off in the export; the shapes are
        # reconstructed from the usual convention W_l: (layer_dims[l], layer_dims[l-1])
        parameters['W' + str(l)] = tf.Variable(initial_value=tf.random_normal([layer_dims[l], layer_dims[l - 1]], dtype=tf.float64))
        parameters['b' + str(l)] = tf.Variable(initial_value=tf.zeros([layer_dims[l], 1], dtype=tf.float64))
    print(parameters)
    return parameters
Define a function named linear_forward_prop() to implement forward propagation for a given layer.
parameters: A_prev (output from the previous layer), W (weight matrix of the current layer), b (bias vector for
the current layer), activation (type of activation to be used for the output of the current layer)
returns: A (output from the current layer)
Use relu activation for the hidden layers; if activation is sigmoid (the final output layer), return the output
unactivated.
In [17]:
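The cell body isn't preserved; a minimal sketch matching the specification above (relu for hidden layers, raw logits returned when activation is 'sigmoid'):

def linear_forward_prop(A_prev, W, b, activation):
    Z = tf.add(tf.matmul(W, A_prev), b)
    if activation == "relu":
        A = tf.nn.relu(Z)
    elif activation == "sigmoid":
        A = Z  # leave the final layer unactivated; the loss applies the sigmoid
    return A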
In [19]:
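This cell isn't preserved either; presumably it chains the layers into a full forward pass. A sketch, with the helper name l_layer_forward assumed:

def l_layer_forward(A_0, parameters):
    A = A_0
    L = len(parameters) // 2  # number of layers with weights
    # relu for all hidden layers
    for l in range(1, L):
        A = linear_forward_prop(A, parameters['W' + str(l)], parameters['b' + str(l)], "relu")
    # final layer returns unactivated logits
    A_final = linear_forward_prop(A, parameters['W' + str(L)], parameters['b' + str(L)], "sigmoid")
    return A_final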
In [22]:
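The training function isn't preserved; a sketch of a plausible deep_net() that ties the pieces together, using sigmoid cross-entropy on the logits and plain gradient descent (all names here are assumptions; the cost print every 1000 iterations mirrors the output below):

def deep_net(X_train, y_train, layer_dims, learning_rate, num_iter):
    tf.reset_default_graph()
    A_0, Y = placeholders(layer_dims[0])
    parameters = initialize_parameters_deep(layer_dims)
    Z_final = l_layer_forward(A_0, parameters)
    # sigmoid cross-entropy on the raw logits
    cost = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits=Z_final, labels=Y))
    train_net = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
    init = tf.global_variables_initializer()
    with tf.Session() as sess:
        sess.run(init)
        for i in range(num_iter):
            _, c = sess.run([train_net, cost], feed_dict={A_0: X_train, Y: y_train})
            if i % 1000 == 0:
                print(c)
        # evaluate the variables so the trained weights survive the session
        params = sess.run(parameters)
    return params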
Train the deep neural network with the learning rate set to 0.3 and the number of iterations set to 10000.
Use X_data and y_data to train the network.
In [24]:
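The call itself is missing from the export; presumably something like the following, reusing the assumed names from the sketches above:

parameters = deep_net(X_data, y_data, layer_dims, learning_rate=0.3, num_iter=10000)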
Initializing parameters
L : 4
l : 1
l : 2
l : 3
0.693147836530587
0.31758342985046395
0.2982741435388299
0.29604448543593426
0.285911629791836
0.2842502469948942
0.2855237372886876
0.28466128831025994
0.28256431673330307
0.28219673302662945
Run the cell below to define the method that predicts the output of the model for a given input and set of
parameters. The code has been written for you.
In [25]:
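The predict method isn't preserved; a numpy sketch consistent with the call below, which passes inputs of shape (2, N) together with the trained parameter values:

def predict(A_0, parameters):
    A = A_0
    L = len(parameters) // 2
    # replay the forward pass in numpy: relu hidden layers...
    for l in range(1, L):
        A = np.maximum(0, np.dot(parameters['W' + str(l)], A) + parameters['b' + str(l)])
    # ...then a sigmoid on the final layer
    Z = np.dot(parameters['W' + str(L)], A) + parameters['b' + str(L)]
    A_final = 1 / (1 + np.exp(-Z))
    return (A_final > 0.5).astype(int)  # threshold at 0.5 for class labels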
Run the cell below to plot the decision boundary predicted by the deep neural network.
In [26]:
plot_decision_boundary(X_data, y, lambda x: predict(x.T, parameters))