
Neural Networks

Neural networks are a series of algorithms that attempt to recognize underlying relationships in a set of data through a process that mimics the way the human brain operates. They are used in various applications, from image and speech recognition to playing games and making predictions.

Perceptrons
A perceptron is a type of artificial neuron and one of the simplest forms of a
neural network. It takes multiple binary inputs, applies weights to these inputs,
sums them up, and then passes the result through an activation function (typically
a step function) to produce a binary output.
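
As a minimal sketch, here is a perceptron in plain Python; the weights and threshold are chosen by hand for illustration and happen to implement a logical AND of two binary inputs:

# A single perceptron: weighted sum of inputs passed through a step function.
# Weights [1, 1] and threshold 2 are illustrative; they implement logical AND.
def perceptron(inputs, weights, threshold):
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0  # step activation

for a in (0, 1):
    for b in (0, 1):
        print(a, b, '->', perceptron([a, b], weights=[1, 1], threshold=2))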

Feed-Forward Neural Networks
A feed-forward neural network is an artificial neural network in which the connections between nodes do not form a cycle, in contrast to recurrent neural networks. The feed-forward model is the simplest form of neural network: data moves in only one direction, from the input nodes, through the hidden nodes (if any), to the output nodes. There are no cycles or loops in the network.
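
A minimal sketch of such a network in PyTorch (the library used in the example below); the layer sizes here are arbitrary and chosen only for illustration:

import torch
import torch.nn as nn

# Data flows strictly forward: input -> hidden -> output, with no loops.
# Sizes (3 inputs, 5 hidden units, 2 outputs) are arbitrary for illustration.
net = nn.Sequential(
    nn.Linear(3, 5),  # input layer to hidden layer
    nn.ReLU(),        # non-linear activation
    nn.Linear(5, 2),  # hidden layer to output layer
)

x = torch.randn(1, 3)  # one example with 3 features
print(net(x))          # a single forward pass through the network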

Backpropagation
Backpropagation is a method used in artificial neural networks to calculate the
gradient of the loss function with respect to each weight by the chain rule,
efficiently computing the gradient one layer at a time. It is essential for
training multi-layer neural networks. Backpropagation helps the network learn by
adjusting the weights in the network to minimize the error in predictions.
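
As a minimal sketch, PyTorch's autograd performs this chain-rule computation automatically; here it is on a one-weight model with a squared-error loss (the numbers are illustrative):

import torch

w = torch.tensor(2.0, requires_grad=True)  # a single trainable weight
x, target = torch.tensor(3.0), torch.tensor(9.0)

loss = (w * x - target) ** 2  # squared-error loss
loss.backward()               # backpropagation: chain rule gives d(loss)/dw

# Analytically: d(loss)/dw = 2 * (w*x - target) * x = 2 * (6 - 9) * 3 = -18
print(w.grad)  # tensor(-18.)

with torch.no_grad():
    w -= 0.01 * w.grad  # one gradient-descent step reduces the error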

Example: Fizz Buzz
Fizz Buzz is a simple programming problem often used in coding interviews. The task is to print the numbers from 1 to 100, but for multiples of 3, print "Fizz" instead of the number, and for multiples of 5, print "Buzz". For numbers that are multiples of both 3 and 5, print "FizzBuzz".
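
For reference, the conventional (non-neural) solution is only a few lines of Python:

for i in range(1, 101):
    if i % 15 == 0:    # multiple of both 3 and 5
        print('FizzBuzz')
    elif i % 3 == 0:
        print('Fizz')
    elif i % 5 == 0:
        print('Buzz')
    else:
        print(i)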

Here’s how you can use a neural network to solve Fizz Buzz:

1. Input Representation: Convert each number into a binary format and use these binary digits as inputs to the network (see the encoding example after this list).
2. Output Representation: The network should have four outputs, each representing whether to print the number, "Fizz", "Buzz", or "FizzBuzz".
3. Training Data: Create a dataset with the numbers from 1 to 100, their binary representations, and the correct outputs.
4. Network Structure: Use a simple feed-forward neural network with input, hidden, and output layers.
5. Training: Use backpropagation to train the network on the training data until it can accurately predict the correct outputs for the given inputs.
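
To make the encoding concrete, here is the training pair for the number 15, using the binary_encode helper defined in the full example below (bits are listed least significant first):

# Input for 15: its 10-digit binary encoding, least significant bit first
binary_encode(15, 10)  # -> [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
# Target for 15 (a multiple of both 3 and 5): one-hot label
# [FizzBuzz, Fizz, Buzz, number] -> [1, 0, 0, 0]
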
Here’s a simplified example in Python using PyTorch:
import torch
import torch.nn as nn
import torch.optim as optim

# Convert a number to its binary representation (least significant bit first)
def binary_encode(num, num_digits):
    return [num >> i & 1 for i in range(num_digits)]

# Define the neural network: a small feed-forward model with one hidden layer
class FizzBuzzNN(nn.Module):
    def __init__(self):
        super(FizzBuzzNN, self).__init__()
        self.layers = nn.Sequential(
            nn.Linear(10, 100),
            nn.ReLU(),
            nn.Linear(100, 4)
        )

    def forward(self, x):
        return self.layers(x)

# Training data: binary-encoded numbers and one-hot labels
# (index 0 = FizzBuzz, 1 = Fizz, 2 = Buzz, 3 = print the number itself)
X_train = torch.tensor([binary_encode(i, 10) for i in range(1, 101)],
                       dtype=torch.float32)
y_train = torch.tensor([[1, 0, 0, 0] if i % 3 == 0 and i % 5 == 0 else
                        [0, 1, 0, 0] if i % 3 == 0 else
                        [0, 0, 1, 0] if i % 5 == 0 else
                        [0, 0, 0, 1] for i in range(1, 101)],
                       dtype=torch.float32)

# Initialize the model, loss function, and optimizer
model = FizzBuzzNN()
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(model.parameters(), lr=0.01)

# Training loop: backpropagation adjusts the weights to minimize the loss
for epoch in range(1000):
    optimizer.zero_grad()
    output = model(X_train)
    # CrossEntropyLoss expects class indices, so convert the one-hot labels
    loss = criterion(output, torch.max(y_train, 1)[1])
    loss.backward()
    optimizer.step()
    if (epoch + 1) % 100 == 0:
        print(f'Epoch [{epoch+1}/1000], Loss: {loss.item():.4f}')

# Testing the model (note: on the same numbers it was trained on)
with torch.no_grad():
    for i in range(1, 101):
        output = model(torch.tensor(binary_encode(i, 10), dtype=torch.float32))
        predicted = torch.argmax(output).item()
        if predicted == 0:
            print('FizzBuzz')
        elif predicted == 1:
            print('Fizz')
        elif predicted == 2:
            print('Buzz')
        else:
            print(i)
In this example, the neural network is trained to recognize the Fizz Buzz pattern from binary input representations of the numbers 1 to 100. Note that because it is tested on the same numbers it was trained on, it is effectively memorizing the pattern rather than generalizing; a stricter experiment would train on one range of numbers and test on another.
