In recent times, neural networks have brought about a significant transformation in the realm of artificial intelligence. They power recommendation systems, voice assistants, self-driving cars, and medical diagnostics, and they are driving the advances in deep learning. In this blog, we'll explain the basics of neural networks in a systematic manner.
Artificial Neuron
At the heart of every neural network is the artificial neuron, which is inspired by the biological neurons in the human brain. A neuron receives multiple input signals, computes a weighted sum, processes the result through an activation function (such as ReLU, sigmoid, or tanh), and generates an output.
Mathematical Representation
y = f( ∑i=1->n (wi · xi) + b )

Where:
xi = input feature i
wi = weight for input i
b = bias term
f = activation function
∑i=1->n = summation over the n inputs
y = output of the neuron
The same computation can also be written as more readable, code-style pseudocode, which is handy for presentations or programming contexts.
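A minimal Python sketch of that neuron computation, assuming a ReLU activation and illustrative values for the inputs, weights, and bias, could look like this:

import numpy as np

def relu(z):
    # ReLU activation: max(0, z)
    return np.maximum(0, z)

def neuron_output(x, w, b, activation=relu):
    # Weighted sum of the inputs plus the bias, passed through the activation
    z = np.dot(w, x) + b
    output = activation(z)
    return output

# Illustrative values: three input features, three weights, and one bias
x = np.array([0.5, -1.2, 3.0])   # inputs x_i
w = np.array([0.4, 0.7, -0.2])   # weights w_i
b = 0.1                          # bias b
y = neuron_output(x, w, b)       # y = f(sum_i(w_i * x_i) + b)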
"Activation" refers to the function f
"Output" refers to y.
A neural network organizes these neurons into layers:
Input Layer: Receives the raw input features.
Hidden Layers: Where the magic happens (data transformations and learning).
Output Layer: Produces the final result (e.g., a classification label or numeric prediction).
The number and arrangement of hidden layers determine whether a model is considered shallow (with only a few hidden layers) or deep (with many hidden layers).
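As a rough illustration, the sketch below builds the weights and biases for a shallow stack and a deep stack of layers; the layer sizes, the helper name build_layers, and the random initialization are illustrative assumptions, not a fixed recipe.

import numpy as np

def build_layers(sizes, seed=0):
    # One (weights, bias) pair per layer; sizes[0] is the input width,
    # sizes[-1] is the output width, everything in between is a hidden layer.
    rng = np.random.default_rng(seed)
    return [(0.1 * rng.standard_normal((n_out, n_in)), np.zeros(n_out))
            for n_in, n_out in zip(sizes[:-1], sizes[1:])]

# A "shallow" model: a single hidden layer between input and output
shallow = build_layers([4, 8, 1])

# A "deep" model: several hidden layers stacked between input and output
deep = build_layers([4, 16, 16, 16, 1])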
Data flows through the layers one after another until the final layer produces a prediction. It's similar to a chain of mathematical transformations: each step refines the data.
After the input has passed through the network, the resulting output is compared to the actual target, and the error is measured using a loss function such as mean squared error or cross-entropy.
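To make the forward pass and the loss concrete, here is a minimal sketch of a tiny network whose prediction is compared against a target with mean squared error; the layer sizes, weights, and input values are made-up examples.

import numpy as np

def relu(z):
    return np.maximum(0, z)

def mse(prediction, target):
    # Mean squared error between the network's output and the actual target
    return np.mean((prediction - target) ** 2)

# Illustrative weights for a tiny 3-input -> 4-hidden -> 1-output network
rng = np.random.default_rng(42)
W1, b1 = 0.1 * rng.standard_normal((4, 3)), np.zeros(4)
W2, b2 = 0.1 * rng.standard_normal((1, 4)), np.zeros(1)

x = np.array([0.5, -1.2, 3.0])   # input features
target = np.array([1.0])         # actual target value

# Forward pass: each layer transforms the previous layer's output
hidden = relu(W1 @ x + b1)       # hidden layer activations
output = W2 @ hidden + b2        # network's prediction

# Compare the prediction to the target with a loss function
loss = mse(output, target)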
Backpropagation operates in the opposite direction, from the output layer back to the input layer, adjusting the weights to reduce the error.
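Below is a deliberately simplified single-neuron version of that idea, assuming an identity activation, a squared-error loss, and made-up numbers: compute the error, take the gradient of the loss with respect to each weight, and nudge the weights against the gradient.

import numpy as np

x = np.array([0.5, -1.2, 3.0])   # inputs
w = np.array([0.4, 0.7, -0.2])   # weights
b = 0.1                          # bias
target = 1.0
lr = 0.01                        # learning rate

# Forward pass (identity activation for simplicity)
y = np.dot(w, x) + b
error = y - target
loss = 0.5 * error ** 2

# Backward pass: gradients of the loss with respect to each weight and the bias
grad_w = error * x               # dL/dw_i = (y - target) * x_i
grad_b = error                   # dL/db   = (y - target)

# Update the parameters in the direction that reduces the error
w -= lr * grad_w
b -= lr * grad_b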
Activation Functions
Activation functions add non-linearity to the model, allowing it to learn complex patterns.
Common types:
ReLU (Rectified Linear Unit): f(z) = max(0, z)
Sigmoid: f(z) = 1 / (1 + e^(-z)), which squashes values into the range (0, 1)
Tanh: f(z) = tanh(z), which squashes values into the range (-1, 1)
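For reference, here is how these three functions might be written in plain NumPy; the example input values are arbitrary.

import numpy as np

def relu(z):
    # Rectified Linear Unit: passes positive values through, zeroes out negatives
    return np.maximum(0, z)

def sigmoid(z):
    # Squashes any real value into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # Squashes any real value into the range (-1, 1)
    return np.tanh(z)

z = np.array([-2.0, 0.0, 3.0])
print(relu(z))     # [0. 0. 3.]
print(sigmoid(z))  # roughly [0.12 0.5  0.95]
print(tanh(z))     # roughly [-0.96 0.  1.]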
Conclusion
Neural networks, inspired by the human brain, are foundational tools of modern artificial intelligence, from the single artificial neuron up to complex layered architectures.
Understanding how neurons process inputs, how layers stack together, and how learning happens through forward and backward propagation lays the foundation for exploring advanced topics like convolutional neural networks (CNNs), recurrent neural networks (RNNs), and transformers.