Dpp

Uploaded by allmamun556

1. ANN (Artificial Neural Network)

● Used for tabular data, structured data
● Applications: pattern recognition, classification, regression, etc.

2. CNN (Convolutional Neural Network)

● Used for image datasets
● Applications: image classification, object detection, facial recognition, etc.

3. RNN (Recurrent Neural Network)

● Used for sequential data, time series datasets
● Applications: speech recognition, time series prediction
● Suffers from the vanishing gradient problem
● Has difficulty capturing long-term dependencies

4. LSTM (Long Short-Term Memory)

● Used for sequential data with long-term dependencies
● Applications: speech recognition, time series prediction, etc.
● More complex than traditional RNNs

5. GRU (Gated Recurrent Unit)

● Used for sequential data
● Merges the LSTM's input and forget gates into a single update gate
● Applications: language modeling, speech recognition
● Simpler than LSTMs
● Struggles with very long-term dependencies
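The vanishing-gradient problem noted for plain RNNs above can be illustrated numerically. The following is a minimal NumPy sketch, not code from the original: the hidden size, weight scale, and sequence length are arbitrary assumptions. Backpropagating through a tanh RNN multiplies the gradient by `diag(1 - h_t^2) @ W` at every step, so with small recurrent weights the gradient norm shrinks toward zero over long sequences.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8                                    # hidden size (illustrative)
W = 0.1 * rng.standard_normal((n, n))    # small recurrent weights (assumed)
T = 50                                   # sequence length (illustrative)

# Forward pass: h_t = tanh(W @ h_{t-1}), inputs omitted for simplicity
hs = [rng.standard_normal(n)]
for _ in range(T):
    hs.append(np.tanh(W @ hs[-1]))

# Backward pass: propagate a loss gradient from the last step to the first
grad = np.ones(n)                        # placeholder dL/dh_T
norms = [float(np.linalg.norm(grad))]
for t in range(T, 0, -1):
    # chain rule through one step: dh_t/dh_{t-1} = diag(1 - h_t^2) @ W
    grad = W.T @ ((1.0 - hs[t] ** 2) * grad)
    norms.append(float(np.linalg.norm(grad)))

# The gradient norm collapses across the 50 steps
print(f"norm at last step: {norms[0]:.3f}, at first step: {norms[-1]:.2e}")
```

LSTM and GRU cells mitigate exactly this effect by letting their gates carry state (and hence gradient) across many steps largely unchanged.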
6. Encoder & Decoder

● Used for sequence-to-sequence tasks
● Applications: language translation, image captioning
● Can handle problems where the input and output sequences have different lengths
● Performance degrades for sentences longer than about 30 words

7. Attention

● Used for sequential data
● Focuses on the specific parts of the input sequence that are directly relevant
● Application: improving long-range dependencies in sequence tasks
● Increases computational complexity

8. Self-Attention

● Instead of mapping from an input to an output, attention is computed within the sentence itself
● Applications: language modeling, document summarization
● Increased computational demands

9. Transformer

● Used for sequential data, natural language, image data
● Applications: machine translation, image classification, language modeling
● High computational demands
● Requires extensive computing resources
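The self-attention idea above — each position attending to every position of the same sequence — can be sketched with plain NumPy. This is an illustrative scaled dot-product implementation, not from the original; the sequence length, dimensions, and random weight matrices are assumptions for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over one sequence.

    X: (seq_len, d_model) token representations; queries, keys, and
    values are all projected from X itself, so attention stays within
    the sentence rather than mapping input to a separate output.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # (seq_len, seq_len)
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
seq_len, d_model, d_k = 5, 16, 8              # toy sizes (illustrative)
X = rng.standard_normal((seq_len, d_model))   # e.g. token embeddings
Wq, Wk, Wv = (rng.standard_normal((d_model, d_k)) for _ in range(3))

out, weights = self_attention(X, Wq, Wk, Wv)
print(out.shape, weights.shape)               # (5, 8) (5, 5)
```

The (seq_len × seq_len) weight matrix is also where the computational cost noted for self-attention and Transformers comes from: it grows quadratically with sequence length.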
