Deep Learning Notes - 10 Marks Answers

UNIT - I: Introduction

1. Feed-forward Neural Networks (FNNs):

- Composed of input, hidden, and output layers.

- Data flows in one direction, from input to output, with no cycles (see the sketch below).
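
A minimal sketch of one forward pass through a single-hidden-layer network, assuming NumPy; the layer sizes and the sigmoid activation are illustrative choices, not prescribed by the notes:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Illustrative sizes: 3 inputs, 4 hidden units, 2 outputs.
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 2)), np.zeros(2)

x = rng.normal(size=(1, 3))   # one input sample
h = sigmoid(x @ W1 + b1)      # input -> hidden
y = h @ W2 + b2               # hidden -> output; strictly forward, no cycles
print(y)
```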

2. Gradient Descent and Backpropagation:

- Gradient Descent iteratively minimizes the loss function by stepping the weights in the direction of the negative gradient.

- Backpropagation applies the chain rule to compute the gradient of the loss with respect to each weight (see the sketch below).
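
A hedged sketch of gradient descent on a least-squares loss, assuming NumPy; the learning rate, data, and iteration count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=100)

w = np.zeros(3)
lr = 0.1                                   # assumed learning rate
for _ in range(200):
    grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of the mean squared error
    w -= lr * grad                         # step against the gradient
print(w)                                   # close to true_w
```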

3. Unit Saturation and Vanishing Gradient Problem:

- Unit Saturation: activation functions such as sigmoid and tanh flatten out for large-magnitude inputs, so their derivatives approach zero.

- Vanishing Gradient: gradients shrink multiplicatively as they propagate backward through many layers, stalling learning in the early layers (demonstrated below).
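
A small numeric illustration, assuming NumPy: the sigmoid derivative peaks at 0.25 and collapses for saturated inputs, so the backpropagated gradient through many sigmoid layers shrinks roughly geometrically (weights ignored for simplicity):

```python
import numpy as np

def dsigmoid(x):
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)

# Saturation: the derivative collapses for extreme inputs.
print(dsigmoid(np.array([0.0, 5.0, 10.0])))  # 0.25, ~6.6e-3, ~4.5e-5

# Vanishing gradient: 0.25 is the derivative's maximum, so n sigmoid
# layers scale the backpropagated gradient by at most 0.25**n.
for n in (1, 5, 10, 20):
    print(n, 0.25 ** n)
```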

4. ReLU and Heuristics:

- ReLU mitigates vanishing gradients: its derivative is 1 for positive inputs, so gradients pass through unchanged.

- Heuristics such as Xavier (Glorot) initialization scale the initial weights to keep activation variance stable across layers (see the sketch below).
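
A sketch of ReLU and Xavier (Glorot) uniform initialization, assuming NumPy; the fan sizes are illustrative:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)  # derivative is 1 for x > 0: no vanishing

def xavier_uniform(fan_in, fan_out, rng):
    # Glorot/Xavier: bound chosen so activation variance stays roughly stable.
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

rng = np.random.default_rng(0)
W = xavier_uniform(256, 128, rng)
x = rng.normal(size=(1, 256))
print(relu(x @ W).shape)  # (1, 128)
```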

5. Regularization:

- L1 and L2 penalties add a weight-magnitude term to the loss, shrinking weights and reducing overfitting.

- Dropout randomly deactivates neurons during training to improve generalization (see the sketch below).
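
A minimal sketch, assuming PyTorch is installed: weight_decay in the optimizer adds an L2 penalty, and nn.Dropout zeroes random activations during training; sizes and hyperparameters are illustrative:

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # randomly zeroes half the activations while training
    nn.Linear(64, 2),
)

# weight_decay adds an L2 penalty on the weights to every update.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)

model.train()                   # dropout active
x = torch.randn(8, 20)
loss = model(x).pow(2).mean()   # placeholder loss, just to drive one step
optimizer.zero_grad()
loss.backward()
optimizer.step()

model.eval()                    # dropout disabled at inference time
```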

UNIT - II: Convolutional Neural Networks (CNNs)


1. CNN Architectures:

- Convolution layers extract spatial features.

- Pooling layers downsample data.

- Fully connected layers perform the final classification (see the sketch below).
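
A small sketch of the convolution → pooling → fully-connected pattern, assuming PyTorch; the channel counts and 28x28 (MNIST-sized) input are illustrative assumptions:

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # convolution: spatial features
            nn.ReLU(),
            nn.MaxPool2d(2),                             # pooling: 28x28 -> 14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 14x14 -> 7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)  # fully connected head

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

print(SmallCNN()(torch.randn(1, 1, 28, 28)).shape)  # torch.Size([1, 10])
```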

2. RNNs, LSTMs, GRUs:

- RNNs process sequential data.

- LSTMs manage long-term dependencies.

- GRUs merge the LSTM's gates into a simpler unit with fewer parameters (see the sketch below).
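
A minimal usage sketch of the recurrent modules, assuming PyTorch; input and hidden sizes are illustrative:

```python
import torch
import torch.nn as nn

# Illustrative sizes: 10-dim inputs, 20-dim hidden state.
lstm = nn.LSTM(input_size=10, hidden_size=20, batch_first=True)
x = torch.randn(4, 15, 10)   # batch of 4 sequences, 15 time steps each
out, (h, c) = lstm(x)        # c is the cell state that carries long-term memory
print(out.shape, h.shape)    # [4, 15, 20] and [1, 4, 20]

# GRU: same interface, but fewer gates and parameters than the LSTM.
gru = nn.GRU(input_size=10, hidden_size=20, batch_first=True)
out, h = gru(x)
```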

3. Deep Unsupervised Learning:

- Autoencoders learn to compress data into a low-dimensional code and reconstruct it (see the sketch below).

- GANs train a generator against a discriminator to produce synthetic data (a generator sketch appears under Unit III).
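
A minimal autoencoder sketch, assuming PyTorch; the 784-to-32 compression and the layer shapes are illustrative assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Encoder compresses 784-dim inputs (e.g. flattened 28x28 images) to a
# 32-dim code; the decoder reconstructs the input from that code.
encoder = nn.Sequential(nn.Linear(784, 32), nn.ReLU())
decoder = nn.Sequential(nn.Linear(32, 784), nn.Sigmoid())

x = torch.rand(8, 784)
recon = decoder(encoder(x))
loss = F.mse_loss(recon, x)   # reconstruction error drives the unsupervised training
print(loss.item())
```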

4. Attention Models:

- Attention weights let a model focus on the most relevant parts of the input, improving results (see the sketch below).
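
A sketch of scaled dot-product attention, one common formulation (not necessarily the exact one these notes intend), assuming PyTorch:

```python
import torch
import torch.nn.functional as F

def attention(q, k, v):
    # Scaled dot-product attention: the softmax weights decide which
    # inputs the model focuses on.
    scores = q @ k.transpose(-2, -1) / k.shape[-1] ** 0.5
    weights = F.softmax(scores, dim=-1)   # each row sums to 1
    return weights @ v, weights

q = torch.randn(1, 3, 8)   # 3 queries of dimension 8
k = torch.randn(1, 5, 8)   # 5 keys
v = torch.randn(1, 5, 8)   # 5 values
out, w = attention(q, k, v)
print(out.shape, w.shape)  # [1, 3, 8] and [1, 3, 5]
```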

UNIT - III: Applications of Deep Learning to Computer Vision

1. Image Segmentation:

- Divides images into meaningful parts (e.g., U-Net).

2. Object Detection:

- Identifies and localizes objects in images (e.g., YOLO); predicted boxes are scored against ground truth with IoU (see below).
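
A small sketch of intersection-over-union (IoU), the standard overlap score used to match predicted boxes to ground truth; plain Python, with corner-format boxes as an assumed convention:

```python
def iou(box_a, box_b):
    # Boxes as (x1, y1, x2, y2) corners; IoU = intersection / union.
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.143
```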

3. Image Captioning:

- Generates text descriptions for images.


4. GANs for Image Generation:

- Synthesizes realistic images from random noise (see the generator sketch below).
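
A minimal generator sketch mapping noise vectors to image-shaped tensors, assuming PyTorch; the 100-dim noise and 28x28 output are illustrative assumptions, and a real GAN would also train a discriminator adversarially:

```python
import torch
import torch.nn as nn

# Generator: noise in, image-shaped tensor out. Only the sampling path
# is shown; adversarial training against a discriminator is omitted.
G = nn.Sequential(
    nn.Linear(100, 256), nn.ReLU(),
    nn.Linear(256, 784), nn.Tanh(),   # 784 = flattened 28x28 "image"
)
z = torch.randn(8, 100)               # 8 random noise vectors
fake_images = G(z).view(8, 1, 28, 28)
print(fake_images.shape)
```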

5. Attention Models in Vision:

- Attention mechanisms highlight specific regions of an image for better understanding.

UNIT - IV: Applications of Deep Learning to NLP

1. NLP Basics:

- Text is tokenized and mapped to numeric representations so deep learning models can process it.

2. Word Vector Representations:

- Models like Word2Vec (CBOW, Skip-gram) convert words to vectors.

- GloVe builds embeddings from global word co-occurrence statistics (a Skip-gram pairing sketch follows).
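
A plain-Python sketch of how Skip-gram builds (center, context) training pairs; the window size and toy sentence are illustrative:

```python
def skipgram_pairs(tokens, window=2):
    # Skip-gram trains a model to predict context words from the center
    # word; these (center, context) pairs are its training examples.
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

print(skipgram_pairs(["deep", "learning", "is", "fun"], window=1))
```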

3. Evaluation:

- Similarity metrics such as cosine similarity are used to check that related words receive similar vectors (see the sketch below).
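
A minimal cosine-similarity sketch, assuming NumPy; the 3-dimensional "king"/"queen" vectors are made-up toy values, as real embeddings are learned and much higher-dimensional:

```python
import numpy as np

def cosine_similarity(a, b):
    # cos(theta) = a.b / (|a| |b|); values near 1 mean similar directions.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

king = np.array([0.9, 0.1, 0.4])     # toy vectors, not real embeddings
queen = np.array([0.85, 0.15, 0.45])
print(cosine_similarity(king, queen))  # close to 1 for related words
```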

UNIT - V: Analogy Reasoning

1. Named Entity Recognition (NER):

- Identifies entities like names and locations in text.

2. Opinion Mining:

- Extracts sentiments from text for analysis.


3. Sentence Classification:

- Categorizes sentences into classes, e.g., spam vs. not spam (see the sketch below).
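
A minimal bag-of-embeddings classifier sketch, assuming PyTorch; the vocabulary size, embedding width, and token ids are illustrative assumptions:

```python
import torch
import torch.nn as nn

vocab_size, embed_dim, num_classes = 1000, 32, 2  # assumed sizes
embed = nn.EmbeddingBag(vocab_size, embed_dim)    # averages token embeddings
head = nn.Linear(embed_dim, num_classes)          # e.g. spam vs. not spam

token_ids = torch.tensor([[4, 21, 99, 7]])        # one tokenized sentence
logits = head(embed(token_ids))
print(logits.shape)                               # torch.Size([1, 2])
```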

4. Dialogue Generation:

- Sequence-to-sequence LSTMs generate conversational responses for chatbots.
