
1. **Define RNN**:

A Recurrent Neural Network (RNN) is a type of neural network designed for sequential data, in which connections between nodes form a directed cycle. This allows the network to have memory and process data sequences, like text or time series.
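As a minimal sketch of that cycle (a single-unit cell with arbitrary placeholder weights, not a trained model), the hidden state carries information from earlier steps to later ones:

```python
import math

def rnn_step(x, h, w_x=0.5, w_h=0.8, b=0.0):
    """One recurrent step: the new state depends on the input AND the previous state."""
    return math.tanh(w_x * x + w_h * h + b)

def run_rnn(sequence):
    h = 0.0  # initial hidden state
    states = []
    for x in sequence:
        h = rnn_step(x, h)
        states.append(h)
    return states

states = run_rnn([1.0, 0.0, 0.0])
# Even with zero inputs at steps 2 and 3, the state stays nonzero:
# the recurrent term w_h * h is the "memory" of the first input.
```

Note how the later states decay but never reset: the cyclic connection is what distinguishes this from a plain feed-forward layer applied per step.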

2. **Differentiate Recursive Network and Recurrent Network**:

- **Recursive Network**: Applies the same set of weights over a hierarchical structure, typically for parsing structured data (like trees).

- **Recurrent Network (RNN)**: Processes sequential data with cyclic connections that allow it to remember previous states.
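The contrast can be sketched with a toy shared-weight combiner (the `combine` function and its weights are hypothetical, chosen only to show structure): the recursive network folds a tree bottom-up, while the recurrent network folds a flat sequence left to right.

```python
def combine(a, b, w=0.5):
    return w * (a + b)  # the same shared weights, applied everywhere

# Recursive network: folds a hierarchical (tree) structure bottom-up.
def recursive_net(tree):
    if isinstance(tree, (int, float)):
        return tree  # leaf node
    left, right = tree
    return combine(recursive_net(left), recursive_net(right))

# Recurrent network: folds a flat sequence left-to-right through time.
def recurrent_net(seq, h=0.0):
    for x in seq:
        h = combine(h, x)
    return h

tree_out = recursive_net(((1, 2), (3, 4)))  # tree: ((1,2),(3,4))
seq_out = recurrent_net([1, 2, 3, 4])       # chain: 1 -> 2 -> 3 -> 4
```

Same weights and same combiner, but the shape of the computation graph differs: a balanced tree versus a linear chain.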

3. **Important Design Patterns for Recurrent Neural Networks**:

Key patterns include Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU),
Encoder-Decoder, and Attention Mechanism, each designed to handle specific challenges
in sequence learning.
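Of these, the GRU is compact enough to sketch in full. Below is a single-unit GRU step with illustrative placeholder weights (not taken from the text): the update gate decides how much old state to keep, and the reset gate decides how much old state feeds the candidate.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def gru_step(x, h, wz=1.0, wr=1.0, wh=1.0, uz=1.0, ur=1.0, uh=1.0):
    z = sigmoid(wz * x + uz * h)               # update gate
    r = sigmoid(wr * x + ur * h)               # reset gate
    h_cand = math.tanh(wh * x + uh * (r * h))  # candidate state
    return (1 - z) * h + z * h_cand            # interpolate old and new state

h = 0.0
for x in [1.0, -1.0, 0.5]:
    h = gru_step(x, h)
```

The gating is what addresses the sequence-learning challenges mentioned above: because the new state is an interpolation rather than a full overwrite, gradients can flow through many steps without vanishing as quickly as in a plain RNN.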

4. **Formulate Unfolding Computational Graphs**:

Unfolding a computational graph in RNNs means representing the RNN across time steps
in a linear sequence. Each node represents the network state at a given time step, allowing
for easier application of backpropagation.
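The equivalence can be made concrete with a simple linear cell (weights are arbitrary placeholders): the rolled form is a loop over one node with a self-connection, and the unfolded form is the same computation written as an explicit chain of copies sharing the same weights.

```python
def f(h, x, w_h=0.9, w_x=0.1):
    return w_h * h + w_x * x  # a linear cell, for clarity

x = [1.0, 2.0, 3.0]
h0 = 0.0

# Rolled form: one node with a self-loop, iterated over time.
h = h0
for x_t in x:
    h = f(h, x_t)

# Unfolded form: the same computation as an explicit chain of steps,
# which is the graph that backpropagation is applied to.
h_unfolded = f(f(f(h0, x[0]), x[1]), x[2])

assert h == h_unfolded
```

Each `f(...)` in the unfolded expression is one "node at a given time step" in the sense of the answer above; all of them share the same `w_h` and `w_x`.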

5. **Define Image Compression**:

Image compression reduces the size of an image file by removing redundant data,
enabling more efficient storage and transmission. Techniques include lossy (JPEG) and
lossless (PNG) compression.
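A small lossless example using Python's standard `zlib` module (DEFLATE, the same algorithm PNG uses internally): a synthetic grayscale "image" with large flat regions compresses well precisely because the runs of identical pixels are redundant.

```python
import zlib

# Synthetic 64x64 grayscale image: left half black (0), right half white (255).
width, height = 64, 64
pixels = bytes(
    (0 if x < width // 2 else 255)
    for y in range(height)
    for x in range(width)
)

compressed = zlib.compress(pixels, level=9)
restored = zlib.decompress(compressed)

assert restored == pixels             # lossless: exact reconstruction
assert len(compressed) < len(pixels)  # redundant data removed
```

Lossy schemes such as JPEG go further by discarding detail the eye is unlikely to notice, trading exact reconstruction for much smaller files.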

6. **Explain Bidirectional RNNs**:


Bidirectional RNNs process data in both forward and backward directions, capturing
context from both past and future time steps, which enhances performance in tasks where
full context is valuable, such as language processing.
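A minimal sketch of the idea (the cell and its weights are illustrative placeholders): run the same simple cell forward and backward over the sequence, then pair the states up so each position sees both past and future context.

```python
import math

def step(x, h, w_x=0.5, w_h=0.8):
    return math.tanh(w_x * x + w_h * h)

def directional_states(seq):
    h, states = 0.0, []
    for x in seq:
        h = step(x, h)
        states.append(h)
    return states

def bidirectional(seq):
    fwd = directional_states(seq)
    bwd = directional_states(seq[::-1])[::-1]  # backward pass, re-aligned
    return list(zip(fwd, bwd))  # one (forward, backward) state per time step

states = bidirectional([1.0, 0.0, -1.0])
# states[1] combines context from step 0 (forward) and step 2 (backward).
```

In practice the two directions use separate weight sets and their states are concatenated before the output layer; the pairing above stands in for that concatenation.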

7. **Explain Backpropagation Through Time in RNN**:

Backpropagation Through Time (BPTT) is an extension of backpropagation for RNNs. It unfolds the network across time steps and applies gradient descent to the shared weights, with each time step contributing to the gradient, enabling the network to learn temporal dependencies.
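As a worked sketch (a linear cell h_t = w·h_{t-1} + x_t with a squared-error loss on the final state; the weight and inputs are arbitrary placeholders), BPTT walks the unfolded chain backwards, and the gradient of the shared weight `w` is the sum of one contribution per time step:

```python
def bptt_grad(xs, w, target):
    # Forward pass: record every hidden state of the unfolded graph.
    hs = [0.0]
    for x in xs:
        hs.append(w * hs[-1] + x)
    loss = 0.5 * (hs[-1] - target) ** 2

    # Backward pass: walk the unfolded chain from the last step to the first.
    dh = hs[-1] - target      # dL/dh_T
    dw = 0.0
    for t in range(len(xs), 0, -1):
        dw += dh * hs[t - 1]  # this step's contribution to dL/dw
        dh *= w               # propagate the gradient back to h_{t-1}
    return loss, dw

loss, dw = bptt_grad([1.0, 2.0], w=0.5, target=0.0)
```

The `dw += ...` line is the "each time step" part of the answer: because the weight is shared across the unfolded copies, its gradient accumulates over all of them before a single update is applied.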
