OCI AI
A Recurrent Neural Network remembers the past, and its decisions are influenced by what
it has learned from the past.
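A minimal NumPy sketch of that recurrence (sizes and weight names such as W_xh are illustrative, not from any OCI API): the hidden state h is reused at every step, so each step's output is influenced by earlier inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 3 input features, 4 hidden units.
W_xh = rng.normal(size=(4, 3)) * 0.1   # input-to-hidden weights
W_hh = rng.normal(size=(4, 4)) * 0.1   # hidden-to-hidden: the "memory" path
b_h = np.zeros(4)

h = np.zeros(4)                        # hidden state starts empty
for x_t in rng.normal(size=(5, 3)):    # a toy sequence of 5 timesteps
    # h depends on the current input AND the previous h,
    # so every step is influenced by what came before.
    h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
print(h)
```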
Backpropagation Through Time (BPTT) – the training procedure for RNNs, in which the network is unrolled across timesteps and gradients are propagated back through the unrolled sequence.
Long Short-Term Memory (LSTM)
LSTM is an improved version of the regular RNN, designed to make it easier to
capture long-term dependencies in sequence data. In a regular RNN, the hidden state
activation is influenced by the other local activations nearest to it, which
corresponds to a “short-term memory”, while the network weights are influenced by the
computations that take place over entire long sequences, which corresponds to a
“long-term memory”. The RNN was therefore redesigned so that it has an activation
state that can also act like weights and preserve information over long distances,
hence the name “Long Short-Term Memory”.
https://www.theaidream.com/post/introduction-to-rnn-and-lstm
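A minimal sketch of one LSTM step under the standard gate equations (all weight names and sizes here are illustrative): the cell state c is the long-range memory the paragraph describes, updated by gates rather than overwritten.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
n_in, n_hid = 3, 4

# One weight matrix per gate, acting on [x_t, h_prev] concatenated.
W_f, W_i, W_o, W_c = (rng.normal(size=(n_hid, n_in + n_hid)) * 0.1 for _ in range(4))

def lstm_step(x_t, h_prev, c_prev):
    z = np.concatenate([x_t, h_prev])
    f = sigmoid(W_f @ z)           # forget gate: what to keep of old memory
    i = sigmoid(W_i @ z)           # input gate: what new info to write
    o = sigmoid(W_o @ z)           # output gate: what to expose as h
    c_tilde = np.tanh(W_c @ z)     # candidate memory content
    c = f * c_prev + i * c_tilde   # cell state can persist over long distances
    h = o * np.tanh(c)             # short-term activation derived from the memory
    return h, c

h = c = np.zeros(n_hid)
for x_t in rng.normal(size=(5, n_in)):
    h, c = lstm_step(x_t, h, c)
print(h, c)
```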
Machine Learning Foundation
Multi-class classification
Deep Learning
Deep learning models – Sequence Models
CNN – Convolutional Neural Networks
FNN – Feedforward Neural Network
– Also called Multi-Layer Perceptron (MLP)
– Simplest form of neural networks (a minimal sketch follows this list)
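A minimal sketch of an MLP forward pass in NumPy (layer sizes are illustrative): information flows strictly forward from input to output.

```python
import numpy as np

rng = np.random.default_rng(2)

def relu(z):
    return np.maximum(0.0, z)

# A tiny 2-layer MLP: 3 inputs -> 4 hidden units -> 2 outputs.
W1, b1 = rng.normal(size=(4, 3)) * 0.1, np.zeros(4)
W2, b2 = rng.normal(size=(2, 4)) * 0.1, np.zeros(2)

def mlp(x):
    # Information flows strictly forward: input -> hidden -> output,
    # with no feedback connections (unlike an RNN).
    return W2 @ relu(W1 @ x + b1) + b2

print(mlp(rng.normal(size=3)))
```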
Recurrent Neural Networks (RNNs) are a type of neural network architecture that
includes feedback connections. These feedback connections allow RNNs to process
sequential data such as time series, natural language, speech, and more.
Many-to-Many RNN
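A many-to-many RNN emits one output per input timestep (e.g. labeling every token in a sequence). A minimal sketch, assuming an illustrative output matrix W_hy on top of the recurrence shown earlier:

```python
import numpy as np

rng = np.random.default_rng(3)

W_xh = rng.normal(size=(4, 3)) * 0.1   # input-to-hidden
W_hh = rng.normal(size=(4, 4)) * 0.1   # hidden-to-hidden
W_hy = rng.normal(size=(2, 4)) * 0.1   # hidden-to-output

h = np.zeros(4)
outputs = []
for x_t in rng.normal(size=(5, 3)):    # 5 input timesteps in ...
    h = np.tanh(W_xh @ x_t + W_hh @ h)
    outputs.append(W_hy @ h)           # ... 5 output vectors out
print(len(outputs))                    # 5: one output per input step
```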
Autoencoders
– These are unsupervised learning models used for feature extraction and dimensionality
reduction, commonly employed in data compression and anomaly detection (see the sketch below)
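A minimal sketch of the idea in NumPy, with untrained illustrative weights: the encoder compresses the input to a low-dimensional code (feature extraction / dimensionality reduction), and a large reconstruction error can flag an anomaly.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy autoencoder: compress 8-dim inputs to a 2-dim bottleneck and
# reconstruct them. The weights are untrained; the point is the shape
# of the computation, not the learning loop.
W_enc = rng.normal(size=(2, 8)) * 0.1    # encoder: 8 -> 2
W_dec = rng.normal(size=(8, 2)) * 0.1    # decoder: 2 -> 8

x = rng.normal(size=8)
code = np.tanh(W_enc @ x)                # compressed representation
x_hat = W_dec @ code                     # reconstruction
error = np.mean((x - x_hat) ** 2)        # high error can flag an anomaly
print(code, error)
```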
Transformers
– Widely used in natural language processing; they have become the state-of-the-art models
for machine translation, text generation, and language understanding (a minimal attention sketch follows)
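A minimal sketch of the scaled dot-product attention at the heart of a Transformer (single head, illustrative sizes, no learned projections): every position attends to every other, so long-range dependencies are one step away, with no recurrence needed.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(5)
seq_len, d_k = 5, 4

# Illustrative query/key/value matrices for one attention head.
Q = rng.normal(size=(seq_len, d_k))
K = rng.normal(size=(seq_len, d_k))
V = rng.normal(size=(seq_len, d_k))

scores = Q @ K.T / np.sqrt(d_k)   # similarity of each position to every other
weights = softmax(scores)         # attention weights, rows sum to 1
out = weights @ V                 # context-aware vector per position
print(out.shape)                  # (5, 4)
```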
ANNs for one-dimensional data and CNNs for two-dimensional data (a minimal convolution sketch follows)
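A minimal sketch of the contrast, assuming a toy 6x6 input: a CNN slides a small kernel over the 2-D grid and preserves the spatial layout, whereas a plain ANN would flatten the grid into one long vector first.

```python
import numpy as np

rng = np.random.default_rng(6)
image = rng.normal(size=(6, 6))    # e.g. a grayscale image patch
kernel = rng.normal(size=(3, 3))   # one learned filter

out = np.zeros((4, 4))             # "valid" convolution output
for i in range(4):
    for j in range(4):
        # Each output value looks at a local 3x3 neighborhood.
        out[i, j] = np.sum(image[i:i+3, j:j+3] * kernel)
print(out.shape)
```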
Generative AI and LLMs
Neural Probabilistic Language Models
Transformer LLM
Vanishing Gradient
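A tiny numeric illustration of why gradients vanish during BPTT (the weight and activation values here are illustrative): each timestep contributes a factor that is at most 1, and the product of many such factors shrinks toward zero.

```python
import numpy as np

# The derivative of tanh is 1 - tanh(x)^2, which is <= 1 and often much
# smaller. BPTT chains one such factor per timestep, so they multiply.
grad = 1.0
w = 0.5                               # an illustrative recurrent weight
for t in range(50):
    local = 1 - np.tanh(0.8) ** 2     # a typical local derivative, ~0.56
    grad *= w * local
print(grad)                           # ~0 after 50 steps: the gradient vanished
```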
Prompt
Lifecycle
Fine-tuning
OCI ML services
Trustworthy and Ethical AI
OCI Generative AI service