Graphs are a powerful way to represent real-world systems, but in most practical applications they are not static: nodes and edges evolve over time. Modeling such dynamic relationships requires learning frameworks that capture both graph structure and temporal change. Temporal Graph Networks (TGNs) are designed for exactly this setting.
What Are Temporal Graphs?
A temporal graph is a graph where nodes and edges can change over time. That means:
- Edges appear or disappear.
- Nodes may be added or removed.
- Features associated with nodes or edges can vary across time.
Each interaction can be represented as a tuple:
(u, v, t, f)
where:
- u, v are nodes,
- t is the timestamp,
- f represents additional features of the edge or event.
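For concreteness, a stream of such interaction events can be sketched as plain tuples. The `Event` name and the values below are illustrative, not part of any TGN library:

```python
from collections import namedtuple

# One interaction event: source node u, destination node v,
# timestamp t, and edge/event features f.
Event = namedtuple("Event", ["u", "v", "t", "f"])

# A temporal graph is a time-ordered stream of such events.
events = [
    Event(u=0, v=1, t=1.0, f=[0.5]),
    Event(u=1, v=2, t=2.5, f=[0.1]),
    Event(u=0, v=2, t=4.0, f=[0.9]),
]

# Events arrive in chronological order.
assert all(a.t <= b.t for a, b in zip(events, events[1:]))
```

Processing this stream event by event is what distinguishes the temporal setting from snapshot-based graph learning.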
TGN Architecture and Working Principle
The architecture of TGN integrates memory, temporal encoding and message-passing mechanisms to capture both structural and temporal dependencies efficiently.
Figure: Temporal Graph Network architecture
1. Memory Module
Each node in the graph maintains a memory vector that stores its historical information. This memory acts as a representation of the node's past interactions and helps the model retain long-term dependencies. The memory gets updated over time whenever the node is involved in an event, which lets TGNs handle sequential and streaming data effectively. The purpose of the memory module is to track the temporal evolution of each node over time.
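A minimal sketch of per-node memory state, assuming a fixed memory dimension and zero initialization (both are illustrative implementation choices, not prescribed by TGN):

```python
MEMORY_DIM = 4  # assumed memory size, for illustration only

memory = {}       # node id -> memory vector (list of floats)
last_update = {}  # node id -> timestamp of the node's last event

def init_node(node_id, t=0.0):
    """New nodes start with a zero memory vector and a last-update time."""
    memory[node_id] = [0.0] * MEMORY_DIM
    last_update[node_id] = t

init_node(0)
init_node(1, t=2.0)
```

Tracking `last_update` alongside the vector is what later makes staleness measurable.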
2. Embedding Module
The embedding module generates the current representation of a node based on its latest memory and interaction history. One key challenge it addresses is memory staleness, which occurs when a node has not been involved in any event for a long period. The purpose of the embedding module is to generate up-to-date node embeddings that avoid relying on stale memory.
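The actual TGN embedding module attends over a node's temporal neighborhood; the toy sketch below only illustrates the staleness idea, using an assumed exponential decay to down-weight memory that has not been refreshed recently:

```python
import math

def embed(node_memory, last_update_t, now, decay=0.1):
    # Toy illustration of staleness handling: exponentially shrink a
    # memory vector by the time elapsed since its last update. Real TGN
    # embeddings instead use temporal graph attention over neighbors.
    staleness = math.exp(-decay * (now - last_update_t))
    return [m * staleness for m in node_memory]
```

A node queried right after its last update keeps its full memory; queried much later, its embedding shrinks toward zero rather than presenting outdated state as current.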
3. Message Function
Every time an interaction between two nodes occurs, a message is computed for each of the involved nodes to update their memories. For an interaction between nodes i and j at time t, two messages are generated (one for each node). These messages contain relevant information such as:
- The features of the event
- The current embeddings of the involved nodes
- The time difference since the last interaction
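The ingredients listed above can be combined into raw messages by concatenation. In the general formulation a learnable message function (e.g. an MLP) is applied; the identity/concatenation case shown here is the simplest choice:

```python
def make_messages(mem_i, mem_j, event_features, dt_i, dt_j):
    # One message per endpoint, concatenating its own memory, the other
    # node's memory, the time since its own last interaction, and the
    # event features (identity message function).
    msg_i = mem_i + mem_j + [dt_i] + event_features
    msg_j = mem_j + mem_i + [dt_j] + event_features
    return msg_i, msg_j

msg_a, msg_b = make_messages([1.0], [2.0], [0.5], 0.1, 0.2)
```

Note the two messages are not identical: each endpoint leads with its own memory and its own time gap.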
4. Time Encoding
As interactions are not uniformly spaced in time, TGNs use a time encoding mechanism that quantifies the time gap between events. This allows the model to learn how recent or past interactions affect a node's current state. Time differences are encoded using functions such as sinusoidal or learned transformations.
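A fixed sinusoidal encoding of the time gap looks like the sketch below. TGN-style models typically use learned frequencies (in the spirit of Time2Vec); fixed, geometrically spaced frequencies are assumed here to keep the example self-contained:

```python
import math

def time_encode(dt, dim=8):
    # Map a scalar time gap to a dim-sized vector of cosines at
    # geometrically spaced frequencies (fixed here, learned in practice).
    return [math.cos(dt * 10.0 ** (-i / dim)) for i in range(dim)]
```

A zero time gap maps to an all-ones vector, and larger gaps produce distinguishable patterns across the frequency bands.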
5. Message Aggregator
In batch processing, it’s possible for a node to receive multiple messages in the same batch (from multiple interactions). To manage this, TGNs use a message aggregation function that combines these messages before updating the memory. Common aggregation strategies include:
- Most Recent: Use only the latest message for each node.
- Mean Aggregation: Average all messages received for a node.
This ensures that message updates remain consistent and efficient.
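The two strategies above can be sketched directly. Messages carry their timestamps so that "most recent" is well-defined within a batch:

```python
def aggregate_most_recent(messages):
    # messages: list of (timestamp, vector); keep only the newest one.
    return max(messages, key=lambda m: m[0])[1]

def aggregate_mean(messages):
    # Element-wise average of all message vectors for a node.
    vectors = [vec for _, vec in messages]
    return [sum(col) / len(vectors) for col in zip(*vectors)]

# Two messages for the same node arriving in one batch.
batch = [(1.0, [1.0, 0.0]), (2.0, [3.0, 2.0])]
```

"Most recent" is cheap and often sufficient; mean aggregation retains information from every event in the batch at the cost of blurring their order.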
6. Memory Updater
Once messages are aggregated, the memory of the involved nodes is updated using recurrent architectures like GRUs or other update functions. The memory updater modifies the node’s state to reflect the new information from the current event.
- For interaction events (communication between two users), the memories of both participating nodes are updated.
- For node-wise events (feature updates or new node creation), only the memory of the affected node is modified.
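In practice the updater is a learned recurrent cell, e.g. a GRU taking the aggregated message as input and the old memory as hidden state. The sketch below substitutes a single fixed gate so the mechanics stay visible without a deep-learning framework:

```python
def update_memory(old_memory, agg_message, gate=0.5):
    # Stand-in for a GRU update: blend the old state with the new
    # message using one fixed gate (a real GRU learns input-dependent
    # gates instead of this constant).
    msg = agg_message[:len(old_memory)]
    return [(1 - gate) * m + gate * x for m, x in zip(old_memory, msg)]
```

For an interaction event this function would be called once per endpoint; for a node-wise event, only once for the affected node.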
How TGNs Differ from Traditional GNNs
- Dynamic Structure Handling: Traditional GNNs assume a fixed graph structure; TGNs can model graphs that evolve over time with changing nodes and edges.
- Continuous Embedding Updates: TGNs update node embeddings in real time based on new events or interactions, enabling them to reflect the most recent context.
- Temporal Memory: Each node maintains a memory vector that stores historical information, allowing TGNs to capture long-term dependencies.
- Time-Aware Modeling: TGNs use time encoding to represent the temporal gap between interactions, enabling the model to learn from the timing and sequence of events.
- Streaming and Event-Based Learning: TGNs are optimized for event-driven data and can operate in a streaming setting, unlike static GNNs that require snapshot-based processing.
Applications of Temporal Graph Networks
- Fraud Detection: Financial transactions are time-sensitive. TGNs help detect unusual sequences of transactions or emerging fraud patterns.
- Social Network Prediction: Model how users interact over time. TGNs can predict who might become influential or detect community formation.
- Recommendation Systems: User preferences evolve. TGNs track these changes over time, improving recommendation accuracy.
- Cybersecurity: Analyze communication graphs over time to detect intrusions, suspicious access, or botnet behavior.
- Knowledge Graph Updates: Facts and relationships change. TGNs help in timely and context-aware updates of knowledge bases.