# DREAM Documentation

Dynamic Recall and Elastic Adaptive Memory: a PyTorch implementation of continuous-time RNN cells with surprise-driven plasticity and liquid time-constants.
## What is DREAM?

DREAM is a recurrent architecture that adapts online during inference. Whereas a trained LSTM or Transformer runs with frozen weights, a DREAM cell keeps learning as it reads: a prediction-error ("surprise") signal modulates both its synaptic plasticity and its integration speed.
## Core Components
- Surprise-Driven Plasticity — Hebbian learning modulated by prediction error
- Liquid Time-Constants (LTC) — Adaptive integration speeds
- Fast Weights — Low-rank decomposition for efficient meta-learning
- Sleep Consolidation — Memory stabilization during low-surprise periods
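
The `dream` package's internals aren't reproduced here, but the following self-contained sketch shows one way the four mechanisms above can interact in a single cell. Everything in it (`ToyDREAMCell`, the sigmoid time-constant gate, the slot-based low-rank Hebbian update, the consolidation threshold) is an illustrative assumption, not the library's API or algorithm:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyDREAMCell(nn.Module):
    """Conceptual sketch only; not the `dream` package's implementation."""

    def __init__(self, input_dim, hidden_dim, rank,
                 eta=0.1, decay=0.95, sleep_threshold=0.05):
        super().__init__()
        self.W_in = nn.Linear(input_dim, hidden_dim)
        self.W_rec = nn.Linear(hidden_dim, hidden_dim, bias=False)
        # Fast weights as a low-rank buffer of recent Hebbian pairs,
        # so the correction is U @ V instead of a full hidden x hidden matrix.
        self.register_buffer("U", torch.zeros(hidden_dim, rank))
        self.register_buffer("V", torch.zeros(rank, hidden_dim))
        self.slot = 0
        # Liquid time constant: a learned, state-dependent integration gate.
        self.tau_gate = nn.Linear(input_dim + hidden_dim, hidden_dim)
        # Predictor of the current input; its error is the "surprise" signal.
        self.predictor = nn.Linear(hidden_dim, input_dim)
        self.eta, self.decay, self.sleep_threshold = eta, decay, sleep_threshold

    def forward(self, x, h):
        # Surprise-driven plasticity: prediction error gates learning.
        surprise = F.mse_loss(self.predictor(h), x)
        # Recurrent drive = slow weights + low-rank fast-weight correction.
        pre = self.W_in(x) + self.W_rec(h) + h @ self.U @ self.V
        # Liquid time constant in (0, 1): per-unit integration speed.
        alpha = torch.sigmoid(self.tau_gate(torch.cat([x, h], dim=-1)))
        h_new = (1 - alpha) * h + alpha * torch.tanh(pre)
        with torch.no_grad():
            # Hebbian write, scaled by surprise: store the (post, pre)
            # activity pair in the next slot of the rank-r buffer.
            gain = self.eta * torch.tanh(surprise)
            self.U[:, self.slot] = gain * h_new.mean(dim=0)
            self.V[self.slot, :] = h.mean(dim=0)
            self.slot = (self.slot + 1) % self.V.shape[0]
            # Sleep consolidation: in low-surprise periods, fold the fast
            # weights into the slow recurrent matrix, then decay them.
            # (nn.Linear computes h @ weight.T, hence the transpose.)
            if surprise < self.sleep_threshold:
                self.W_rec.weight += 0.01 * (self.U @ self.V).T
                self.U *= self.decay
        return h_new, surprise

# Minimal usage: step the toy cell over a random stream.
cell = ToyDREAMCell(input_dim=80, hidden_dim=256, rank=16)
h = torch.zeros(4, 256)
for t in range(100):
    h, s = cell(torch.randn(4, 80), h)
```

The key coupling in this sketch is that `surprise` both gates the Hebbian write (high error, fast learning) and triggers consolidation (low error, stabilization), which is the loop the component list above describes.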
## Installation

```bash
pip install dreamnn
```

## Quick Start
```python
import torch
from dream import DREAMConfig, DREAMCell

config = DREAMConfig(
    input_dim=80,     # features per timestep
    hidden_dim=256,   # recurrent state size
    rank=16,          # rank of the fast-weight decomposition
)

cell = DREAMCell(config)
state = cell.init_state(batch_size=4)

x = torch.randn(4, 100, 80)  # (batch, time, features)
output, final_state = cell.forward_sequence(x, state, return_all=True)
```
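
Continuing the snippet above: with `return_all=True`, `output` plausibly holds the hidden state at every timestep, i.e. shape `(batch, time, hidden_dim)`. That shape convention and the 10-class task head below are assumptions for illustration; check the shapes in your installed version. A plain PyTorch readout then looks like:

```python
# Continues the Quick Start snippet. The (4, 100, 256) output shape and
# the 10-class task are assumptions, not documented guarantees.
num_classes = 10
readout = torch.nn.Linear(256, num_classes)  # hidden_dim -> classes

logits = readout(output)    # (4, 100, 10): per-timestep class scores
prediction = logits[:, -1]  # classify from the final timestep
```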
## Documentation

| Section | Description |
|---|---|
| Getting Started | Installation and basic usage |
| Architecture | How DREAM works |
| API Reference | Complete API documentation |
| Benchmarks | Performance comparison |
| Guides | Tutorials and examples |
## Resources
Version: 0.1.2 (March 2026) | License: MIT