Configuration Guide
Tune DREAM parameters for your specific use case
This guide helps you configure DREAM for different use cases and understand parameter effects.
Quick Configuration Presets
For ASR/Speech
```python
from dream import DREAMConfig

config = DREAMConfig(
    input_dim=39,        # 13 MFCC + 13 Δ + 13 ΔΔ
    hidden_dim=256,      # Good capacity
    rank=16,             # Moderate compression
    ltc_tau_sys=10.0,    # Standard integration
    base_threshold=0.5,  # Balanced sensitivity
)
```

Rationale:
- 39 input features match standard MFCC-plus-deltas front ends
- 256 hidden units provide sufficient capacity
- A moderate rank balances expressivity and efficiency
For Fast Adaptation
```python
config = DREAMConfig(
    input_dim=64,
    hidden_dim=128,
    rank=8,
    base_plasticity=0.2,      # Higher learning rate
    forgetting_rate=0.005,    # Slower forgetting
    ltc_surprise_scale=20.0,  # More dynamic τ
)
```

Rationale:
- Higher plasticity enables rapid learning
- Slower forgetting preserves important patterns
- Dynamic time constants respond quickly to novelty
For Long-Term Memory
```python
config = DREAMConfig(
    input_dim=64,
    hidden_dim=512,         # Large capacity
    rank=32,                # More expressivity
    ltc_tau_sys=20.0,       # Slower integration
    forgetting_rate=0.001,  # Very slow forgetting
    target_norm=3.0,        # Larger weights
)
```

Rationale:
- Large hidden dimension stores more information
- Slow integration maintains stable memories
- Very slow forgetting preserves long-term patterns
For Stability
```python
config = DREAMConfig(
    input_dim=64,
    hidden_dim=256,
    rank=16,
    time_step=0.05,        # Smaller dt
    ltc_tau_sys=15.0,      # Larger τ
    target_norm=1.5,       # Smaller weights
    error_smoothing=0.05,  # More smoothing
)
```

Rationale:
- Smaller time step for numerical stability
- Larger time constant for slow dynamics
- More smoothing reduces noise sensitivity
Parameter Effects Reference
Model Dimensions
| Parameter | Increase → | Decrease → | Recommended Range |
|---|---|---|---|
| `hidden_dim` | More capacity, slower | Less capacity, faster | 64-512 |
| `rank` | More expressivity | More compression | 4-32 |
Guidelines:
- Start with `hidden_dim=256` for most tasks
- Use `rank=16` as a baseline
- For compression: `rank=8` or lower
- For complex tasks: `hidden_dim=512`, `rank=32`
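To see why `rank` buys compression, here is a back-of-the-envelope sketch. It assumes the plastic recurrent matrix is stored as two rank-`r` factors (consistent with the `state.U` naming used later in this guide, but an assumption, not the library's documented internals):

```python
# Hypothetical illustration: a dense hidden_dim x hidden_dim matrix versus
# a rank-r factorization U @ V.T stored as two hidden_dim x rank factors.

def full_params(hidden_dim: int) -> int:
    """Parameters in a dense hidden_dim x hidden_dim matrix."""
    return hidden_dim * hidden_dim

def lowrank_params(hidden_dim: int, rank: int) -> int:
    """Parameters in two hidden_dim x rank factors."""
    return 2 * hidden_dim * rank

# At the baseline hidden_dim=256, rank=16: 65536 vs 8192 parameters.
compression = full_params(256) / lowrank_params(256, 16)  # 8.0x
```

Halving `rank` doubles the compression ratio but removes directions the plastic update can express, which is the trade the table above summarizes.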
Time Parameters
| Parameter | Increase → | Decrease → | Recommended Range |
|---|---|---|---|
| `time_step` | Faster dynamics | Slower dynamics | 0.01-0.2 |
Guidelines:
- The default `time_step=0.1` works for most cases
- For stability: reduce to `0.05`
- For fast response: increase to `0.15`
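The stability benefit of a smaller `time_step` can be seen on a toy leaky integrator. This is plain forward Euler on dx/dt = -x/τ, illustrative only, not DREAM's actual update rule:

```python
import math

def euler_decay(x0: float, tau: float, dt: float, steps: int) -> float:
    """Forward-Euler integration of dx/dt = -x / tau."""
    x = x0
    for _ in range(steps):
        x += dt * (-x / tau)
    return x

# Integrate the same 1.0 units of simulated time with two step sizes.
exact = math.exp(-1.0 / 10.0)  # true solution at t = 1 for tau = 10
coarse = euler_decay(1.0, tau=10.0, dt=0.1, steps=10)
fine = euler_decay(1.0, tau=10.0, dt=0.05, steps=20)
# The smaller step lands closer to the exact solution.
```

The same logic explains the stability preset above: halving `dt` halves the per-step change and tightens the discretization error.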
Plasticity Parameters
| Parameter | Increase → | Decrease → | Recommended Range |
|---|---|---|---|
| `base_plasticity` | Faster learning | Slower learning | 0.05-0.3 |
| `forgetting_rate` | Faster forgetting | Slower forgetting | 0.001-0.05 |
Guidelines:
- High plasticity for online learning: `0.2`-`0.3`
- Low plasticity for stable tasks: `0.05`-`0.1`
- Slow forgetting for long-term memory: `0.001`-`0.005`
- Fast forgetting for non-stationary data: `0.02`-`0.05`
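If forgetting acts as a per-step exponential decay `w ← (1 - forgetting_rate) * w` (an assumption for illustration), the recommended range maps directly to a memory half-life:

```python
import math

def half_life_steps(forgetting_rate: float) -> float:
    """Steps for a trace to halve under w <- (1 - forgetting_rate) * w."""
    return math.log(0.5) / math.log(1.0 - forgetting_rate)

slow = half_life_steps(0.001)  # ~693 steps: long-term memory regime
fast = half_life_steps(0.05)   # ~14 steps: non-stationary data regime
```

The two ends of the recommended range differ by roughly 50x in half-life, which is why the long-term-memory and online-learning presets sit at opposite ends.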
Surprise Parameters
| Parameter | Increase → | Decrease → | Recommended Range |
|---|---|---|---|
| `base_threshold` | Less sensitive | More sensitive | 0.3-0.7 |
| `entropy_influence` | More uncertainty weighting | Less uncertainty weighting | 0.1-0.5 |
| `surprise_temperature` | Smoother gating | Sharper gating | 0.05-0.2 |
Guidelines:
- Low threshold for novelty detection: `0.3`
- High threshold for stable processing: `0.7`
- Higher temperature for smoother adaptation: `0.15`-`0.2`
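One way to picture how `base_threshold` and `surprise_temperature` interact is a sigmoid gate centered on the threshold. The exact gating function is not specified in this guide, so treat this as a hypothetical sketch:

```python
import math

def surprise_gate(surprise: float, base_threshold: float = 0.5,
                  surprise_temperature: float = 0.1) -> float:
    """Soft gate in (0, 1); lower temperature means a sharper switch
    around base_threshold, higher temperature a smoother ramp."""
    z = (surprise - base_threshold) / surprise_temperature
    return 1.0 / (1.0 + math.exp(-z))

# Same surprise, different temperatures:
sharp = surprise_gate(0.6, surprise_temperature=0.05)  # near-binary response
smooth = surprise_gate(0.6, surprise_temperature=0.2)  # gentler response
```

At `surprise == base_threshold` the gate sits at 0.5 regardless of temperature; temperature only controls how quickly it saturates on either side.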
Liquid Time-Constant Parameters
| Parameter | Increase → | Decrease → | Recommended Range |
|---|---|---|---|
| `ltc_tau_sys` | Slower integration | Faster integration | 5.0-30.0 |
| `ltc_surprise_scale` | More dynamic τ | Less dynamic τ | 5.0-30.0 |
Guidelines:
- Large `ltc_tau_sys` for memory tasks: `20.0`-`30.0`
- Small `ltc_tau_sys` for fast response: `5.0`-`10.0`
- High scale for strong adaptation: `15.0`-`25.0`
- Disable LTC for standard RNN behavior: `ltc_enabled=False`
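A minimal sketch of a surprise-modulated ("liquid") time constant, assuming the hypothetical form τ_eff = τ_sys / (1 + scale · surprise). DREAM's actual formula may differ; the point is only the direction of the effect:

```python
def effective_tau(tau_sys: float, surprise: float,
                  surprise_scale: float) -> float:
    """Hypothetical liquid time constant: surprise shrinks tau, so
    novel input is integrated faster; calm input keeps tau long."""
    return tau_sys / (1.0 + surprise_scale * surprise)

calm = effective_tau(10.0, surprise=0.0, surprise_scale=20.0)   # 10.0
novel = effective_tau(10.0, surprise=0.5, surprise_scale=20.0)  # ~0.9
```

This is why the fast-adaptation preset raises `ltc_surprise_scale`: a larger scale makes the same surprise shorten τ more aggressively.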
Smoothing Parameters
| Parameter | Increase → | Decrease → | Recommended Range |
|---|---|---|---|
| `error_smoothing` | More smoothing | Less smoothing | 0.01-0.1 |
| `surprise_smoothing` | More smoothing | Less smoothing | 0.01-0.1 |
Guidelines:
- High smoothing for noisy data: `0.05`-`0.1`
- Low smoothing for clean data: `0.01`-`0.02`
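Both parameters plausibly act as exponential-moving-average coefficients. The sketch below uses `alpha` as the weight on the new sample; whether DREAM's parameters map to this weight or to its complement is an assumption:

```python
def ema(values, alpha: float):
    """Exponential moving average; with alpha as the new-sample weight,
    a smaller alpha means heavier smoothing of the input stream."""
    smoothed, out = values[0], []
    for v in values:
        smoothed = (1.0 - alpha) * smoothed + alpha * v
        out.append(smoothed)
    return out

noisy = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
heavy = ema(noisy, alpha=0.05)  # barely moves off the running level
light = ema(noisy, alpha=0.5)   # tracks the oscillation
```

Heavier smoothing suppresses sample-to-sample noise in the error and surprise signals at the cost of reacting later to genuine shifts.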
Homeostasis Parameters
| Parameter | Increase → | Decrease → | Recommended Range |
|---|---|---|---|
| `target_norm` | Larger weights | Smaller weights | 1.0-5.0 |
| `kappa` | Stronger homeostasis | Weaker homeostasis | 0.1-1.0 |
Guidelines:
- Larger norm for complex tasks: `3.0`-`5.0`
- Smaller norm for stability: `1.0`-`2.0`
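Homeostasis can be pictured as the weight norm relaxing toward `target_norm` at a rate set by `kappa`. This first-order sketch is an assumption for intuition, not the library's exact rule:

```python
def homeostatic_step(norm: float, target_norm: float, kappa: float,
                     dt: float = 0.1) -> float:
    """One Euler step of d||W||/dt = kappa * (target_norm - ||W||)."""
    return norm + dt * kappa * (target_norm - norm)

# A norm that has drifted to 5.0 is pulled back toward target_norm=2.0:
norm = 5.0
for _ in range(500):
    norm = homeostatic_step(norm, target_norm=2.0, kappa=0.5)
# norm has relaxed to ~2.0; a larger kappa would pull it back faster
```

This also clarifies the stability preset: a smaller `target_norm` keeps the plastic weights in a regime where the recurrent dynamics are less likely to blow up.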
Sleep Consolidation Parameters
| Parameter | Increase → | Decrease → | Recommended Range |
|---|---|---|---|
| `sleep_rate` | Faster consolidation | Slower consolidation | 0.001-0.01 |
| `min_surprise_for_sleep` | Less frequent consolidation | More frequent consolidation | 0.1-0.5 |
Guidelines:
- Enable for long-running models
- Disable for short training runs
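A hypothetical sketch of how the two parameters could interact: once average surprise crosses `min_surprise_for_sleep`, a small `sleep_rate` fraction of the fast (plastic) weights is folded into a slow store. None of this is DREAM's documented mechanism; it only illustrates the parameter directions in the table:

```python
def maybe_consolidate(fast, slow, avg_surprise,
                      sleep_rate=0.005, min_surprise_for_sleep=0.3):
    """Blend fast weights into the slow store when enough surprise has
    accumulated; below the threshold, the slow store is untouched."""
    if avg_surprise >= min_surprise_for_sleep:
        return [(1.0 - sleep_rate) * s + sleep_rate * f
                for s, f in zip(slow, fast)]
    return list(slow)

consolidated = maybe_consolidate([1.0, 2.0], [0.0, 0.0], avg_surprise=0.5)
unchanged = maybe_consolidate([1.0, 2.0], [0.0, 0.0], avg_surprise=0.1)
```

Raising `min_surprise_for_sleep` makes the trigger rarer; raising `sleep_rate` moves more weight per consolidation event, matching the table above.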
Configuration by Task
Speech Recognition
```python
config = DREAMConfig(
    input_dim=39,  # MFCC + deltas
    hidden_dim=256,
    rank=16,
    ltc_enabled=True,
    ltc_tau_sys=10.0,
    base_plasticity=0.1,
    forgetting_rate=0.01,
)
```

Time Series Forecasting
```python
config = DREAMConfig(
    input_dim=10,  # Number of features
    hidden_dim=128,
    rank=8,
    ltc_enabled=True,
    ltc_tau_sys=15.0,       # Slower integration
    base_plasticity=0.15,
    forgetting_rate=0.005,  # Slow forgetting
)
```

Anomaly Detection
```python
config = DREAMConfig(
    input_dim=20,
    hidden_dim=128,
    rank=16,
    ltc_enabled=True,
    base_threshold=0.3,         # More sensitive
    surprise_temperature=0.15,
    base_plasticity=0.2,        # Fast adaptation
)
```

Language Modeling
```python
config = DREAMConfig(
    input_dim=300,  # Embedding dimension
    hidden_dim=512,
    rank=32,
    ltc_enabled=True,
    ltc_tau_sys=20.0,       # Long-term dependencies
    forgetting_rate=0.001,  # Very slow forgetting
    target_norm=3.0,
)
```

Online Learning
```python
config = DREAMConfig(
    input_dim=64,
    hidden_dim=256,
    rank=16,
    ltc_enabled=True,
    ltc_surprise_scale=20.0,  # High adaptivity
    base_plasticity=0.2,
    forgetting_rate=0.02,     # Faster forgetting
    base_threshold=0.4,
)
```

Tuning Workflow
Step 1: Start with Defaults
```python
config = DREAMConfig(
    input_dim=64,
    hidden_dim=256,
    rank=16,
)
```

Step 2: Monitor Key Metrics
```python
# During training
print(f"Loss: {loss.item():.4f}")
print(f"Surprise: {state.avg_surprise.mean().item():.4f}")
print(f"U norm: {state.U.norm().item():.4f}")
```

Step 3: Adjust Based on Behavior
| Symptom | Adjustment |
|---|---|
| Loss not decreasing | Increase `base_plasticity` |
| Too much oscillation | Decrease `time_step`, increase `ltc_tau_sys` |
| Forgets too quickly | Decrease `forgetting_rate` |
| Doesn't adapt | Increase `base_plasticity`, decrease `base_threshold` |
| Unstable training | Decrease `time_step`, decrease `target_norm` |
Step 4: Fine-tune
Once stable, fine-tune for your specific metric:
- Accuracy: increase `hidden_dim`, `rank`
- Speed: decrease `hidden_dim`, `rank`
- Memory: decrease `rank`, enable sleep consolidation
- Adaptation: increase `base_plasticity`, `ltc_surprise_scale`
Next Steps
- Training Best Practices - Optimize training
- Usage Patterns - Common patterns
- Examples - Real-world examples