Manifestro Docs

Configuration Guide

Tune DREAM parameters for your specific use case

This guide helps you configure DREAM for different use cases and understand parameter effects.

Quick Configuration Presets

For ASR/Speech

from dream import DREAMConfig

config = DREAMConfig(
    input_dim=39,       # 13 MFCC + 13Δ + 13ΔΔ
    hidden_dim=256,     # Good capacity
    rank=16,            # Moderate compression
    ltc_tau_sys=10.0,   # Standard integration
    base_threshold=0.5, # Balanced sensitivity
)

Rationale:

  • 39 input features match 13 MFCCs plus their first (Δ) and second (ΔΔ) deltas
  • 256 hidden units provide sufficient capacity
  • Moderate rank balances expressivity and efficiency
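The 39-dimensional layout can be checked with a short sketch. Real pipelines usually compute deltas by linear regression over a small window; simple frame-to-frame differences stand in for them here, and `add_deltas` is an illustrative helper, not part of DREAM:

```python
def add_deltas(frames):
    # Append first (Δ) and second (ΔΔ) differences to each 13-dim MFCC frame.
    prev = [frames[0]] + frames[:-1]
    delta = [[c - p for c, p in zip(cur, pre)] for cur, pre in zip(frames, prev)]
    prev_d = [delta[0]] + delta[:-1]
    delta2 = [[c - p for c, p in zip(cur, pre)] for cur, pre in zip(delta, prev_d)]
    return [m + d + dd for m, d, dd in zip(frames, delta, delta2)]

frames = [[float(i)] * 13 for i in range(100)]   # 100 frames of 13 MFCCs
features = add_deltas(frames)
print(len(features), len(features[0]))           # 100 frames x 39 features
```

Each output frame is 13 + 13 + 13 = 39 values, matching `input_dim=39` above.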

For Fast Adaptation

config = DREAMConfig(
    input_dim=64,
    hidden_dim=128,
    rank=8,
    base_plasticity=0.2,    # Higher learning rate
    forgetting_rate=0.005,  # Slower forgetting
    ltc_surprise_scale=20.0, # More dynamic τ
)

Rationale:

  • Higher plasticity enables rapid learning
  • Slower forgetting preserves important patterns
  • Dynamic time constants respond quickly to novelty

For Long-Term Memory

config = DREAMConfig(
    input_dim=64,
    hidden_dim=512,     # Large capacity
    rank=32,            # More expressivity
    ltc_tau_sys=20.0,   # Slower integration
    forgetting_rate=0.001,  # Very slow forgetting
    target_norm=3.0,    # Larger weights
)

Rationale:

  • Large hidden dimension stores more information
  • Slow integration maintains stable memories
  • Very slow forgetting preserves long-term patterns

For Stability

config = DREAMConfig(
    input_dim=64,
    hidden_dim=256,
    rank=16,
    time_step=0.05,     # Smaller dt
    ltc_tau_sys=15.0,   # Larger τ
    target_norm=1.5,    # Smaller weights
    error_smoothing=0.05,  # More smoothing
)

Rationale:

  • Smaller time step for numerical stability
  • Larger time constant for slow dynamics
  • More smoothing reduces noise sensitivity

Parameter Effects Reference

Model Dimensions

| Parameter  | Increase →            | Decrease →            | Recommended Range |
|------------|-----------------------|-----------------------|-------------------|
| hidden_dim | More capacity, slower | Less capacity, faster | 64-512            |
| rank       | More expressivity     | More compression      | 4-32              |

Guidelines:

  • Start with hidden_dim=256 for most tasks
  • Use rank=16 as a baseline
  • For compression: rank=8 or lower
  • For complex tasks: hidden_dim=512, rank=32
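The parameter savings from a low rank are easy to quantify. Assuming rank parameterizes a low-rank factorization of the recurrent weights (a U·Vᵀ pair, an assumption about the internals):

```python
hidden_dim, rank = 256, 16

full_params = hidden_dim * hidden_dim   # dense 256x256 recurrent matrix
lowrank_params = 2 * hidden_dim * rank  # U (256x16) and V (256x16) factors

print(full_params, lowrank_params)                          # 65536 vs 8192
print(f"compression: {full_params / lowrank_params:.0f}x")  # 8x
```

Dropping to rank=8 doubles the compression again, at the cost of expressivity.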

Time Parameters

| Parameter | Increase →      | Decrease →      | Recommended Range |
|-----------|-----------------|-----------------|-------------------|
| time_step | Faster dynamics | Slower dynamics | 0.01-0.2          |

Guidelines:

  • Default time_step=0.1 works for most cases
  • For stability: reduce to 0.05
  • For fast response: increase to 0.15
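Why a smaller time step helps stability: forward-Euler integration of a leaky unit overshoots and diverges once the step exceeds twice the time constant. A minimal illustration (generic Euler integration, not DREAM's actual solver):

```python
def decay(dt, tau=1.0, steps=50):
    # forward-Euler steps of h' = -h / tau, starting from h = 1
    h = 1.0
    for _ in range(steps):
        h += dt * (-h / tau)
    return h

print(decay(0.1))   # small dt: smooth decay toward zero
print(decay(2.5))   # dt > 2*tau: every step overshoots, |h| blows up
```

The same trade-off holds inside any recurrent integrator: smaller `time_step` costs more steps per unit of simulated time but keeps the dynamics well-behaved.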

Plasticity Parameters

| Parameter       | Increase →        | Decrease →        | Recommended Range |
|-----------------|-------------------|-------------------|-------------------|
| base_plasticity | Faster learning   | Slower learning   | 0.05-0.3          |
| forgetting_rate | Faster forgetting | Slower forgetting | 0.001-0.05        |

Guidelines:

  • High plasticity for online learning: 0.2-0.3
  • Low plasticity for stable tasks: 0.05-0.1
  • Slow forgetting for long-term memory: 0.001-0.005
  • Fast forgetting for non-stationary data: 0.02-0.05
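A useful way to pick forgetting_rate is in terms of half-life. Assuming forgetting applies a multiplicative (1 - forgetting_rate) decay per step (an assumption about the internals), the number of steps until a memory trace halves is:

```python
import math

rates = (0.001, 0.01, 0.05)
half_life = {r: math.log(2) / -math.log(1.0 - r) for r in rates}
for r in rates:
    print(f"forgetting_rate={r}: trace halves in ~{half_life[r]:.0f} steps")
```

So the recommended range spans roughly 14-step to 700-step memory horizons; pick the rate whose half-life matches how long patterns in your data stay relevant.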

Surprise Parameters

| Parameter            | Increase →                 | Decrease →                 | Recommended Range |
|----------------------|----------------------------|----------------------------|-------------------|
| base_threshold       | Less sensitive             | More sensitive             | 0.3-0.7           |
| entropy_influence    | More uncertainty weighting | Less uncertainty weighting | 0.1-0.5           |
| surprise_temperature | Smoother gating            | Sharper gating             | 0.05-0.2          |

Guidelines:

  • Low threshold for novelty detection: 0.3
  • High threshold for stable processing: 0.7
  • Higher temperature for smoother adaptation: 0.15-0.2
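The threshold/temperature interplay is consistent with a sigmoid gate over surprise. A sketch of that shape (the exact gate inside DREAM may differ):

```python
import math

def surprise_gate(surprise, base_threshold=0.5, surprise_temperature=0.1):
    # sigmoid gate: opens as surprise rises past the threshold;
    # higher temperature -> smoother transition, lower -> sharper
    return 1.0 / (1.0 + math.exp(-(surprise - base_threshold) / surprise_temperature))

print(round(surprise_gate(0.2), 3))   # well below threshold: gate nearly closed
print(round(surprise_gate(0.8), 3))   # well above threshold: gate nearly open
```

Lowering `base_threshold` shifts the whole curve left (more events count as surprising); raising `surprise_temperature` flattens it (partial credit for borderline surprise).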

Liquid Time-Constant Parameters

| Parameter          | Increase →         | Decrease →         | Recommended Range |
|--------------------|--------------------|--------------------|-------------------|
| ltc_tau_sys        | Slower integration | Faster integration | 5.0-30.0          |
| ltc_surprise_scale | More dynamic τ     | Less dynamic τ     | 5.0-30.0          |

Guidelines:

  • Large ltc_tau_sys for memory tasks: 20.0-30.0
  • Small ltc_tau_sys for fast response: 5.0-10.0
  • High scale for strong adaptation: 15.0-25.0
  • Disable LTC for standard RNN behavior: ltc_enabled=False
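To build intuition for how these two parameters interact, here is one plausible form of surprise-modulated time constants; the actual LTC update inside DREAM is internal and may differ:

```python
def effective_tau(surprise, ltc_tau_sys=10.0, ltc_surprise_scale=20.0):
    # one plausible modulation: surprise shrinks the effective time constant,
    # so novel inputs are integrated faster; ltc_surprise_scale sets how much
    return ltc_tau_sys / (1.0 + ltc_surprise_scale * surprise)

print(effective_tau(0.0))   # calm input: tau stays at ltc_tau_sys
print(effective_tau(0.5))   # surprising input: much faster integration
```

Under this reading, `ltc_tau_sys` sets the baseline memory horizon and `ltc_surprise_scale` sets how aggressively novelty shortens it.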

Smoothing Parameters

| Parameter          | Increase →     | Decrease →     | Recommended Range |
|--------------------|----------------|----------------|-------------------|
| error_smoothing    | More smoothing | Less smoothing | 0.01-0.1          |
| surprise_smoothing | More smoothing | Less smoothing | 0.01-0.1          |

Guidelines:

  • High smoothing for noisy data: 0.05-0.1
  • Low smoothing for clean data: 0.01-0.02
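Both parameters trade noise suppression against responsiveness. A generic exponential moving average illustrates the trade-off; the update rate `alpha` here is illustrative, and the mapping from DREAM's smoothing parameters onto it is internal (note that in this sketch a smaller `alpha` means heavier smoothing):

```python
def ema(values, alpha):
    # exponential moving average over a stream of values
    s, out = values[0], []
    for v in values:
        s = (1.0 - alpha) * s + alpha * v
        out.append(s)
    return out

noisy = [0.0, 1.0] * 10          # alternating signal around a mean of 0.5
print(ema(noisy, 0.05)[-1])      # heavy smoothing: hovers near the running mean
print(ema(noisy, 0.8)[-1])       # light smoothing: chases every sample
```

Heavier smoothing gives a cleaner error/surprise estimate but reacts more slowly to genuine regime changes.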

Homeostasis Parameters

| Parameter   | Increase →            | Decrease →         | Recommended Range |
|-------------|-----------------------|--------------------|-------------------|
| target_norm | Larger weights        | Smaller weights    | 1.0-5.0           |
| kappa       | Stronger homeostasis  | Weaker homeostasis | 0.1-1.0           |

Guidelines:

  • Larger norm for complex tasks: 3.0-5.0
  • Smaller norm for stability: 1.0-2.0
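One simple form of norm homeostasis makes the roles of the two parameters concrete; DREAM's exact rule is internal, so treat this as a sketch:

```python
def homeostatic_scale(weight_norm, target_norm=2.0, kappa=0.5):
    # nudge the weight norm toward target_norm; kappa sets the pull strength
    return 1.0 + kappa * (target_norm - weight_norm) / weight_norm

print(homeostatic_scale(4.0))  # norm too large -> scale < 1, weights shrink
print(homeostatic_scale(1.0))  # norm too small -> scale > 1, weights grow
```

With `kappa` near 1.0 the norm is pulled back sharply each step; near 0.1 the correction is gentle and the weights can drift further from `target_norm` before being reined in.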

Sleep Consolidation Parameters

| Parameter              | Increase →                  | Decrease →                  | Recommended Range |
|------------------------|-----------------------------|-----------------------------|-------------------|
| sleep_rate             | Faster consolidation        | Slower consolidation        | 0.001-0.01        |
| min_surprise_for_sleep | Less frequent consolidation | More frequent consolidation | 0.1-0.5           |

Guidelines:

  • Enable for long-running models
  • Disable for short training runs
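The threshold's effect on consolidation frequency follows directly from its role as a trigger. A minimal sketch of that gating logic (the trigger condition inside DREAM may be more elaborate):

```python
def should_consolidate(avg_surprise, min_surprise_for_sleep=0.3):
    # consolidation fires only once enough surprise has accumulated;
    # raising the threshold therefore makes consolidation less frequent
    return avg_surprise >= min_surprise_for_sleep

print(should_consolidate(0.1))  # quiet period: nothing new to consolidate
print(should_consolidate(0.5))  # enough novelty seen: consolidate
```

Once triggered, `sleep_rate` would then set how quickly recent changes are folded into stable weights per consolidation step.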

Configuration by Task

Speech Recognition

config = DREAMConfig(
    input_dim=39,           # MFCC + deltas
    hidden_dim=256,
    rank=16,
    ltc_enabled=True,
    ltc_tau_sys=10.0,
    base_plasticity=0.1,
    forgetting_rate=0.01,
)

Time Series Forecasting

config = DREAMConfig(
    input_dim=10,           # Number of features
    hidden_dim=128,
    rank=8,
    ltc_enabled=True,
    ltc_tau_sys=15.0,       # Slower integration
    base_plasticity=0.15,
    forgetting_rate=0.005,  # Slow forgetting
)

Anomaly Detection

config = DREAMConfig(
    input_dim=20,
    hidden_dim=128,
    rank=16,
    ltc_enabled=True,
    base_threshold=0.3,     # More sensitive
    surprise_temperature=0.15,
    base_plasticity=0.2,    # Fast adaptation
)

Language Modeling

config = DREAMConfig(
    input_dim=300,          # Embedding dimension
    hidden_dim=512,
    rank=32,
    ltc_enabled=True,
    ltc_tau_sys=20.0,       # Long-term dependencies
    forgetting_rate=0.001,  # Very slow forgetting
    target_norm=3.0,
)

Online Learning

config = DREAMConfig(
    input_dim=64,
    hidden_dim=256,
    rank=16,
    ltc_enabled=True,
    ltc_surprise_scale=20.0,  # High adaptivity
    base_plasticity=0.2,
    forgetting_rate=0.02,     # Faster forgetting
    base_threshold=0.4,
)

Tuning Workflow

Step 1: Start with Defaults

config = DREAMConfig(
    input_dim=64,
    hidden_dim=256,
    rank=16,
)

Step 2: Monitor Key Metrics

# During training
print(f"Loss: {loss.item():.4f}")
print(f"Surprise: {state.avg_surprise.mean().item():.4f}")
print(f"U norm: {state.U.norm().item():.4f}")

Step 3: Adjust Based on Behavior

| Symptom              | Adjustment                                      |
|----------------------|-------------------------------------------------|
| Loss not decreasing  | Increase base_plasticity                        |
| Too much oscillation | Decrease time_step, increase ltc_tau_sys        |
| Forgets too quickly  | Decrease forgetting_rate                        |
| Doesn't adapt        | Increase base_plasticity, decrease base_threshold |
| Unstable training    | Decrease time_step, decrease target_norm        |

Step 4: Fine-tune

Once stable, fine-tune for your specific metric:

  • Accuracy: Increase hidden_dim, rank
  • Speed: Decrease hidden_dim, rank
  • Memory: Decrease rank, enable sleep consolidation
  • Adaptation: Increase base_plasticity, ltc_surprise_scale
