AI Glossary

State Space Model

A sequence model architecture based on continuous state space representations, offering computation that scales linearly with sequence length.

Overview

State space models (SSMs) are a class of sequence models inspired by continuous-time dynamical systems, offering an alternative to transformers for processing sequential data. They model sequences through hidden states that evolve according to learned linear dynamics, combined with nonlinear transformations.
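The linear state dynamics described above can be sketched as a simple recurrence. This is a hypothetical toy example with scalar parameters; the names ssm_scan, A, B, and C are illustrative stand-ins for the learned (typically matrix-valued or structured) parameters of a real SSM layer.

```python
# Minimal discrete-time state space recurrence (toy sketch):
#   h_t = A * h_{t-1} + B * x_t   (linear state update)
#   y_t = C * h_t                 (linear readout)
# Real SSM layers use structured matrices and add nonlinear mixing
# between layers; scalars are used here only to show the mechanics.

def ssm_scan(xs, A=0.9, B=1.0, C=0.5, h0=0.0):
    """Run the linear recurrence over a sequence xs in O(len(xs)) time."""
    h, ys = h0, []
    for x in xs:                  # one state update per input step
        h = A * h + B * x         # state evolves under fixed linear dynamics
        ys.append(C * h)          # output is a linear readout of the state
    return ys

# Impulse response: a single nonzero input decays geometrically through
# the state, since each step multiplies the state by A.
ys = ssm_scan([1.0, 0.0, 0.0, 0.0])
```

Because each step depends only on the previous hidden state, the whole sequence is processed in a single linear pass, which is the source of the efficiency advantage discussed below.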

Key Details

SSMs like S4 (Structured State Space sequence model) and Mamba achieve linear computational complexity in sequence length (versus quadratic for standard attention), making them efficient for very long sequences. Mamba adds selective (input-dependent) state transitions, achieving transformer-level performance on language tasks. SSMs are increasingly used for long-context modeling, genomics, audio processing, and as building blocks in hybrid architectures.
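The selective (input-dependent) transitions mentioned above can be illustrated with a toy gating scheme. This is only a hypothetical sketch of the idea, not Mamba's actual parameterization: the names selective_scan, w_a, and w_b are invented here, and the assumption is simply that the state decay and write strength are computed from the current input rather than fixed.

```python
import math

# Toy "selective" state update: the decay a_t and write gate b_t are
# functions of the current input x_t, so the model can choose per step
# what to retain in the state and what to overwrite. Mamba's real
# selectivity mechanism is more elaborate; this only shows the principle.

def selective_scan(xs, w_a=1.0, w_b=1.0, C=1.0, h0=0.0):
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    h, ys = h0, []
    for x in xs:
        a_t = sigmoid(w_a * x)    # input-dependent state decay in (0, 1)
        b_t = sigmoid(w_b * x)    # input-dependent write strength
        h = a_t * h + b_t * x     # transition now depends on the input
        ys.append(C * h)
    return ys
```

The contrast with the fixed-dynamics recurrence is that here the transition coefficients change from step to step, which lets the model ignore some inputs and latch onto others while keeping the linear-time scan.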

Related Concepts

mamba · transformer · recurrent neural network
Last updated: March 5, 2026