AI Glossary

Reservoir Computing

A framework using a fixed random recurrent network (reservoir) with only the output layer trained.

Overview

Reservoir computing is a computational framework where input signals are fed into a large, fixed, randomly connected recurrent neural network called the reservoir. Only the output weights are trained, making the approach computationally efficient compared to training full recurrent neural networks.

Key Details

The reservoir transforms inputs into high-dimensional representations that capture temporal dynamics. Echo State Networks (ESNs) and Liquid State Machines (LSMs) are the two main variants. Reservoir computing excels at time-series prediction, speech recognition, and signal processing tasks, and has inspired neuromorphic hardware implementations.
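The idea can be made concrete with a minimal Echo State Network sketch. This is an illustrative toy, not a reference implementation: the reservoir size, spectral radius, washout length, and ridge coefficient below are arbitrary choices for demonstration, and the task (one-step-ahead sine-wave prediction) is a stand-in for a real time series.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (assumed, not canonical)
n_inputs, n_reservoir = 1, 100
spectral_radius = 0.9   # keeps reservoir dynamics stable (echo-state property)
ridge = 1e-6            # regularization for the output-weight fit
washout = 50            # initial states discarded as transient

# Fixed random weights: these are NEVER trained
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))  # rescale spectral radius

def run_reservoir(inputs):
    """Drive the reservoir with a 1-D input sequence; return the state at each step."""
    states = np.zeros((len(inputs), n_reservoir))
    x = np.zeros(n_reservoir)
    for t, u in enumerate(inputs):
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states[t] = x
    return states

# Toy task: predict the next sample of a sine wave
sig = np.sin(np.arange(0, 30, 0.1))
u_train, y_train = sig[:-1], sig[1:]

X = run_reservoir(u_train)[washout:]
Y = y_train[washout:]

# The ONLY training step: ridge regression on the output weights
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ Y)

pred = X @ W_out
mse = np.mean((pred - Y) ** 2)
```

Note that training reduces to a single linear solve, which is the efficiency argument made above: the recurrent weights stay fixed, so no backpropagation through time is needed.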

Related Concepts

recurrent neural network, neural network, long short-term memory


Last updated: March 5, 2026