Optimizing the Reservoir of a Recurrent Neural Network Using Gradient-Free Methods
Abstract
Reservoir computing is a framework for training recurrent neural networks in which the recurrent weight matrix, called the reservoir, is typically generated at random, and only the output weights are trained. This approach has proven effective in applications such as speech recognition and chaotic time series prediction, leveraging the reservoir's high-dimensional state space and its ability to capture temporal dependencies. Despite the success of this paradigm, we postulate that a randomly generated reservoir that has weights scaled as