INEVITABLE ENERGY COSTS OF STORAGE CAPACITY ENHANCEMENT IN AN OSCILLATORY NEURAL NETWORK
Date Issued
01-01-2003
Author(s)
Udayshankar, M.
Chakravarthy, V. S.
Mohan, Vishwanathan
Abstract
Is there an inevitable energy cost to computation? Raising this important question, Landauer (1961) argued that irreversible computational processes carry an inevitable "thermodynamic cost", suggesting a deep link between the amount of energy spent by a computing device and its "informational work." Our previous studies [1] on the possibility of such a link in neural network models showed a consistent correlation between energy dissipated and performance in a Hopfield neural network (HNN). In the present paper, we demonstrate a similar result in a Complex Hopfield Neural Network (CHNN) [2, 3], an associative memory in which patterns are stored as oscillations. In the CHNN, however, perfect retrieval is observed only when a single pattern is stored. When multiple patterns are stored, the network often wanders from one stored pattern to another without settling on any single pattern, resulting in unacceptably low storage capacity. We found that allowing the weights to adapt even during retrieval dramatically enhances storage capacity, but this enhanced capacity carries an energetic cost: comparing circuit implementations of the network with fixed and adaptive weights, we found that the adaptive case involves greater power dissipation. The same result is confirmed over a range of P, the number of patterns stored in the network.
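The associative-memory mechanics the abstract refers to can be illustrated with a minimal sketch of the conventional binary HNN that the CHNN is compared against (the oscillatory, complex-valued dynamics and the adaptive-weight retrieval of the paper are not reproduced here). All function names and parameter values below are illustrative assumptions, not the authors' implementation: patterns are stored with the Hebbian outer-product rule, and asynchronous updates retrieve a pattern while the network energy never increases.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_hebbian(patterns):
    # Hebbian outer-product rule; zero diagonal removes self-connections
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, s):
    # Standard Hopfield energy; non-increasing under asynchronous updates
    return -0.5 * s @ W @ s

def retrieve(W, s, sweeps=5):
    # Asynchronous sign updates in random order
    s = s.copy()
    n = len(s)
    for _ in range(sweeps):
        for i in rng.permutation(n):
            s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s

# Illustrative sizes: 64 neurons, 3 stored patterns (well below capacity)
n, P = 64, 3
patterns = rng.choice([-1.0, 1.0], size=(P, n))
W = train_hebbian(patterns)

# Corrupt a stored pattern, then retrieve from the noisy probe
probe = patterns[0].copy()
flips = rng.choice(n, size=8, replace=False)
probe[flips] *= -1

e_before = energy(W, probe)
out = retrieve(W, probe)
e_after = energy(W, out)
print(e_after <= e_before)  # energy is non-increasing during retrieval
```

The energy decrease during retrieval is the quantity whose circuit-level analogue (power dissipation) the paper relates to storage performance; in the fixed-weight case sketched here, `W` stays constant, whereas the paper's adaptive scheme continues to modify the weights during retrieval.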
Volume
2