Flip Flop Neural Networks: Modelling Memory for Efficient Forecasting
Date Issued
01-01-2021
Author(s)
Sujith Kumar, S.
Vigneswaran, C.
Srinivasa Chakravarthy, V.
Abstract
Flip flop circuits can memorize information through their bi-stable dynamics. Inspired by the flip flop circuits used in digital electronics, in this work we define a flip flop neuron and construct a neural network endowed with memory. Flip flop neural networks (FFNNs) function like recurrent neural networks (RNNs) and are therefore capable of processing temporal information. To validate the competence of FFNNs on sequential processing, we solve benchmark time series prediction and classification problems from different domains. Three datasets are used for time series prediction: (1) household power consumption, (2) flight passenger prediction and (3) stock price prediction. As an instance of time series classification, we select the indoor movement classification problem. FFNN performance is compared with that of RNNs consisting of long short-term memory (LSTM) units. On all the problems, the FFNNs show either superior or near-equal performance compared to the LSTMs. Flip flops could also potentially be used for harder sequential problems, such as action recognition and video understanding.
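The bi-stable latching the abstract refers to can be illustrated with a minimal sketch. This is an assumption based on the classic JK flip flop update rule, not the paper's exact neuron definition: a continuous relaxation of the JK logic, where set (`j`) and reset (`k`) signals in [0, 1] drive a state `q` that holds its value when both inputs are off.

```python
def jk_flipflop(q: bool, j: bool, k: bool) -> bool:
    """Classic digital JK flip flop: set on J, reset on K, toggle on both."""
    return (j and not q) or (not k and q)

def neural_flipflop(q: float, j: float, k: float) -> float:
    """Hypothetical continuous relaxation of the JK update (inputs in [0, 1]).

    When j and k come from learned sigmoid gates, the fixed points at
    q = 0 and q = 1 give the bi-stable dynamics that let the unit latch
    (memorize) a value across time steps, much like an RNN cell state.
    """
    return j * (1.0 - q) + (1.0 - k) * q

# Latch behaviour: set the state once, then hold it with j = k = 0.
q = 0.0
q = neural_flipflop(q, j=1.0, k=0.0)      # set: q becomes 1.0
for _ in range(5):
    q = neural_flipflop(q, j=0.0, k=0.0)  # hold: q stays 1.0
```

In a full FFNN, `j` and `k` would presumably be affine functions of the inputs and recurrent state passed through a squashing nonlinearity, and many such units would be stacked and trained by backpropagation through time, as with LSTMs.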
Volume
749 LNEE