Pseudo-orthogonalization of memory patterns for associative memory

Makito Oku*, Takaki Makino, Kazuyuki Aihara

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

5 Scopus citations

Abstract

A new method for improving the storage capacity of associative memory models on a neural network is proposed. The storage capacity of such a network grows in proportion to the network size for random patterns, but, in general, the capacity suffers from correlation among memory patterns. Numerous solutions to this problem have been proposed, but their high computational cost limits their scalability. In this paper, we propose a novel and simple solution that is locally computable without any iteration. Our method applies XNOR masking to the original memory patterns with random patterns, and the masked patterns and masks are concatenated. The resulting decorrelated patterns allow higher storage capacity at the cost of increased pattern length. Furthermore, the increase in the pattern length can be reduced through blockwise masking, at the cost of a small loss of capacity. Movie replay and image recognition are presented as examples to demonstrate the scalability of the proposed method.
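The core masking step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes bipolar {-1, +1} patterns, for which XNOR of two elements reduces to their element-wise product, and it omits the blockwise variant. The function name and interface are hypothetical.

```python
import numpy as np

def pseudo_orthogonalize(patterns, seed=None):
    """Mask each memory pattern with an independent random pattern,
    then concatenate the masked pattern with its mask.

    For bipolar {-1, +1} values, XNOR is element-wise multiplication,
    so masking is local and requires no iteration."""
    rng = np.random.default_rng(seed)
    patterns = np.asarray(patterns)
    masks = rng.choice([-1, 1], size=patterns.shape)
    masked = patterns * masks                     # XNOR in bipolar form
    # Concatenation doubles the pattern length but decorrelates patterns.
    return np.concatenate([masked, masks], axis=1)

# Example: maximally correlated inputs become decorrelated after masking.
p = np.ones((3, 100), dtype=int)                  # three identical patterns
q = pseudo_orthogonalize(p, seed=0)
print(q.shape)                                    # (3, 200): length doubles
```

Because each mask is stored alongside the masked pattern, the original pattern is recoverable by applying the same XNOR again (`masked * mask == pattern` in bipolar form), so no information is lost by the transformation.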

Original language: English
Article number: 6553073
Pages (from-to): 1877-1887
Number of pages: 11
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 24
Issue number: 11
DOIs
State: Published - 2013

Keywords

  • Artificial neural networks
  • XNOR
  • associative memory
  • image processing
  • pseudo-orthogonalization
  • storage capacity

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence
