Multi-periodic neural coding for adaptive information transfer

Yongseok Yoo, O. Ozan Koyluoglu, Sriram Vishwanath, Ila Fiete

Research output: Contribution to journal › Article

1 Scopus citation

Abstract

Information processing in the presence of noise has been a key challenge across multiple disciplines, including computer science, communications, and neuroscience. Among noise-reduction mechanisms, the shift-map code represents an analog variable by its residues with respect to distinct moduli (chosen as geometric scalings of an integer). Motivated by the multi-periodic neural code in the entorhinal cortex, i.e., the coding mechanism of grid cells, this work extends the shift-map code by generalizing the choice of moduli. In particular, it is shown that using similarly sized moduli (for instance, evenly and closely spaced integers, which tend to have large co-prime factors) yields a code whose codewords are interleaved such that, when the decoder has side information about the source, error control improves significantly compared with the original shift-map code. This novel structure allows the system to adapt dynamically to the side information at the decoder, even if the encoder is not privy to it. A geometric interpretation of the proposed coding scheme and a method for finding such codes are detailed. As an extension, it is shown that this code also adapts to scenarios in which only a fraction of the codeword symbols is available at the decoder.
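To make the residue idea concrete, the following is a minimal, hypothetical sketch (the function names and the brute-force decoder are illustrative, not the paper's construction): a real-valued variable is encoded by its residues (phases) with respect to a set of similarly sized, pairwise co-prime moduli, and a decoder recovers the value by searching a candidate range for the best match to the received residues.

```python
def encode(x, moduli):
    """Residue (phase) representation of x with respect to each modulus."""
    return [x % m for m in moduli]

def circ_dist(a, b, m):
    """Circular distance between two phases on a ring of circumference m."""
    d = abs(a - b) % m
    return min(d, m - d)

def decode(residues, moduli, x_range, step=1.0):
    """Illustrative brute-force decoder: return the candidate in x_range
    whose residues best match the received ones, summing circular
    distances over all moduli."""
    best, best_err = None, float("inf")
    x = x_range[0]
    while x <= x_range[1]:
        err = sum(circ_dist(x % m, r, m) for r, m in zip(residues, moduli))
        if err < best_err:
            best, best_err = x, err
        x += step
    return best
```

For example, the moduli 7, 8, and 9 are evenly and closely spaced yet pairwise co-prime, so the residue tuples are distinct over a range of 7 × 8 × 9 = 504; `decode(encode(123.0, [7, 8, 9]), [7, 8, 9], (0, 200))` recovers 123. A decoder with side information would restrict `x_range` to a small neighborhood of the expected value, which is where the interleaved codeword structure described in the abstract pays off.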

Original language: English (US)
Pages (from-to): 37-53
Number of pages: 17
Journal: Theoretical Computer Science
Volume: 633
DOIs
State: Published - Jun 20 2016

Keywords

  • Grid cells
  • Multi-periodic neural codes
  • Shift-map codes
  • Side information
  • Entorhinal cortex

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science(all)
