Affiliation:
1. JST-ICORP Computational Brain Project and ATR Computational Neuroscience Laboratory, 2-2-2 Hikari-dai, Soraku-gun, Kyoto 619-0288, Japan
Abstract
It is known that any dichotomy of {−1, 1}^n can be learned (separated) by a higher-order neuron (polynomial function) with 2^n inputs (monomials). In general, fewer than 2^n monomials suffice to solve a given dichotomy. Despite efforts to develop algorithms that find solutions with fewer monomials, relatively few studies have investigated the maximum density Π(n), the minimum number of monomials that suffices to separate an arbitrary dichotomy of {−1, 1}^n. This article derives a theoretical upper bound for this quantity that supersedes previously known bounds. The main theorem states that for any binary classification problem in {−1, 1}^n (n > 1), one can always find a polynomial function solution with 2^n − 2^n/4 or fewer monomials. In particular, any dichotomy of {−1, 1}^n can be learned by a higher-order neuron with a fan-in of 2^n − 2^n/4 or less. With this result, a deterministic ratio bound independent of n is established for the first time: Π(n)/2^n ≤ 0.75. The main theorem is constructive, so it yields a deterministic algorithm that achieves the theoretical bound. The study provides the basic mathematical tools and forms the basis for further analyses that may have implications for neural computation mechanisms employed in the cerebral cortex.
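The abstract's opening fact, that any dichotomy of {−1, 1}^n can be separated by a polynomial with 2^n monomials, follows from the Walsh-Hadamard (Fourier) expansion of Boolean functions. The sketch below only illustrates that background fact; it is not the paper's construction for reaching 2^n − 2^n/4 monomials, and the function names are my own.

```python
import itertools
import numpy as np

def fourier_coefficients(f_values, n):
    """Compute the 2^n monomial coefficients of a dichotomy
    f: {-1, 1}^n -> {-1, 1}, given as a dict keyed by input tuples."""
    points = list(itertools.product([-1, 1], repeat=n))
    coeffs = {}
    # One coefficient per subset S of {0, ..., n-1}; the monomial is prod_{i in S} x_i.
    for k in range(n + 1):
        for S in itertools.combinations(range(n), k):
            coeffs[S] = sum(f_values[x] * np.prod([x[i] for i in S])
                            for x in points) / 2 ** n
    return coeffs

def evaluate(coeffs, x):
    """Evaluate the polynomial at x; its sign reproduces the dichotomy."""
    return sum(c * np.prod([x[i] for i in S]) for S, c in coeffs.items())

# Example: a random dichotomy of {-1, 1}^3 is recovered exactly by the
# sign of its 2^3 = 8 monomial expansion.
n = 3
points = list(itertools.product([-1, 1], repeat=n))
rng = np.random.default_rng(0)
f = {x: int(rng.choice([-1, 1])) for x in points}
coeffs = fourier_coefficients(f, n)
assert all(np.sign(evaluate(coeffs, x)) == f[x] for x in points)
```

The paper's contribution is that far fewer than all 2^n such monomials are ever needed: at most 2^n − 2^n/4 of them suffice for any dichotomy.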
Subject
Cognitive Neuroscience, Arts and Humanities (miscellaneous)
Cited by
8 articles.