Affiliation:
1. Computer Science Department, Brigham Young University, Provo, Utah 84602, USA
Abstract
Most Artificial Neural Networks (ANNs) have a fixed topology during learning and often suffer from a number of shortcomings as a result. ANNs that use dynamic topologies have shown the ability to overcome many of these problems. Adaptive Self-Organizing Concurrent Systems (ASOCS) are a class of learning models with inherently dynamic topologies. This paper introduces Location-Independent Transformations (LITs) as a general strategy for implementing learning models that use dynamic topologies efficiently in parallel hardware. An LIT creates a set of location-independent nodes, where each node computes its part of the network output independently of other nodes, using only local information. This type of transformation allows efficient support for adding and deleting nodes dynamically during learning. In particular, this paper presents the Location-Independent ASOCS (LIA) model as an LIT for ASOCS Adaptive Algorithm 2. The description of LIA gives formal definitions for the LIA algorithms. Because LIA implements the basic ASOCS mechanisms, these definitions also provide a formal description of basic ASOCS mechanisms in general, not only of LIA.
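To make the location-independence idea concrete, the following is a minimal illustrative sketch, not taken from the paper: the class and method names (LocationIndependentNode, LITNetwork, partial_output) and the OR-style output combination are assumptions used only to show how nodes that rely solely on local information can be added or deleted without rewiring the rest of the network.

```python
# Hypothetical sketch of the "location-independent node" idea; names and
# the output-combination rule are illustrative assumptions, not the paper's
# actual LIA/ASOCS definitions.

class LocationIndependentNode:
    """A node that computes its share of the output from local state only."""

    def __init__(self, node_id, local_rule):
        self.node_id = node_id        # identity, not a fixed position
        self.local_rule = local_rule  # local decision function

    def partial_output(self, inputs):
        # Uses only this node's own rule and the shared input vector;
        # no reference to the state or location of any other node.
        return self.local_rule(inputs)


class LITNetwork:
    """Collection of location-independent nodes with a dynamic topology."""

    def __init__(self):
        self.nodes = {}

    def add_node(self, node):
        # Adding a node requires no re-indexing or rewiring of other nodes.
        self.nodes[node.node_id] = node

    def delete_node(self, node_id):
        # Deletion is equally local: remove the node and nothing else changes.
        self.nodes.pop(node_id, None)

    def output(self, inputs):
        # Combine independent partial outputs; a simple OR-style combination
        # is assumed here purely for illustration.
        return any(n.partial_output(inputs) for n in self.nodes.values())


if __name__ == "__main__":
    net = LITNetwork()
    net.add_node(LocationIndependentNode("n1", lambda x: x[0] and not x[1]))
    net.add_node(LocationIndependentNode("n2", lambda x: x[1]))
    print(net.output([True, False]))   # True
    net.delete_node("n2")
    print(net.output([False, True]))   # False
```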
Publisher
World Scientific Pub Co Pte Ltd
Subject
Computer Networks and Communications, General Medicine