Authors
Skurihin Alexei N., Surkan Alvin J.
Abstract
Neural networks trained by backpropagation are designed and described in the language J, an APL derivative with powerful function-encapsulation features. Both J [4,6,7] and APL [5] help to identify and isolate the parallelism inherent in network training algorithms. Non-critical details of data input and derived output are de-emphasized by relegating those functions to callable, stand-alone modules; such input and output modules can be isolated and customized individually to manage communication with arbitrary external storage systems. The central objective of this research is the design and precise description of a neural network training kernel. Such kernel designs are valuable for producing efficient, reusable code and for facilitating the transfer of neural network technology from developers to users.
Publisher
Association for Computing Machinery (ACM)
References (11 articles)
1. Function arrays
2. Gerunds and representations
3. Frey R.J. "Object Oriented Extensions to APL and J" Vector Vol.9 No.2 1992
4. APL\?
5. Iverson K.E. "A Personal View of APL" IBM Systems Journal Vol.30 No.4 1991
Cited by
3 articles.