Affiliation:
1. George Washington University, Washington DC, USA
Abstract
The geometric topology of one point per event, written in the higher-dimensional μ-space of data (e.g., the 6 W's: who, where, when, what, how, and why), can help in the design of information acquisition (IA) systems. The measurement intensity of each W's sensor, or the number of words used to describe a specific W's attribute, gives the length of the corresponding vector dimension. N concurrent reports of the same event then become a distribution set of N points scattered over μ-space. To discover the statistically independent components, an unsupervised, unbiased Artificial Neural Network (ANN) methodology called Independent Component Analysis (ICA) can be used to reveal a new subspace called the feature space. The major and minor axes of this subspace correspond to highly precise and efficient combinations of the old attributes (e.g., 2-D feature domains consisting of "where-who-when" and "what-how-why" could be good choices for Internet search indices). Thus one realizes that the communication of an event is not just the address (where); who and when are equally important attributes. In principle, the number of new sensors can be reduced (e.g., from six W's to two features), provided that the new sensors are physically realizable. In the combined 6N-dimensional Γ-space, one point can represent all N concurrent measurements, and the flow of such points generates the event behavior in time. The time flow over the reduced 2N-dimensional feature space generates invariant features called knowledge. For surveillance against terrorists, legacy electrical power line communication (PLC) offers a useful relay for the last mile of mobile communications in a Surveillance Sensor Web (SSW) employing ANN: there is no need for "where" addressing for switching, because of smart coding and decoding of "who-when." After reviewing Auto-Regression (AR), we generalize AR to a supervised ANN implementation of Principal Component Analysis (PCA) (Appendix A), then proceed to an unsupervised-learning ANN for ICA (Appendix B).
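The whitening-then-rotation pipeline the abstract describes (PCA as a stepping stone to ICA over N concurrent reports) can be sketched numerically. The sources, mixing matrix, and sample count below are hypothetical stand-ins, not data from the paper; the ICA update is a standard FastICA fixed-point iteration with a tanh nonlinearity, used here only to illustrate how independent feature axes emerge from mixed attribute measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for N concurrent event reports in mu-space:
# two independent latent sources linearly mixed into observed attributes.
n = 2000
t = np.linspace(0, 8 * np.pi, n)
S = np.c_[np.sign(np.sin(2 * t)), rng.uniform(-1, 1, n)]  # independent sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])                    # unknown mixing
X = S @ A.T                                               # observed reports

# PCA/whitening step: zero-mean, unit-covariance data.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / n
d, E = np.linalg.eigh(cov)
Z = Xc @ E @ np.diag(d ** -0.5) @ E.T                     # whitened data

# FastICA (deflation) with tanh nonlinearity: find one unit vector at a
# time that maximizes non-Gaussianity of the projection.
W = np.zeros((2, 2))
for i in range(2):
    w = rng.normal(size=2)
    w /= np.linalg.norm(w)
    for _ in range(200):
        wx = Z @ w
        g, gp = np.tanh(wx), 1 - np.tanh(wx) ** 2
        w_new = (Z * g[:, None]).mean(axis=0) - gp.mean() * w
        w_new -= W[:i].T @ (W[:i] @ w_new)   # deflate against found rows
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1) < 1e-8
        w = w_new
        if converged:
            break
    W[i] = w

S_est = Z @ W.T  # recovered independent components (up to sign/order)
```

After whitening, the remaining ambiguity is an orthogonal rotation, which the ICA iteration resolves; the recovered components match the latent sources up to sign and permutation.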
This is possible non-statistically because the classical closed-system information theory (CIT) of maximum Shannon entropy S must be generalized to an open brain information theory (BIT) having non-zero energy exchange E, at the minimum Helmholtz free energy H = E − T₀S at isothermal equilibrium (T₀ = 37 °C). For such an open BIT system, we prove the Lyapunov convergence theorem. We compute the ICA features of image textures in order to measure the ICA classifier information content.
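One way the closed-to-open generalization can be made concrete is the standard variational step from maximum entropy to minimum free energy; the derivation below is a sketch of that step under the abstract's definitions (S the Shannon entropy, E the mean exchanged energy, T₀ the isothermal temperature), not the paper's own proof.

```latex
\[
  H \;=\; E - T_0 S
  \;=\; \sum_i p_i E_i \;+\; T_0 \sum_i p_i \ln p_i ,
\]
where $E_i$ is the energy exchanged in state $i$. Minimizing $H$ over $p$
subject to $\sum_i p_i = 1$ (Lagrange multiplier $\lambda$) gives
\[
  E_i + T_0(\ln p_i + 1) - \lambda = 0
  \;\Longrightarrow\;
  p_i = \frac{e^{-E_i/T_0}}{\sum_j e^{-E_j/T_0}} ,
\]
the Gibbs distribution; in the closed limit $E_i = \text{const}$ it reduces
to the uniform maximum-entropy solution of CIT. A learning dynamics along
which $dH/dt \le 0$ then admits $H$ as a Lyapunov function, which is the
role $H$ plays in the convergence theorem.
```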
Publisher
World Scientific Pub Co Pte Ltd
Cited by
5 articles.