Abstract
There is an ever-growing mismatch between the proliferation of data-intensive, power-hungry deep learning solutions in the machine learning (ML) community and the need for agile, portable solutions on resource-constrained devices, particularly for intelligence at the edge. In this paper, we present a fundamentally novel approach that couples data-driven intelligence with biologically inspired efficiency. The proposed Sparse Embodiment Neural-Statistical Architecture (SENSA) decomposes the learning task into two distinct phases: a training phase and a hardware embedment phase, in which prototypes are extracted from the trained network and used to construct a fast, sparse embodiment for hardware deployment at the edge. Specifically, we propose the Sparse Pulse Automata via Reproducing Kernel (SPARK) method, which first constructs a learning machine in the form of a dynamical system operating on energy-efficient spike or pulse trains, as commonly used in neuroscience and neuromorphic engineering, and then extracts a rule-based solution in the form of automata or lookup tables for rapid deployment on edge computing platforms. We propose to use the theoretically grounded, unifying framework of the Reproducing Kernel Hilbert Space (RKHS) to provide interpretable, nonlinear, and nonparametric solutions, in contrast to typical neural network approaches. In kernel methods, the explicit representation of the data is of secondary importance, allowing the same algorithm to be used for different data types without altering the learning rules. To showcase SPARK's capabilities, we carried out the first proof-of-concept demonstration on the task of isolated-word automatic speech recognition (ASR), or keyword spotting, benchmarked on the TI-46 digit corpus. Together, these energy-efficient and resource-conscious techniques will bring advanced machine learning solutions closer to the edge.
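Since the abstract gives no implementation details, the following is a minimal, hypothetical Python sketch of the two-phase idea it describes, under stated assumptions: Phase 1 represents labeled spike trains in an RKHS through a spike-train kernel (here the memoryless cross-intensity, or mCI, kernel is assumed), and Phase 2 extracts one prototype per class so that edge-time inference reduces to a few kernel evaluations. The function names (`mci_kernel`, `rkhs_dist2`, `classify`), the jittered timing-pattern toy data, and the medoid-based prototype extraction are all illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch of a two-phase kernel-on-spike-trains pipeline
# (assumed details, not the paper's actual SPARK implementation).
# Phase 1: compare labeled spike trains with a spike-train kernel in an RKHS.
# Phase 2: extract one prototype per class; edge-time inference is then a
# handful of kernel evaluations, which could in turn be compiled into
# automata or lookup tables on hardware.
import numpy as np

TAU = 3.0  # kernel time constant in ms (assumed)

def mci_kernel(s, t, tau=TAU):
    """Memoryless cross-intensity (mCI) spike-train kernel:
    k(s, t) = sum_{i,j} exp(-|s_i - t_j| / tau),
    where s and t are 1-D arrays of spike times (ms)."""
    if len(s) == 0 or len(t) == 0:
        return 0.0
    return np.exp(-np.abs(s[:, None] - t[None, :]) / tau).sum()

def rkhs_dist2(s, t):
    """Squared distance between two spike trains in the RKHS induced by the kernel."""
    return mci_kernel(s, s) - 2.0 * mci_kernel(s, t) + mci_kernel(t, t)

# --- Toy data: two "keywords", each a spike-timing pattern plus jitter. ---
rng = np.random.default_rng(0)
PATTERNS = {0: np.array([10.0, 30.0, 50.0, 70.0, 90.0]),
            1: np.array([20.0, 40.0, 60.0, 80.0, 100.0])}

def jittered(base, sigma=1.0):
    return np.sort(base + rng.normal(0.0, sigma, size=base.shape))

train_data = [(jittered(PATTERNS[y]), y) for y in PATTERNS for _ in range(10)]

# --- Phase 2: prototype extraction (the medoid of each class in the RKHS). ---
prototypes = {}
for label in PATTERNS:
    members = [s for s, y in train_data if y == label]
    cost = [sum(rkhs_dist2(s, t) for t in members) for s in members]
    prototypes[label] = members[int(np.argmin(cost))]

def classify(spike_train):
    """Edge-time inference: nearest prototype in RKHS distance."""
    return min(prototypes, key=lambda y: rkhs_dist2(spike_train, prototypes[y]))

print(classify(jittered(PATTERNS[0])))  # expected: 0
print(classify(jittered(PATTERNS[1])))  # expected: 1
```

On an actual edge target, the final nearest-prototype comparison could be further quantized, for example into a lookup table keyed on binned spike times, which is (speculatively) the spirit of the automata and lookup-table embodiment the abstract describes.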
Funder
Defense Advanced Research Projects Agency