Neural Network Identifiability for a Family of Sigmoidal Nonlinearities

Authors

Verner Vlačić, Helmut Bölcskei

Abstract

This paper addresses the following question of neural network identifiability: Does the input–output map realized by a feed-forward neural network with respect to a given nonlinearity uniquely specify the network architecture, weights, and biases? The existing literature on the subject (Sussmann in Neural Netw 5(4):589–593, 1992; Albertini et al. in Artificial neural networks for speech and vision, 1993; Fefferman in Rev Mat Iberoam 10(3):507–555, 1994) suggests that the answer should be yes, up to certain symmetries induced by the nonlinearity, and provided that the networks under consideration satisfy certain “genericity conditions.” The results in Sussmann (1992) and Albertini et al. (1993) apply to networks with a single hidden layer, and in Fefferman (1994) the networks need to be fully connected. In an effort to answer the identifiability question in greater generality, we derive necessary genericity conditions for the identifiability of neural networks of arbitrary depth and connectivity with an arbitrary nonlinearity. Moreover, we construct a family of nonlinearities for which these genericity conditions are minimal, i.e., both necessary and sufficient. This family is large enough to approximate many commonly encountered nonlinearities to within arbitrary precision in the uniform norm.
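
The symmetries referred to in the abstract can be made concrete for an odd nonlinearity such as tanh: permuting hidden neurons, or simultaneously flipping the signs of a neuron's incoming weights, bias, and outgoing weight, leaves the input–output map unchanged. The following minimal numerical sketch illustrates this for a single-hidden-layer network; the choice of tanh, the network size, and the random weights are illustrative assumptions and are not taken from the paper.

```python
# Sketch: two differently parameterized single-hidden-layer tanh networks
# realizing the same input-output map, obtained by permuting hidden neurons
# and flipping signs (valid because tanh is odd: tanh(-t) = -tanh(t)).
import numpy as np

def net(x, A, b, c, d):
    # x: (n,) inputs; A, b, c: (m,) hidden weights, biases, output weights; d: output bias
    return np.tanh(np.outer(x, A) + b) @ c + d

rng = np.random.default_rng(0)
A, b, c, d = rng.normal(size=3), rng.normal(size=3), rng.normal(size=3), 0.5

perm = [2, 0, 1]                      # permute the hidden neurons
signs = np.array([1.0, -1.0, -1.0])   # flip the sign of some neurons
A2, b2, c2 = signs * A[perm], signs * b[perm], signs * c[perm]

x = np.linspace(-3, 3, 101)
print(np.allclose(net(x, A, b, c, d), net(x, A2, b2, c2, d)))  # prints True
```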

Funder

ETH Zurich

Publisher

Springer Science and Business Media LLC

Subject

Computational Mathematics, General Mathematics, Analysis

References (21 articles)

1. Sussmann, H.J.: Uniqueness of the weights for minimal feedforward nets with a given input–output map. Neural Netw. 5(4), 589–593 (1992)

2. Albertini, F., Sontag, E.D., Maillot, V.: Uniqueness of weights for neural networks. In: Mammone, R.J. (ed.) Artificial Neural Networks for Speech and Vision, pp. 113–125. Chapman and Hall, London (1993)

3. Fefferman, C.: Reconstructing a neural net from its output. Rev. Mat. Iberoam. 10(3), 507–555 (1994)

4. LeCun, Y., Jackel, L.D., Bottou, L., Brunot, A., Cortes, C., Denker, J.S., Drucker, H., Guyon, I., Müller, U.A., Säckinger, E., Simard, P., Vapnik, V.: Comparison of learning algorithms for handwritten digit recognition. In: International Conference on Artificial Neural Networks, pp. 53–60 (1995)

5. Krizhevsky, A., Sutskever, I., Hinton, G.E.: ImageNet classification with deep convolutional neural networks. In: Advances in Neural Information Processing Systems 25, pp. 1097–1105. Curran Associates, Inc. (2012)

Cited by 5 articles.

1. Approximation error for neural network operators by an averaged modulus of smoothness. Journal of Approximation Theory (2023-10)

2. On the Space of Coefficients of a Feedforward Neural Network. 2023 International Joint Conference on Neural Networks (IJCNN) (2023-06-18)

3. Neural Network Independence Properties with Applications to Adaptive Control. 2022 IEEE 61st Conference on Decision and Control (CDC) (2022-12-06)

4. Metric entropy limits on recurrent neural network learning of linear dynamical systems. Applied and Computational Harmonic Analysis (2021-12)

5. Affine symmetries and neural network identifiability. Advances in Mathematics (2021-01)
