Abstract
Deep nets are becoming larger and larger in practice, with no respect for (non-)factors that ought to limit growth, including the so-called curse of dimensionality (CoD). Donoho suggested that dimensionality can be a blessing as well as a curse. Current practice in industry is well ahead of theory, but there are some recent theoretical results from Weinan E's group suggesting that errors may be independent of the dimension $d$. Current practice suggests an even stronger conjecture: deep nets are not merely immune to CoD; they actually thrive on scale.
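A minimal numerical sketch of what "errors independent of the dimension $d$" can mean, using the Monte Carlo integration analogy often invoked in this literature (the analogy, integrand, and parameter choices here are illustrative assumptions, not the specific results of Weinan E's group): a grid-based quadrature rule needs a number of points exponential in $d$, whereas the Monte Carlo error rate $O(1/\sqrt{n})$ does not depend on $d$ at all.

```python
import random

def mc_integrate(f, d, n, seed=0):
    """Monte Carlo estimate of the integral of f over the unit cube [0,1]^d.

    The root-mean-square error scales as O(1/sqrt(n)) regardless of d --
    the classic example of a convergence rate that escapes the curse of
    dimensionality.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = [rng.random() for _ in range(d)]
        total += f(x)
    return total / n

# Illustrative integrand: the mean of the coordinates. Its exact integral
# over [0,1]^d is 0.5 for every d.
f = lambda x: sum(x) / len(x)

for d in (2, 100):
    est = mc_integrate(f, d, n=10_000)
    print(d, abs(est - 0.5))  # error stays small at the same n for both d
```

A grid with even two points per axis would need $2^{100}$ evaluations at $d=100$; the sample-based estimate above uses the same 10,000 points for $d=2$ and $d=100$.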
Publisher
Cambridge University Press (CUP)
Subject
Artificial Intelligence, Linguistics and Language, Language and Linguistics, Software
References: 74 articles.
1. Emerging trends: Deep nets for poets
2. Fant, G. (1973). Speech sounds and features.
3. The Future of Computational Linguistics: On Beyond Alchemy
4. Sanh, V., Debut, L., Chaumond, J. and Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108.
5. Emerging Trends: SOTA-Chasing
Cited by
2 articles.
1. Some Useful Things to Know When Combining IR and NLP: The Easy, the Hard and the Ugly;Proceedings of the 17th ACM International Conference on Web Search and Data Mining;2024-03-04
2. Some Useful Things to Know When Combining IR and NLP: the Easy, the Hard and the Ugly;Proceedings of the 32nd ACM International Conference on Information and Knowledge Management;2023-10-21