Authors:
Xu Qing, Xuan Xiaohua (Michael)
Abstract
In this paper, we consider a class of nonlinear regression problems without assuming that the observations are independent and identically distributed. We formulate a corresponding mini-max problem for nonlinear regression and present a numerical algorithm for solving it. The algorithm can be applied to regression and machine learning problems and yields better results than traditional least-squares and machine learning methods.
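For orientation, a mini-max regression problem of the kind described in the abstract can be sketched as follows; the specific loss, model class, and family of distributions used by the authors are not given here, so the display below is an illustrative assumption rather than the paper's exact formulation:
\[
\hat{\theta} \in \arg\min_{\theta \in \Theta} \; \max_{P \in \mathcal{P}} \; \mathbb{E}_P\!\left[\big(Y - f(X;\theta)\big)^2\right],
\]
where \(f(\cdot\,;\theta)\) is the nonlinear regression function and \(\mathcal{P}\) is a family of candidate distributions that replaces the single i.i.d. data-generating law of classical least squares.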
Publisher
American Institute of Mathematical Sciences (AIMS)
Cited by
7 articles.