Affiliations:
1. School of Information and Physical Sciences, The University of Newcastle, Callaghan, NSW 2308, Australia
2. ResTech Pty Ltd., CE Building, Design Drive, Callaghan, NSW 2308, Australia
3. Bill & Melinda Gates Center, University of Washington, 3800 E Stevens Way NE, Seattle, WA 98195, USA
4. California Institute of Technology, 1200 E California Blvd., M/C 221-C1, Pasadena, CA 91106, USA
Abstract
In the field of Artificial Intelligence (AI) and Machine Learning (ML), a common objective is the approximation of an unknown target function y = f(x) from a limited set of instances S = {(x^(i), y^(i))}, where x^(i) ∈ D and D is the domain of interest. We refer to S as the training set, and we aim to identify a low-complexity mathematical model that approximates the target function well on new instances x. The model's generalization ability is therefore evaluated on a separate set T = {x^(j)} ⊂ D, where T ≠ S and frequently T ∩ S = ∅. However, certain applications require accurate approximation not only within the original domain D but also in an extended domain D′ that encompasses D. This becomes particularly relevant in scenarios involving the design of new structures, where minimizing approximation errors is crucial. For example, when developing new materials through data-driven approaches, the AI/ML system can serve as a surrogate function and provide valuable insights to guide the design process; the learned model can then be employed to facilitate the design of new laboratory experiments. In this paper, we propose a method for multivariate regression based on the iterative fitting of a continued fraction that incorporates additive spline models. We compare the performance of our method with established techniques, including AdaBoost, Kernel Ridge, Linear Regression, Lasso Lars, Linear Support Vector Regression, Multi-Layer Perceptrons, Random Forest, Stochastic Gradient Descent, and XGBoost. To evaluate these methods, we focus on an important problem in the field: predicting the critical temperature of superconductors from their physical-chemical characteristics.
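To make the idea in the abstract concrete, below is a minimal, heavily simplified sketch of fitting a truncated continued fraction regression model, f(x) ≈ a0(x) + 1/(a1(x) + 1/(a2(x) + ...)), where each term is an additive spline model fitted level by level to the reciprocal of the previous level's residual. This is not the authors' algorithm or code: the depth, the SplineTransformer/Ridge choice for the additive spline terms, the residual clipping, and the synthetic example data are all illustrative assumptions.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer

def fit_continued_fraction(X, y, depth=3, eps=1e-3):
    # Greedy, level-by-level fit of f(x) ~ a0(x) + 1/(a1(x) + 1/(a2(x) + ...)).
    # Each term a_d is an additive spline model (per-feature splines + ridge).
    terms, target = [], y.astype(float)
    for _ in range(depth):
        term = make_pipeline(SplineTransformer(degree=3, n_knots=8), Ridge(alpha=1.0))
        term.fit(X, target)
        terms.append(term)
        residual = target - term.predict(X)
        # The next level models the reciprocal of the residual; clip small
        # residuals to avoid numerical blow-ups (an ad hoc safeguard).
        residual = np.where(np.abs(residual) < eps, eps, residual)
        target = 1.0 / residual
    return terms

def predict_continued_fraction(terms, X):
    # Evaluate the truncated continued fraction from the deepest level upward.
    value = terms[-1].predict(X)
    for term in reversed(terms[:-1]):
        value = term.predict(X) + 1.0 / value
    return value

# Illustrative usage on synthetic data (not the superconductivity dataset).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(300, 4))
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=300)
terms = fit_continued_fraction(X, y)
print(np.mean((y - predict_continued_fraction(terms, X)) ** 2))

Note that the reciprocal targets at deeper levels become ill-conditioned wherever a residual approaches zero; any practical continued fraction fitting scheme must address this, and the naive clipping above is only a placeholder.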
Subject
Computational Mathematics, Computational Theory and Mathematics, Numerical Analysis, Theoretical Computer Science
Cited by: 3 articles.