Abstract
This paper proposes a technique for identifying nonlinear dynamical systems with time delay. A sparse optimization algorithm is extended to time-delayed nonlinear systems. The proposed algorithm combines cross-validation from machine learning for automatic model selection with an algebraic operation that preprocesses the signals to filter noise and remove the dependence on initial conditions. Bootstrap resampling is further integrated with the sparse regression to obtain the statistical properties of the estimates, and the time delay is parameterized by a Taylor expansion. The proposed algorithm is computationally efficient and robust to noise. A nonlinear Duffing oscillator is simulated to demonstrate the efficiency and accuracy of the proposed technique, and an experimental example of a nonlinear rotary flexible joint further validates the proposed method.
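As a rough illustration of the pipeline the abstract describes, the sketch below identifies a delay-free Duffing oscillator x'' + d x' + a x + b x^3 = gamma cos(omega t) by sequentially thresholded least squares, a standard sparse-regression scheme, and attaches bootstrap resampling for coefficient statistics. All parameter values, library terms, thresholds, and the bootstrap count are illustrative assumptions, not the paper's settings; the paper's algebraic preprocessing and the Taylor-expansion parameterization of the delayed state, x(t - tau) ≈ x(t) - tau x'(t) + (tau^2/2) x''(t), are omitted for brevity.

```python
# A minimal sketch, assuming a delay-free Duffing oscillator
#   x'' + d*x' + a*x + b*x^3 = gamma*cos(omega*t)
# and sequentially thresholded least squares as the sparse-regression step.
# Parameter values, library terms, threshold, and bootstrap count are
# illustrative choices, not the paper's settings.
import numpy as np
from scipy.integrate import solve_ivp

a, b, d, gamma, omega = -1.0, 1.0, 0.3, 0.5, 1.2

def duffing(t, s):
    x, v = s
    return [v, -d*v - a*x - b*x**3 + gamma*np.cos(omega*t)]

t = np.linspace(0.0, 40.0, 4001)
sol = solve_ivp(duffing, (t[0], t[-1]), [1.0, 0.0], t_eval=t, rtol=1e-9)
x, v = sol.y
acc = np.gradient(v, t)  # regression target: numerically differentiated x''

# Candidate library Theta(x, v, t); the true model uses x, v, x^3, cos(wt).
Theta = np.column_stack([x, v, x**3, x**2 * v, np.cos(omega * t)])
names = ["x", "v", "x^3", "x^2*v", "cos(wt)"]

def stlsq(A, y, thresh=0.1, iters=10):
    """Sequentially thresholded least squares: fit, zero small terms, refit."""
    xi = np.linalg.lstsq(A, y, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(xi) < thresh
        xi[small] = 0.0
        if (~small).any():
            xi[~small] = np.linalg.lstsq(A[:, ~small], y, rcond=None)[0]
    return xi

xi = stlsq(Theta, acc)
print(dict(zip(names, np.round(xi, 3))))  # roughly x: 1.0, v: -0.3, x^3: -1.0, cos(wt): 0.5

# Bootstrap resampling of rows to estimate the spread of each coefficient.
rng = np.random.default_rng(0)
boot = np.array([stlsq(Theta[idx], acc[idx])
                 for idx in (rng.integers(0, len(t), len(t)) for _ in range(200))])
print("bootstrap std:", np.round(boot.std(axis=0), 4))
```

In the paper, cross-validation would select the threshold (and hence model complexity) automatically; here it is fixed for brevity.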
Publisher
Springer Science and Business Media LLC
Subject
Electrical and Electronic Engineering, Control and Optimization, Mechanical Engineering, Modeling and Simulation, Civil and Structural Engineering, Control and Systems Engineering
Cited by
17 articles.