FEA-AI and AI-AI: Two-Way Deepnets for Real-Time Computations for Both Forward and Inverse Mechanics Problems

Author:

Liu G. R. 1,2

Affiliation:

1. Department of Aerospace Engineering and Engineering Mechanics, University of Cincinnati, Cincinnati, OH 45221-0070, USA

2. Summer Part-Time Professor, School of Mechanical Engineering, Hebei University of Technology, P. R. China

Abstract

Recent breakthroughs in deep-learning algorithms have brought the dream of artificial intelligence (AI) close to reality. AI-based technologies are now being developed rapidly, including service and industrial robots and autonomous, self-driving vehicles. This work proposes Two-Way Deepnets (TW-Deepnets), trained using physics-law-based models such as the finite element method (FEM), smoothed FEM (S-FEM), and meshfree models, for real-time computation of both forward and inverse mechanics problems of materials and structures. First, the unique features of physics-law-based models and data-based models are analyzed in theory. The training characteristics of deepnets for forward problems governed by physics laws are then investigated, with an FEM (or S-FEM) model used as the trainer. The training convergence rates of such an FEM-AI model are examined in relation to the properties of the system matrix of the FEM model. Next, the training characteristics of deepnets for inverse problems are studied, with the forward FEM-trained deepnets used as the trainer for an AI model for inverse analyses. The roles of regularization techniques in overcoming the ill-posedness of inverse problems in deepnet structures for noisy data are then discussed. Finally, TW-Deepnets (FEM-AI and AI-AI models) are presented for real-time analyses of both forward and inverse problems of materials and structures with high-dimensional parameter spaces.
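The FEM-as-trainer stage described above can be sketched in miniature: a 1D bar FEM solves Ku = f for sampled material parameters, producing the (parameter, response) pairs that would form the training set for a forward FEM-AI deepnet. The bar geometry, load, sample counts, and function names here are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

def bar_fem_displacements(E, A=1.0, L=1.0, force=1.0):
    """Solve a 1D bar (fixed at x=0, point load at x=L) with per-element
    Young's moduli E (length n_elem). Returns the n_elem+1 nodal displacements."""
    n = len(E)
    le = L / n                       # uniform element length
    K = np.zeros((n + 1, n + 1))
    for i, Ei in enumerate(E):       # assemble the global stiffness matrix
        k = Ei * A / le
        K[i:i + 2, i:i + 2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])
    f = np.zeros(n + 1)
    f[-1] = force                    # point load at the free end
    u = np.zeros(n + 1)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])   # fixed BC at node 0 by reduction
    return u

# FEM as the "trainer": sample material parameters, compute responses, and
# collect (parameter, response) pairs as the training set for a forward deepnet.
rng = np.random.default_rng(0)
n_samples, n_elem = 1000, 4
X = rng.uniform(1.0, 10.0, size=(n_samples, n_elem))      # sampled moduli
Y = np.array([bar_fem_displacements(E)[1:] for E in X])   # free-node displacements
```

A deepnet fitted to (X, Y) would then replace the FEM solve at inference time, which is what makes real-time forward analysis possible.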
The major findings of this study are as follows: (1) Understanding the fundamental features of both data-based and physics-based methods is critical for creating novel, game-changing computational methods that take advantage of both types of methods. (2) The good properties of the FEM system matrix allow effective training of FEM-AI deepnets for forward mechanics problems. (3) Our new technique of training inverse deepnets using FEM-AI deepnets as a surrogate model offers an innovative means to effectively train deepnets for solving inverse mechanics problems. (4) TW-Deepnets are capable of real-time analysis of both forward and inverse problems of materials and structures with high-dimensional parameter spaces. (5) Such TW-Deepnets can be easily used by the masses: a transformative new concept of AI-enabled democratization of complicated computational technology in modeling and simulation.
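The role of regularization in the inverse (AI-AI) stage can be sketched with a linear stand-in for a trained forward surrogate: an ill-conditioned forward map amplifies measurement noise when inverted naively, while a Tikhonov-regularized inverse recovers the parameters stably. The matrix F, its singular-value spectrum, the noise level, and the regularization weight below are all hypothetical choices for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear forward surrogate u = F @ p, standing in for a trained
# FEM-AI forward deepnet. Fast-decaying singular values make the inverse ill-posed.
n_param, n_resp = 5, 8
U, _ = np.linalg.qr(rng.normal(size=(n_resp, n_param)))
V, _ = np.linalg.qr(rng.normal(size=(n_param, n_param)))
s = np.array([1.0, 0.5, 0.1, 1e-2, 1e-6])        # illustrative spectrum
F = U @ np.diag(s) @ V.T

p_true = rng.normal(size=n_param)
u_noisy = F @ p_true + 1e-2 * rng.normal(size=n_resp)   # noisy "measurement"

def tikhonov_inverse(F, u, lam):
    """Regularized inverse: p = argmin ||F p - u||^2 + lam ||p||^2."""
    n = F.shape[1]
    return np.linalg.solve(F.T @ F + lam * np.eye(n), F.T @ u)

p_naive = tikhonov_inverse(F, u_noisy, 0.0)    # unregularized: noise amplified
p_reg   = tikhonov_inverse(F, u_noisy, 1e-4)   # Tikhonov: damps unstable modes
```

In the TW-Deepnet setting, the analogous step is generating (response, parameter) pairs through the forward surrogate and training the inverse deepnet on them, with regularization built into the loss to handle noisy data.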

Publisher

World Scientific Pub Co Pte Lt

Subject

Computational Mathematics, Computer Science (miscellaneous)

Cited by 21 articles.

1. A Study on Affine Transformations and a Novel Universal Prediction Theory;International Journal of Computational Methods;2023-04-17

2. Different Types of Constitutive Parameters Red Blood Cell Membrane Based on Machine Learning and FEM;International Journal of Computational Methods;2022-12-28

3. Adaptive Learning Rate Residual Network Based on Physics-Informed for Solving Partial Differential Equations;International Journal of Computational Methods;2022-11-21

4. Neurons-Samples Theorem;International Journal of Computational Methods;2022-11-21

5. Numerical performances through artificial neural networks for solving the vector-borne disease with lifelong immunity;Computer Methods in Biomechanics and Biomedical Engineering;2022-11-14
