Causality and causal inference for engineers: Beyond correlation, regression, prediction and artificial intelligence

Author:

Naser, M. Z. 1,2 (ORCID)

Affiliation:

1. School of Civil & Environmental Engineering and Earth Sciences (SCEEES), Clemson University, Clemson, South Carolina, USA

2. Artificial Intelligence Research Institute for Science and Engineering (AIRISE), Clemson University, Clemson, South Carolina, USA

Abstract

In order to engineer new materials, structures, systems, and processes that address persistent challenges, engineers seek to tie causes to effects and to understand the effects of causes. Such a pursuit requires a causal investigation to uncover the underlying structure of the data generating process (DGP) governing a phenomenon. A causal approach derives causal models that engineers can adopt to infer the effects of interventions (and to explore possible counterfactuals). Yet, for the most part, we continue to design experiments in the hope of empirically observing engineered intervention(s). Such experiments are idealized, complex, and costly, and hence narrow in scope. In contrast, a causal investigation allows us to peek into the how and why of a DGP and provides the essential means to articulate a causal model that accurately describes the phenomenon at hand and better predicts the outcome of possible interventions. Adopting a causal approach in engineering is perhaps more warranted than ever, especially with the rise of big data and the adoption of artificial intelligence (AI), wherein AI models are naively presumed to describe causal ties. To bridge this knowledge gap, this primer presents the fundamental principles behind causal discovery, causal inference, and counterfactuals from an engineering perspective and contrasts them with those behind correlation, regression, and AI.

This article is categorized under:
Application Areas > Industry Specific Applications
Algorithmic Development > Causality Discovery
Application Areas > Science and Technology
Technologies > Machine Learning
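
The distinction the abstract draws between prediction and intervention can be made concrete with a small simulation. The sketch below is hypothetical and not taken from the article: it builds a toy linear DGP with a confounder Z, shows that regressing the outcome Y on the treatment X alone returns a slope inflated by the back-door path through Z, and then simulates the intervention do(X = x) to recover the true causal effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical linear data-generating process (DGP):
# confounder Z -> treatment X, confounder Z -> outcome Y, and X -> Y with true effect 2.0
Z = rng.normal(size=n)
X = 1.5 * Z + rng.normal(size=n)
Y = 2.0 * X + 3.0 * Z + rng.normal(size=n)

# What a purely predictive/correlational model "sees": the slope of Y on X alone
# mixes the causal effect of X with the confounded association through Z.
slope_obs = np.cov(X, Y)[0, 1] / np.var(X)

# Simulated intervention do(X = x): X is set externally (the Z -> X arrow is cut),
# while Y is regenerated from the same structural equation.
X_do = rng.normal(size=n)
Y_do = 2.0 * X_do + 3.0 * Z + rng.normal(size=n)
slope_do = np.cov(X_do, Y_do)[0, 1] / np.var(X_do)

print(f"observational slope  ~ {slope_obs:.2f} (biased by the confounder)")
print(f"interventional slope ~ {slope_do:.2f} (recovers the true effect 2.0)")
```

This is the sense in which a predictive model can fit observational data well and still mispredict the outcome of an engineered intervention.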

Publisher

Wiley

References (34 articles)

1. In reference to Figure 4, a question may arise as to how to distinguish a confounder from a collider. This task can be completed via the back-door and front-door adjustments. The back-door adjustment states that if (1) no node in Z is a descendant of X, and (2) every path between X and Y that begins with an arrow into X (known as a back-door path) is blocked by Z, then controlling for Z blocks the non-causal paths (a minimal numerical sketch is given after these notes). If the confounder is unobservable/theoretical, then the front-door adjustment can be applied. In this process, we add a new variable that we may assume is not caused directly by the confounder and then apply the back-door adjustment to estimate the effect of the new variable.

2. Other assumptions also exist, such as Gaussianity of the noise distribution, the availability of one or several experimental settings, linearity/nonlinearity, and acyclicity; see (Heinze-Deml et al., 2018).

3. For additional metrics for evaluating DAGs, please see (Nogueira et al., 2022; Shi et al., 2022).

4. For a more detailed discussion on causality from the intersection of philosophy, epistemology, and ontology, please refer to (Michotte, 2017; Salmon, 2003).

5. When the phenomenon of interest is time dependent, other approaches, such as Granger causality or Sims causality, can be examined, as in the second sketch after these notes. Please refer to (Granger, 1969; Sims, 1972).
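
As a companion to note 1, the following is a minimal numerical sketch of the back-door adjustment, assuming a linear DGP in which the confounder Z is observed and satisfies the back-door criterion; the variable names and coefficients are hypothetical and chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical DGP with an observed confounder Z: Z is not a descendant of X,
# and Z blocks the back-door path X <- Z -> Y.
Z = rng.normal(size=n)
X = 0.8 * Z + rng.normal(size=n)
Y = 1.0 * X + 2.0 * Z + rng.normal(size=n)   # true causal effect of X on Y is 1.0

# Naive estimate: regress Y on X only, leaving the back-door path open.
beta_naive = np.linalg.lstsq(np.column_stack([X, np.ones(n)]), Y, rcond=None)[0][0]

# Back-door adjustment: control for Z by adding it to the regression, which (under
# linearity) estimates E[Y | do(X)] by averaging E[Y | X, Z] over the distribution of Z.
beta_adjusted = np.linalg.lstsq(np.column_stack([X, Z, np.ones(n)]), Y, rcond=None)[0][0]

print(f"unadjusted estimate ~ {beta_naive:.2f}")
print(f"back-door adjusted  ~ {beta_adjusted:.2f} (close to the true effect 1.0)")
```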
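
Likewise, for note 5, the sketch below illustrates a Granger-causality check on a synthetic pair of time series in which x leads y by one step. The use of grangercausalitytests from statsmodels is an assumption made for illustration, not a routine prescribed by the article.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(2)
T = 500

# Hypothetical time series: x leads y by one step.
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * x[t - 1] + 0.3 * y[t - 1] + 0.5 * rng.normal()

# grangercausalitytests asks whether the SECOND column helps forecast the FIRST,
# i.e., whether lags of x improve the prediction of y beyond y's own lags.
data = np.column_stack([y, x])
results = grangercausalitytests(data, maxlag=2)

# F-test at lag 1: a small p-value indicates that x "Granger-causes" y.
f_stat, p_value, _, _ = results[1][0]["ssr_ftest"]
print(f"lag-1 F = {f_stat:.1f}, p = {p_value:.3g}")
```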
