Arriving at Results by Comparing the Traditional Approach with My Approach to Deriving Function Points for ETL Operations

Author:

Phanindra A. Rakesh, Dr. V. B. Narasimha

Abstract

It is hard to estimate how much data must be loaded into the data warehouse when the entire history of the transaction system is migrated, especially when the transfer process can take weeks or even months. When estimating a large initial load, the ETL system must nevertheless be broken down into its three independent stages: extracting data from the source systems, transforming the data into the dimensional model, and loading the data warehouse. Timing estimates begin with the extraction process. Surprisingly, data extraction from the source system may take up the majority of the ETL procedure. Online transaction processing (OLTP) systems are simply not built to return the massive data sets demanded by the data warehouse's historic load, which extracts a tremendous quantity of data in a single query. The daily incremental loads are very different from the one-time historic load that brings the warehouse to life. In either case, populating fact tables requires data to be pulled in ways that transaction systems are not designed for, and ETL extraction procedures frequently call for time-consuming techniques such as views, cursors, stored procedures, and correlated subqueries. It is essential to anticipate how long an extract will take before it begins, yet calculating the extract-time estimate is challenging. Because of the hardware mismatch between test and production servers, estimates based on executing the ETL operations in a test environment may be greatly distorted. On some projects an extract task would run continuously until it eventually failed, be restarted, and run again until it failed once more; days or even weeks passed without producing anything. To overcome the challenges of working with such large volumes, the extract process must be divided into two simpler steps, the first of which is query response time: the interval between when the query is issued and when the data starts to be returned.
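The split described above can be sketched numerically. The following is an illustrative sketch, not taken from the paper: it treats total extract time as query response time (time until the first row arrives) plus a streaming phase at a measured throughput. The second component, row-transfer throughput, is an assumption introduced here to complete the two-step split, and all figures are hypothetical placeholders.

```python
def estimate_extract_seconds(response_time_s: float,
                             total_rows: int,
                             rows_per_second: float) -> float:
    """Total extract time = time to first row + time to stream all rows."""
    if rows_per_second <= 0:
        raise ValueError("throughput must be positive")
    return response_time_s + total_rows / rows_per_second

# Hypothetical historic load: 50 million rows, 120 s until the first row,
# streaming at 25,000 rows/s.
total = estimate_extract_seconds(120.0, 50_000_000, 25_000.0)
print(f"{total / 3600:.1f} hours")  # → 0.6 hours
```

Measuring the two components separately (response time with a `LIMIT 1`-style probe, throughput with a sample extract) avoids running the full multi-week extract just to obtain an estimate.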
It is pertinent that effort estimation for ETL operations in Data Mart and data warehouse (DWH) projects be expressed in terms of Function Points, which is a scientific approach. In the previous paper, I discussed the General System Characteristics used to arrive at the Value Adjustment Factor. In this paper, I present the results. I compared my findings with conventional FPA on industrial projects in order to evaluate Function Point Analysis's suitability for Data Mart projects. In this section I outline the strategy, implementation, and analysis of outcomes of this validation.
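The Value Adjustment Factor step mentioned above follows the standard IFPUG procedure: 14 General System Characteristics, each rated 0 to 5, are summed into a Total Degree of Influence (TDI), and VAF = 0.65 + 0.01 × TDI scales the unadjusted function points. This is a hedged sketch of that standard formula; the sample ratings are hypothetical, not from the paper's projects.

```python
def value_adjustment_factor(gsc_ratings: list[int]) -> float:
    """IFPUG VAF: 0.65 + 0.01 * TDI, where TDI is the sum of 14 GSC ratings."""
    if len(gsc_ratings) != 14 or any(not 0 <= r <= 5 for r in gsc_ratings):
        raise ValueError("expected 14 ratings, each between 0 and 5")
    return 0.65 + 0.01 * sum(gsc_ratings)

def adjusted_function_points(ufp: float, gsc_ratings: list[int]) -> float:
    """Adjusted FP = unadjusted FP * VAF."""
    return ufp * value_adjustment_factor(gsc_ratings)

# Hypothetical ETL project: ratings sum to TDI = 42, so VAF = 1.07.
ratings = [3, 3, 4, 2, 3, 3, 4, 3, 3, 3, 3, 3, 2, 3]
print(f"{value_adjustment_factor(ratings):.2f}")  # → 1.07
```

Because each rating ranges from 0 to 5, the VAF is bounded between 0.65 and 1.35, so the adjustment can move the unadjusted count by at most ±35%.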

Publisher

Blue Eyes Intelligence Engineering and Sciences Publication - BEIESP

Subject

Management of Technology and Innovation, General Engineering
