Journal: APTIKOM Journal on Computer Science and Information Technologies (CSIT)

Extract Transform Load (ETL) Process in Distributed Database Academic Data Warehouse
Yulianto, Ardhian Agung
APTIKOM Journal on Computer Science and Information Technologies Vol 4 No 2 (2019): APTIKOM Journal on Computer Science and Information Technologies (CSIT)
Publisher : APTIKOM Publisher


Abstract

While a data warehouse is designed to support the decision-making function, the most time-consuming part of building one is the Extract Transform Load (ETL) process. In the case of an academic data warehouse whose data sources are the faculties' distributed databases, integration is not straightforward even though the faculties share a typical database schema. This paper presents an ETL process for a distributed-database academic data warehouse. Following the Data Flow Thread process in the data staging area, a deep analysis is performed to identify all tables in each data source, including content profiling. The cleaning, conforming, and data delivery steps then pour the different data sources into the data warehouse (DW). Since the DW is developed using Kimball's bottom-up multidimensional approach, we identify three types of extraction activity from the source tables: merge, merge-union, and union. The cleaning and conforming steps produce conformed dimensions based on data source analysis, refinement, and hierarchy structure. The final ETL step loads the data into integrated dimension and fact tables through the generation of surrogate keys. These processes run gradually over each distributed data source until all of them are incorporated. This approach to a distributed-database ETL process can generally be adopted in other industries, provided the designer has advance knowledge of the structure and content of the data sources.
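The union-style extraction and surrogate-key loading that the abstract mentions could be sketched roughly as follows. This is an illustrative assumption, not the paper's actual implementation: the table layout, the `nim` natural key, and the helper functions are all hypothetical stand-ins for identically structured tables in separate faculty databases.

```python
def union_extract(*sources):
    """Union extraction: stack rows from identically structured
    tables held in separate faculty databases."""
    rows = []
    for table in sources:
        rows.extend(table)
    return rows

def load_dimension(rows, natural_key):
    """Load a dimension table, assigning a warehouse-generated
    surrogate key to each distinct natural-key value."""
    dim, sk_map = [], {}
    for row in rows:
        nk = row[natural_key]
        if nk not in sk_map:
            sk_map[nk] = len(sk_map) + 1      # surrogate keys 1, 2, 3, ...
            dim.append({"sk": sk_map[nk], **row})
    return dim, sk_map

# Two faculties keep the same 'student' table in separate databases;
# student "101" appears in both and must map to a single dimension row.
faculty_a = [{"nim": "101", "name": "Ani"}]
faculty_b = [{"nim": "202", "name": "Budi"}, {"nim": "101", "name": "Ani"}]

staged = union_extract(faculty_a, faculty_b)
dim_student, sk_map = load_dimension(staged, "nim")
```

The key design point illustrated here is that the surrogate key is minted in the warehouse, not taken from any faculty's operational key, so rows arriving gradually from each distributed source resolve to the same dimension member.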