Articles
Business Intelligence in Telemarketing Using SVM
Putu Agung Ananta Wijaya;
Komang Budiarta;
Made Sudarma
International Journal of Engineering and Emerging Technology Vol 2 No 1 (2017): January - June
Publisher : Doctorate Program of Engineering Science, Faculty of Engineering, Udayana University
Direct marketing provides an advantage in approaching consumers: the close communication it allows can change consumer behavior and reveal the needs of consumers accurately. However, the technique is time-consuming, since convincing a consumer to buy the offered product takes a long time. Business intelligence with a data mining approach to consumer data is therefore required; this process analyzes the potential of each consumer. In the decision support stage, the SVM method is used to predict whether a consumer will buy the product that has been offered. The business intelligence system that was built proved able to predict which consumers have the potential to buy. Tests show that the highest prediction accuracy is 89.5%, obtained with training data comprising 70% of the dataset.
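The setup described above can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the consumer features and synthetic labels are assumptions, and only the 70/30 train/test split matches the reported best configuration.

```python
# Hypothetical sketch: an SVM classifier trained on a 70/30 split of consumer
# data to predict whether a product offer is accepted. Features and data here
# are invented for illustration, not the paper's dataset.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Toy consumer features: e.g. [age, balance, previous contacts]
X = rng.normal(size=(200, 3))
# Toy label: bought (1) or not (0), loosely correlated with the features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

# 70% training data, as in the best-performing combination reported
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.7, random_state=0)

model = SVC(kernel="rbf")  # RBF kernel is a common default choice
model.fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))
print(f"test accuracy: {acc:.3f}")
```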
Middleware ETL with CDC based on Event Driven Programming
I Gede Adnyana;
Made Sudarma;
Wayan Gede Ariastina
International Journal of Engineering and Emerging Technology Vol 3 No 2 (2018): July - December
Publisher : Doctorate Program of Engineering Science, Faculty of Engineering, Udayana University
Achieving a real-time data warehouse is strongly influenced by the warehouse process known as Extract, Transform, Load (ETL). One way to optimize the ETL process is to process only the data that changes in the On-Line Transaction Processing (OLTP) system, a technique known as Change Data Capture (CDC), which is designed to maximize the efficiency of the ETL process. In this research, middleware was built as the place where the ETL process is carried out: transaction data from the OLTP system is captured and sent directly to the middleware for further processing. The method used to capture changes in the OLTP system is CDC based on event-driven programming, a technique that relies on events occurring in the OLTP system to capture data changes. Functional testing was done by simulating insert and update processes in a test application, namely an OLTP CRM system. The results obtained are: (1) CDC based on event-driven programming can capture data changes that occur in the OLTP CRM database; (2) the ETL process that loads data from the Normalized Data Store (NDS) into the data warehouse with a timestamp technique loads only the data that has changed; (3) an increase in the amount of data processed increases processing time; other factors that affect processing time are the execution plan and cache memory.
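The event-driven capture idea can be illustrated with a short sketch. This is not the paper's middleware: the table and handler names are assumptions, and the point is only that the OLTP layer emits an event on every insert/update, so the middleware receives changed rows instead of polling the source tables.

```python
# Minimal sketch of event-driven Change Data Capture: the OLTP layer fires
# registered handlers on each change, and the CDC middleware subscribes to
# those events to collect only the rows that changed.
from datetime import datetime, timezone

class OltpTable:
    """Toy OLTP table that fires registered handlers on each change."""
    def __init__(self, name):
        self.name = name
        self.rows = {}
        self.handlers = []

    def on_change(self, handler):
        self.handlers.append(handler)

    def _emit(self, op, key, row):
        event = {"table": self.name, "op": op, "key": key, "row": dict(row),
                 "ts": datetime.now(timezone.utc)}
        for handler in self.handlers:
            handler(event)

    def insert(self, key, row):
        self.rows[key] = dict(row)
        self._emit("insert", key, self.rows[key])

    def update(self, key, changes):
        self.rows[key].update(changes)
        self._emit("update", key, self.rows[key])

# Middleware side: capture only changed rows for the downstream ETL
captured = []
customers = OltpTable("crm_customers")
customers.on_change(captured.append)

customers.insert(1, {"name": "Ayu", "city": "Denpasar"})
customers.update(1, {"city": "Ubud"})
print(len(captured))  # both the insert and the update were captured
```

In the paper's terms, `captured` plays the role of the change stream handed to the middleware for the ETL steps that follow.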
Analysis Of Data Warehouse Design Using Powell Method
Hisyam Rahmawan Suharno;
Nyoman Gunantara;
Made Sudarma
International Journal of Engineering and Emerging Technology Vol 5 No 2 (2020): July - December
Publisher : Doctorate Program of Engineering Science, Faculty of Engineering, Udayana University
DOI: 10.24843/IJEET.2020.v05.i02.p03
With the evolution of the digital era, many industrial organizations and companies have begun to digitize in order to increase their business opportunities. Data is very useful in a company's business: if processed correctly, it can provide the variety of information the company needs to keep growing. Data is now digital, and many processing techniques can turn it into decision support; this processing is usually centered on a data warehouse. In running a business, owners must analyze a number of things so that the business keeps running and growing. One such business is a fabric business in Bali, CV Phalani Bali. CV Phalani Bali still needs a centralized system that integrates sales data from its online and offline stores, so a data warehouse is needed that can manage all of these data and turn them into the new information CV Phalani Bali requires. The data warehouse helps the owner of CV Phalani Bali with reporting and with the historical information of the business, managing historical data and providing strategic information to support evaluation and decision analysis at the executive level. One data warehouse design method, the Powell method, is therefore used. The Powell method focuses on the ETL (Extract, Transform, Load) process that produces a data warehouse ready to be processed by OLAP (Online Analytical Processing). The method is supported by Microsoft SQL Server Business Intelligence as the tool that designs and processes the sales data into a data warehouse, producing the information CV Phalani Bali needs to analyze and make decisions that advance the business.
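An ETL step of the kind the Powell method centers on can be sketched briefly. Here sqlite3 stands in for the Microsoft SQL Server tooling, and the table and column names are assumptions, not CV Phalani Bali's real schema; the sketch only shows raw sales rows being transformed and loaded into a fact/dimension structure ready for OLAP.

```python
# Illustrative ETL step: extract raw sales rows, transform them (derive the
# sale amount, resolve a dimension key), and load them into a small star
# schema. All names and values are invented for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Extract: raw sales rows as they might arrive from online/offline stores
raw_sales = [
    ("2020-01-05", "online",  "endek fabric", 2, 150000),
    ("2020-01-05", "offline", "endek fabric", 1, 150000),
]

# Load targets: one dimension table and one fact table
cur.execute("CREATE TABLE dim_channel (id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
cur.execute("""CREATE TABLE fact_sales (
    sale_date TEXT, channel_id INTEGER, product TEXT, qty INTEGER, amount INTEGER)""")

for sale_date, channel, product, qty, price in raw_sales:
    cur.execute("INSERT OR IGNORE INTO dim_channel (name) VALUES (?)", (channel,))
    channel_id = cur.execute(
        "SELECT id FROM dim_channel WHERE name = ?", (channel,)).fetchone()[0]
    # Transform: derive the total amount before loading into the fact table
    cur.execute("INSERT INTO fact_sales VALUES (?, ?, ?, ?, ?)",
                (sale_date, channel_id, product, qty, qty * price))

total = cur.execute("SELECT SUM(amount) FROM fact_sales").fetchone()[0]
print(total)  # combined revenue across channels
```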
Audit E-Signature Public Service Project Using Knowledge Quality Management
Gde Brahupadhya Subiksa;
Kadek Ary Budi Permana;
Made Sudarma
International Journal of Engineering and Emerging Technology Vol 2 No 2 (2017): July - December
Publisher : Doctorate Program of Engineering Science, Faculty of Engineering, Udayana University
The Management Information System Project (E-Signature) is a collaborative project between the Department of Investment and Integrated Services of Denpasar City and the Institute for Research and Community Service of Udayana University. The project commenced on 21 April 2017 and had to be completed by 21 August 2017; it is supervised and controlled directly by the Data Processing and Investment Information System Section. The system is planned to launch in January 2018. Before it is launched and used by permit applicants and related officials, an audit of the quality of its project management is needed. To measure that quality we used one knowledge area of the Project Management Body of Knowledge (PMBOK), namely Project Quality Management. We also used the Capability Maturity Model (CMM) to help define the maturity level of information systems management. With this we can assess the quality of project management of the E-Signature system at the Department of Investment and Integrated Services of Denpasar City and provide recommendations and information based on the results of the audit.
Systematic Review of Text Mining Application Using Apache UIMA
Purwania Ida Bagus Gede;
I Nyoman Satya Kumara;
Made Sudarma
International Journal of Engineering and Emerging Technology Vol 5 No 2 (2020): July - December
Publisher : Doctorate Program of Engineering Science, Faculty of Engineering, Udayana University
DOI: 10.24843/IJEET.2020.v05.i02.p08
Companies are often faced with data and information in the form of unstructured text. These unstructured data can be processed and extracted so that they benefit the company in its decision-making processes and in the strategies the company must carry out. Text mining is one solution to this problem. Text mining can be defined as the process of retrieving information from a collection of documents, and one of the most commonly used text mining tools is Apache UIMA. This study systematically reviews the literature on the implementation of text mining and Apache UIMA across several related databases, covering text mining itself, Apache UIMA, and journals on both. The journals were reduced using defined criteria, yielding 20 journals that discuss the implementation of text mining and Apache UIMA. Based on the analysis of these journals, text mining is most widely applied to classification, with Naive Bayes classifiers the most frequently used method; the average accuracy of the method exceeds 85%, which means it is very effective for classification. Apache UIMA, specifically, is most widely implemented in information extraction and NLP; its most frequently used component is the annotator engine, which is very effectively applied to information extraction.
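The classification setup the review finds most common, a Naive Bayes classifier over text features, can be sketched in a few lines. The tiny corpus and labels here are invented for illustration and are not from any of the reviewed studies.

```python
# Sketch of Naive Bayes text classification: bag-of-words features feeding a
# multinomial Naive Bayes model. Corpus and labels are toy examples.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_texts = ["great product fast delivery", "terrible service very slow",
               "excellent quality love it", "awful experience never again"]
train_labels = ["pos", "neg", "pos", "neg"]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(train_texts, train_labels)

pred = clf.predict(["fast delivery excellent quality"])[0]
print(pred)  # -> pos
```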
Mapping Achievement Patterns Based on CRISP-DM and Self-Organizing Maps (SOM) Methods
Santi Ika Murpratiwi;
A.A Ngurah Narendra;
Made Sudarma
International Journal of Engineering and Emerging Technology Vol 2 No 1 (2017): January - June
Publisher : Doctorate Program of Engineering Science, Faculty of Engineering, Udayana University
Successful regional development reflects good government policy. Local governments create many programs every period, but many of these programs stop without results. Government programs must therefore be evaluated to prevent their failure. One way to evaluate them is to collect supporting data, model the data, and analyze it. This research evaluates the achievement of RPJMD performance of Bali Province using data mining. The data mining process follows the CRISP-DM method and uses Self-Organizing Maps (SOM) to map the achievement patterns. The data used are RPJMD data for the 2013-2018 period, although only three years of data (2014-2016) are used in the data mining process, as a mid-term evaluation. The data are clustered into five clusters. The final results show that 78% of the assessment indicators in the RPJMD program are in an inconsistent position and 22% are in a consistent position; moreover, 84 assessment indicators have not reached their targets. This assessment mapping can guide Bali Province in catching up on RPJMD achievement and in preparing the next strategies to support the success of the RPJMD program during the RPJMD period.
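The SOM clustering step can be sketched in pure NumPy. This is a minimal, assumed setup rather than the paper's tooling: the indicator vectors are random toy data, the map is a one-dimensional row of five units (matching the five clusters reported), and the neighborhood function is omitted for brevity.

```python
# Minimal Self-Organizing Map sketch: indicator vectors are mapped onto a
# small set of units so that similar achievement patterns land on the same
# unit. Data, map size, and training schedule are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
data = rng.random((60, 4))        # 60 toy indicators, 4 features each
n_units = 5                       # five units -> up to five clusters
weights = rng.random((n_units, 4))

def best_matching_unit(w, x):
    """Index of the unit whose weight vector is closest to sample x."""
    return int(np.argmin(np.linalg.norm(w - x, axis=1)))

# Online training: pull the winning unit toward each sample, with a learning
# rate that decays over epochs (neighborhood updates omitted for brevity)
for epoch in range(50):
    lr = 0.5 * (1 - epoch / 50)
    for x in data:
        bmu = best_matching_unit(weights, x)
        weights[bmu] += lr * (x - weights[bmu])

clusters = np.array([best_matching_unit(weights, x) for x in data])
print(np.bincount(clusters, minlength=n_units))  # size of each cluster
```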
Designing a Virtual Data Warehouse in Supporting Sales Information Needs (Case Study: National Scale Building Material Store X)
Andrew Sumichan;
I Made Gede Yudiana;
Muhammad Ridwan Satrio;
I Made Sudarma
International Journal of Engineering and Emerging Technology Vol 4 No 1 (2019): January - June
Publisher : Doctorate Program of Engineering Science, Faculty of Engineering, Udayana University
National-scale building material stores with branches in various regions naturally want their business to run smoothly. This must be supported by good sales activities at the central store and at the branch stores spread across various places. However, without a way to collect sales data from branch stores quickly, responses to sales trends are slow, both for goods whose sales are rising and for goods whose sales have dropped. This can reduce competitiveness against similar national-scale businesses. A virtual data warehouse can help collect scattered operational data: in this case, sales data located at branch stores in various locations are collected into a data warehouse stored on a cloud server so that they can be processed for strategic decisions. A virtual data warehouse is a data warehouse that connects operational databases regardless of where each database is located and regardless of its format, so that they appear as if in one place and in a consistent format. This study produces a virtual data warehouse structure designed with the Kimball nine-step method, resulting in a data warehouse schema modeled as a star schema. It explains the design of a virtual data warehouse that facilitates the analysis of sales data at national-scale building material stores with branches in various places.
Decision Support System of the Employees Acceptance using Analytical Hierarchy Process (AHP) and Multi Factor Evaluation Process (MFEP)
M. Azman Maricar;
Wahyudin Wahyudin;
Made Sudarma
International Journal of Engineering and Emerging Technology Vol 1 No 1 (2016): July - December
Publisher : Doctorate Program of Engineering Science, Faculty of Engineering, Udayana University
Every company needs new employees who are young and have potential. A management trainee program is intended to find superior candidates according to the company's assessment; the candidates who pass the final selection are given training and education about management in general, and the program is expected to create a new generation with strong leadership. A decision support system is used to help decision makers determine which candidates are eligible to participate in the management trainee education and training program: of five candidates, only two are accepted. The criteria and their specified weights are interview (0.558), written test (0.122), psychological test (0.263), and health test (0.057). The methods used are the Multi Factor Evaluation Process (MFEP) and the Analytical Hierarchy Process (AHP). Both methods give the same result: candidates 2 and 4 are accepted from the selection process. In the consistency ratio calculation, no criterion has a value above 0.1 (the threshold specified by Saaty).
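The MFEP step, a weighted sum of factor scores per candidate, can be sketched with the weights reported above. The individual candidate scores below are invented for illustration (the paper does not list them here); only the weights and the accepted candidates (2 and 4) come from the abstract.

```python
# MFEP sketch: each candidate's total is the sum of weight * factor score,
# and the two highest totals are accepted. Weights are from the paper;
# the per-candidate scores are hypothetical.
weights = {"interview": 0.558, "writing_test": 0.122,
           "psycho_test": 0.263, "health_test": 0.057}

# Hypothetical factor scores for five candidates (scale 1-5)
candidates = {
    1: {"interview": 3, "writing_test": 4, "psycho_test": 3, "health_test": 4},
    2: {"interview": 5, "writing_test": 3, "psycho_test": 4, "health_test": 4},
    3: {"interview": 2, "writing_test": 5, "psycho_test": 3, "health_test": 5},
    4: {"interview": 4, "writing_test": 4, "psycho_test": 5, "health_test": 3},
    5: {"interview": 3, "writing_test": 2, "psycho_test": 3, "health_test": 4},
}

def mfep_total(scores):
    """Weighted evaluation: sum of weight * factor score."""
    return sum(weights[f] * s for f, s in scores.items())

totals = {c: round(mfep_total(s), 3) for c, s in candidates.items()}
accepted = sorted(sorted(totals, key=totals.get, reverse=True)[:2])
print(accepted)  # -> [2, 4]
```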
Application of Neural Network Overview In Data Mining
Rifky Lana Rahardian;
Made Sudarma
International Journal of Engineering and Emerging Technology Vol 2 No 1 (2017): January - June
Publisher : Doctorate Program of Engineering Science, Faculty of Engineering, Udayana University
Data mining is the term used to describe the process of extracting value and information from databases. Four things are needed for effective data mining: high-quality data, the right data, an adequate number of examples, and the correct tools. To obtain valuable information, data mining algorithms must be applied to large databases. There are many complex algorithms in data mining; one of them, the neural network, plays an important role in data mining.
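A minimal illustration of a neural network learning a pattern from data can be given with a single-layer perceptron in pure NumPy. The dataset and architecture are toy assumptions, not from the article, which surveys neural networks in data mining generally.

```python
# Single-layer perceptron sketch: learn a linearly separable rule from data
# using the classic perceptron update. Data and settings are illustrative.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # linearly separable target

w = np.zeros(2)
b = 0.0
lr = 0.1
for _ in range(20):                          # training epochs
    for xi, yi in zip(X, y):
        pred = float(w @ xi + b > 0)
        w += lr * (yi - pred) * xi           # perceptron update rule
        b += lr * (yi - pred)

acc = float(np.mean((X @ w + b > 0).astype(float) == y))
print(f"training accuracy: {acc:.2f}")
```

On separable data like this, the perceptron converges toward the separating line, which is the simplest case of the role neural networks play in mining patterns from data.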
Audit Information System Development using COBIT 5 Framework (Case Study: STMIK STIKOM Bali)
Komang Budiarta;
Adi Panca Saputra Iskandar;
Made Sudarma
International Journal of Engineering and Emerging Technology Vol 1 No 1 (2016): July - December
Publisher : Doctorate Program of Engineering Science, Faculty of Engineering, Udayana University
Information technology (IT) is a very important part of a company or institution and an investment that can make the company or institution better. Companies and institutions, STMIK STIKOM Bali among them, use information technology to support their strategic plans in attaining their vision, mission, and objectives. Information technology applied in a company or institution needs to be governed, and governing it requires an audit to evaluate it and ensure compliance with a standard approach. Information technology at STMIK STIKOM Bali requires an audit to evaluate it, assess its capabilities, and produce recommendations for managing it better. The COBIT 5 framework is used for the audit, focusing on IT service delivery in accordance with business needs. The audit using COBIT 5 across 17 processes yields a capability level of 2.66, which indicates that the implementation of IT services is already established and has a standard.