Contact Name
-
Contact Email
-
Phone
-
Journal Mail Official
-
Editorial Address
-
Location
Kota Yogyakarta,
Daerah Istimewa Yogyakarta,
INDONESIA
International Journal of Advances in Intelligent Informatics
ISSN : 2442-6571     EISSN : 2548-3161     DOI : 10.26555
Core Subject : Science
International Journal of Advances in Intelligent Informatics (IJAIN), e-ISSN 2442-6571, is a peer-reviewed open-access journal published three times a year in English. It provides scientists and engineers throughout the world a forum for the exchange and dissemination of theoretical and practice-oriented papers dealing with advances in intelligent informatics. All papers are refereed by two international reviewers; accepted papers are available online with free access, and there is no publication fee for authors.
Arjuna Subject : -
Articles 330 Documents
Android skin cancer detection and classification based on MobileNet v2 model Adi Wibowo; Cahyo Adhi Hartanto; Panji Wisnu Wirawan
International Journal of Advances in Intelligent Informatics Vol 6, No 2 (2020): July 2020
Publisher : Universitas Ahmad Dahlan

DOI: 10.26555/ijain.v6i2.492

Abstract

The latest developments in smartphone-based skin cancer diagnosis applications allow simple, portable melanoma risk assessment and diagnosis for early skin cancer detection. Due to the trade-off between time complexity and error rate when running a machine learning algorithm for image analysis on a smartphone, most skin cancer diagnosis apps execute the image analysis on a server. In this study, we investigate the performance of skin cancer image detection and classification on Android devices using the MobileNet v2 deep learning model. We compare performance across several aspects: object detection and classification method, computer- versus Android-based image analysis, image acquisition method, and parameter settings. The skin cancers actinic keratosis and melanoma are used to test the performance of the proposed method, measured by accuracy, sensitivity, specificity, and running time. Based on the experimental results, the best parameters for the MobileNet v2 model on Android using images from the smartphone camera produce 95% accuracy for object detection and 70% accuracy for classification. The performance of the Android app's object detection and classification model was feasible for skin cancer analysis. Android-based image analysis remains within a computing-time threshold that is convenient for the user and matches the computer's accuracy on high-quality images. These findings motivate the development of disease detection on Android using a smartphone camera, aiming at real-time detection and classification with high accuracy.
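The accuracy, sensitivity, and specificity measures used in the evaluation above can be sketched in a few lines of Python; the confusion counts below are hypothetical, for illustration only, and are not the paper's data:

```python
def detection_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Compute the evaluation metrics named in the abstract from confusion counts."""
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "sensitivity": tp / (tp + fn),   # true positive rate (lesions found)
        "specificity": tn / (tn + fp),   # true negative rate
    }

# Hypothetical counts for illustration only:
print(detection_metrics(tp=45, fp=3, tn=47, fn=5))
# accuracy 0.92, sensitivity 0.90, specificity 0.94
```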
The improvement of uncertainty measurements accuracy in sensor networks based on fuzzy dempster-shafer theory Ehsan Azimirad; Seyyed Reza Movahhed Ghodsinya
International Journal of Advances in Intelligent Informatics Vol 6, No 2 (2020): July 2020
Publisher : Universitas Ahmad Dahlan

DOI: 10.26555/ijain.v6i2.461

Abstract

Threat assessment is one of the most important components of combat management systems. However, uncertainty affects the input data of these systems, which is provided by several sensors in sensor networks. The literature offers several theories that state and model uncertainty in information; one of the newer methods is fuzzy Dempster-Shafer theory. In this paper, a model of uncertainty in an air defense system is presented based on fuzzy Dempster-Shafer theory to measure uncertainty and its accuracy. The model combines two concepts: fuzzy set theory and Dempster-Shafer theory. The input parameters to the sensors are fuzzy membership functions, and the basic probability assignment values are obtained from Dempster-Shafer theory. The combination of the two methods is therefore used to calculate uncertainty in the air defense system; the outputs of Dempster-Shafer theory are computed and the uncertainty diagrams presented. The advantage of combining the two theories is better modeling of uncertainties, which makes the output of the air defense system more reliable and accurate. In this method, the air defense system's total uncertainty is measured using the best uncertainty measure based on fuzzy Dempster-Shafer theory. The simulation results show that the new method increases the accuracy to 97%, higher than other theories, which significantly improves the computational accuracy of the air defense system in target threat assessment.
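The evidence-fusion step that Dempster-Shafer theory contributes can be sketched with Dempster's rule of combination; the hypothetical hostile/friendly hypotheses and mass values below are illustrative and are not the paper's sensor model:

```python
def combine_dempster(m1: dict, m2: dict) -> dict:
    """Dempster's rule of combination for two basic probability assignments
    whose focal elements are frozensets of hypotheses."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb          # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    k = 1.0 - conflict                       # normalization constant
    return {s: v / k for s, v in combined.items()}

# Hypothetical threat hypotheses for illustration:
H, F = frozenset({"hostile"}), frozenset({"friendly"})
m1 = {H: 0.6, F: 0.1, H | F: 0.3}           # evidence from sensor 1
m2 = {H: 0.5, F: 0.2, H | F: 0.3}           # evidence from sensor 2
print(combine_dempster(m1, m2))             # belief concentrates on "hostile"
```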
Mixture gaussian V2 based microscopic movement detection of human spermatozoa Ariyono Setiawan; I Gede Susrama Mas Diyasa; Moch Hatta; Eva Yulia Puspaningrum
International Journal of Advances in Intelligent Informatics Vol 6, No 2 (2020): July 2020
Publisher : Universitas Ahmad Dahlan

DOI: 10.26555/ijain.v6i2.507

Abstract

Healthy, high-quality sperm is the main requirement for a woman to become pregnant. Assessing sperm quality requires several examinations; one of them is a sperm analysis test that observes the movement of sperm objects under a microscope, traditionally counted manually. The first step in the analysis scheme is detecting and separating sperm objects. This research detects and counts sperm movements in video data. To detect moving sperm, background processing of the sperm video data is essential for the success of subsequent steps. This research aims to apply and compare several background subtraction algorithms for detecting and counting moving sperm in microscopic videos of seminal fluid, in order to identify a background subtraction algorithm suited to sperm detection and counting. The methodology begins with the acquisition of sperm video data, followed by preprocessing with a Gaussian filter, background subtraction, and morphological operations that produce foreground masks; the results are compared with ground-truth images of moving sperm to validate the detections of each background subtraction algorithm. The system was able to detect and count moving sperm. The test results show that the MoG (Mixture of Gaussians) V2 algorithm achieves an f-measure of 0.9449, extracts sperm shapes close to their original form, and is superior to the other methods. In conclusion, the sperm analysis process can be performed automatically and time-efficiently.
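As a minimal stand-in for the MoG-family background subtraction discussed above, the sketch below maintains a single running Gaussian per pixel (a simplification of the full mixture model); the 1-D "frame", learning rate, and threshold are hypothetical:

```python
def update_background(mean, var, frame, alpha=0.05, k=2.5):
    """Return (foreground_mask, new_mean, new_var) for one grayscale frame,
    using a single running Gaussian per pixel."""
    fg, new_mean, new_var = [], [], []
    for m, v, x in zip(mean, var, frame):
        d = x - m
        fg.append(abs(d) > k * (v ** 0.5))       # pixel deviates -> foreground
        new_mean.append(m + alpha * d)           # slow adaptation of the model
        new_var.append((1 - alpha) * v + alpha * d * d)
    return fg, new_mean, new_var

# Tiny 1-D "frame": a bright moving object over a dark, stable background.
mean, var = [10.0] * 5, [4.0] * 5
frame = [10, 11, 200, 10, 9]                     # pixel 2 holds the moving object
fg, mean, var = update_background(mean, var, frame)
print(fg)  # only pixel 2 is flagged as foreground
```

The real MoG V2 keeps several Gaussians per pixel and selects the background modes by weight, but the per-pixel update-and-threshold cycle is the same idea.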
Optimized COCOMO parameters using hybrid particle swarm optimization Noor Azura Zakaria; Amelia Ritahani Ismail; Nadzurah Zainal Abidin; Nur Hidayah Mohd Khalid; Afrujaan Yakath Ali
International Journal of Advances in Intelligent Informatics Vol 7, No 2 (2021): July 2021
Publisher : Universitas Ahmad Dahlan

DOI: 10.26555/ijain.v7i2.583

Abstract

Software effort and cost estimation are crucial parts of software project development, determining the budget, time, and resources needed to develop a software project. The success of software project development depends mainly on the accuracy of effort and cost estimation; a poor estimate worsens project management. Various software effort estimation models have been introduced to resolve this problem. The COnstructive COst MOdel (COCOMO) is a well-established software project estimation model; however, it lacks accuracy in effort and cost estimation, especially for current projects. Inaccuracy and complexity in the estimated effort make it difficult to develop software efficiently and effectively, directly affecting schedule and cost and producing uncertain estimates. In this paper, Particle Swarm Optimization (PSO) is proposed as a metaheuristic optimization method to hybridize with three traditional state-of-the-art techniques, Support Vector Machine (SVM), Linear Regression (LR), and Random Forest (RF), for optimizing the parameters of COCOMO models. The proposed approach is applied to the NASA software project dataset downloaded from the PROMISE repository. The proposed approach is compared with the three traditional algorithms, whose results confirm low accuracy before hybridization with PSO. Overall, the results show that PSO-SVM on the NASA software project dataset improves effort estimation accuracy and outperforms the other models.
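A minimal sketch of PSO tuning the two basic COCOMO parameters (effort = a * KLOC^b) against the mean magnitude of relative error; the project data, swarm size, and coefficients below are illustrative assumptions, not the paper's setup:

```python
import random

random.seed(0)

# Hypothetical project data: (KLOC, actual effort in person-months).
projects = [(10, 25.0), (50, 150.0), (100, 330.0), (20, 55.0)]

def mmre(a, b):
    """Mean magnitude of relative error of the estimate a * KLOC**b."""
    return sum(abs(e - a * k ** b) / e for k, e in projects) / len(projects)

def pso(n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal PSO over the two basic COCOMO parameters (a, b)."""
    pos = [[random.uniform(1, 4), random.uniform(0.8, 1.3)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=lambda p: mmre(*p))
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
                pos[i][d] = min(max(pos[i][d], -10.0), 10.0)  # keep search bounded
            if mmre(*pos[i]) < mmre(*pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest + [gbest], key=lambda p: mmre(*p))
    return gbest, mmre(*gbest)

params, err = pso()
print(params, err)   # fitted (a, b) and the remaining relative error
```

In the paper's hybrids, the fitness being minimized would come from the SVM/LR/RF estimator rather than this plain power law.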
Evolution strategies based coefficient of TSK fuzzy forecasting engine Nadia Roosmalita Sari; Wayan Firdaus Mahmudy; Aji Prasetya Wibawa
International Journal of Advances in Intelligent Informatics Vol 7, No 1 (2021): March 2021
Publisher : Universitas Ahmad Dahlan

DOI: 10.26555/ijain.v7i1.376

Abstract

Forecasting predicts future values from past and current data, most often by pattern analysis. A fuzzy Takagi-Sugeno-Kang (TSK) study can predict Indonesia's inflation rate, yet with too high an error. This study proposes an accuracy improvement based on Evolution Strategies (ES), an evolutionary algorithm that performs well on optimization problems. The ES algorithm is used to determine the best coefficient values in the consequents of the fuzzy rules. This research uses Bank Indonesia time-series data, as in the previous study. The ES algorithm uses a popSize test to determine the number of initial chromosomes producing the best solution for this problem; increasing popSize yields better fitness values due to ES's broader search area. The RMSE of ES-TSK is 0.637, which outperforms the baseline approach. This research shows that ES can reduce the repetitive experimentation caused by manually setting the fuzzy coefficients. The algorithm's complexity may increase computing time, but it yields higher performance.
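A (1+1)-ES, the simplest member of the Evolution Strategies family (the paper's population-based variant differs), can illustrate how a TSK rule's consequent coefficients might be tuned against RMSE; the linear toy data and step-size rule below are assumptions for illustration:

```python
import math
import random

random.seed(1)

# Toy (hypothetical) data: a noiseless linear target y = 2x + 1, standing in
# for a TSK rule's consequent y = c0 + c1*x whose coefficients ES must find.
data = [(x / 10, 2 * x / 10 + 1) for x in range(20)]

def rmse(c):
    return math.sqrt(sum((y - (c[0] + c[1] * x)) ** 2 for x, y in data) / len(data))

def one_plus_one_es(iters=2000, sigma=0.5):
    """(1+1)-ES: mutate the coefficient vector, keep the child if it is no worse,
    and adapt the mutation step size based on success."""
    parent = [random.uniform(-5, 5), random.uniform(-5, 5)]
    for _ in range(iters):
        child = [c + random.gauss(0, sigma) for c in parent]
        if rmse(child) <= rmse(parent):
            parent, sigma = child, sigma * 1.1   # success: widen the search
        else:
            sigma *= 0.98                        # failure: narrow it
    return parent, rmse(parent)

coeffs, err = one_plus_one_es()
print(coeffs, err)   # coefficients near (1, 2), error near 0
```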
A new family of kernels from the beta polynomial kernels with applications in density estimation Israel Uzuazor Siloko; Wilson Nwankwo; Edith Akpevwe Siloko
International Journal of Advances in Intelligent Informatics Vol 6, No 3 (2020): November 2020
Publisher : Universitas Ahmad Dahlan

DOI: 10.26555/ijain.v6i3.456

Abstract

One of the fundamental data analytics tools in statistical estimation is the non-parametric kernel method, which produces probability estimates. The method uses the observations to obtain useful statistical information that aids the practicing statistician in decision making and further statistical investigations. Kernel techniques primarily examine essential characteristics in a data set, and this research aims to introduce new kernel functions that can easily detect inherent properties in any given observations. Accurate application of the kernel estimator as a data analytics apparatus requires both the kernel function and the smoothing parameter that regulates the level of smoothness applied to the estimates. A plethora of kernel families and smoothing parameter selectors exist in the literature, but no single method is universally acceptable in all situations; hence, more kernel functions with smoothing parameter selectors continue to be proposed for density estimation. This article proposes a kernel family distinct from the beta polynomial kernel family, derived using exponential progression. The newly proposed kernel family was evaluated on simulated and real-life data. The outcomes clearly indicate that this kernel family competes favorably with other kernel families in density estimation. A further comparison of numerical results between the new family and the existing beta family, with the asymptotic mean integrated squared error (AMISE) as the criterion function, revealed that the new family outperformed the classical beta kernel family on both simulated and real data examples. The information obtained from the data analysis in this research can support decision making in an organization, especially when human and material resources are to be considered. Kernel functions are vital tools for data analysis and visualization; hence the newly proposed functions are valuable exploratory tools.
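The mechanics of kernel density estimation with a beta-family kernel can be sketched with the Epanechnikov kernel, the order-2 member of the beta polynomial family (the paper's new exponential-progression family is not reproduced here); the sample and bandwidth are illustrative:

```python
def epanechnikov(u: float) -> float:
    """Beta polynomial kernel of order 2 (Epanechnikov): K(u) = 0.75(1 - u^2) on [-1, 1]."""
    return 0.75 * (1 - u * u) if abs(u) <= 1 else 0.0

def kde(x: float, sample: list, h: float) -> float:
    """Kernel density estimate at x; h is the smoothing parameter (bandwidth)."""
    return sum(epanechnikov((x - xi) / h) for xi in sample) / (len(sample) * h)

# Hypothetical bimodal sample for illustration:
sample = [1.0, 1.2, 1.1, 3.0, 3.1]
print(kde(1.1, sample, h=0.5))   # high density inside a cluster
print(kde(5.0, sample, h=0.5))   # zero density far from all points
```

Swapping `epanechnikov` for another kernel and tuning `h` is exactly the design space the article explores.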
Identification of wood defect using pattern recognition technique Teo Hong Chun; Ummi Raba'ah Hashim; Sabrina Ahmad; Lizawati Salahuddin; Ngo Hea Choon; Kasturi Kanchymalay; Nur Haslinda Ismail
International Journal of Advances in Intelligent Informatics Vol 7, No 2 (2021): July 2021
Publisher : Universitas Ahmad Dahlan

DOI: 10.26555/ijain.v7i2.588

Abstract

This study proposes a classification model for timber defect classification based on an artificial neural network (ANN). The research also focuses on determining appropriate parameters for the neural network model to optimize defect identification performance, such as the number of hidden-layer nodes and the number of epochs. The neural network's performance is compared with standard classifiers, namely Naïve Bayes, K-Nearest Neighbours, and the J48 Decision Tree, to find significant differences across multiple timber species. Classifier performance is measured by the F-measure because the timber species dataset is imbalanced. The experimental results show that the proposed neural network model outperforms the other standard classifiers in detecting many types of defects across multiple timber species, with an F-measure of 84.01%. This research demonstrates that an ANN can accurately classify defects across multiple species when appropriate parameters (hidden layers and epochs) are defined for the neural network model.
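The F-measure used above, chosen because of the imbalanced dataset, is the harmonic mean of precision and recall; a minimal sketch with hypothetical per-defect counts:

```python
def f_measure(tp: int, fp: int, fn: int) -> float:
    """Harmonic mean of precision and recall; robust to class imbalance because
    true negatives (the dominant class) never enter the formula."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts for illustration only:
print(round(f_measure(tp=84, fp=16, fn=16), 4))  # 0.84
```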
Reversible difference expansion multi-layer data hiding technique for medical images Pascal Maniriho; Leki Jovial Mahoro; Zephanie Bizimana; Ephrem Niyigaba; Tohari Ahmad
International Journal of Advances in Intelligent Informatics Vol 7, No 1 (2021): March 2021
Publisher : Universitas Ahmad Dahlan

DOI: 10.26555/ijain.v7i1.483

Abstract

Maintaining the privacy and security of confidential information in data communication has always been a major concern, because advances in information technology are likely to be followed by increases in cybercrime such as illegal access to sensitive data. Several techniques have been proposed to overcome this issue, for example, hiding data in digital images. Reversible data hiding is an excellent approach for concealing private data due to its applicability in various fields. However, it faces a trade-off between the payload and the quality of the image holding the data (the stego image); consequently, these two factors may not be addressed simultaneously. This paper addresses this problem by introducing a new low-complexity, block-based, reversible multi-layer data hiding technique constructed by exploiting difference expansion (DE). Sensitive data are embedded into the difference values calculated between the original pixels in each block with relatively low complexity. To improve the payload capacity, confidential data are embedded in multiple layers of grayscale medical images while preserving their quality. The experimental results show that the proposed technique increases the payload to an average of 369,999 bits and keeps the peak signal-to-noise ratio (PSNR) at an average of 36.506 dB for medical images, while adequately securing the embedded private data. The proposed method improves performance, especially the secret size, without much reduction in quality, and is therefore suitable for relatively large payloads.
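The core difference-expansion step, sketched here following Tian's classic DE scheme that this line of work builds on (the paper's block-based multi-layer variant is its own contribution), is fully reversible:

```python
def de_embed(x: int, y: int, bit: int):
    """Embed one bit into a pixel pair via difference expansion:
    keep the integer average, double the difference, append the bit."""
    l = (x + y) // 2                         # integer average (invariant)
    h = x - y                                # pixel difference
    h2 = 2 * h + bit                         # expanded difference carrying the bit
    return l + (h2 + 1) // 2, l - h2 // 2    # new (stego) pixel pair

def de_extract(x2: int, y2: int):
    """Recover the bit and the original pixel pair exactly (reversibility)."""
    l = (x2 + y2) // 2
    h2 = x2 - y2
    bit, h = h2 & 1, h2 // 2
    return bit, l + (h + 1) // 2, l - h // 2

x2, y2 = de_embed(150, 148, 1)
print((x2, y2), de_extract(x2, y2))  # stego pair, then (1, 150, 148) recovered
```

Note that the average `l` is unchanged by embedding, which is why the cover pixels can be restored bit-exactly; a practical scheme must also handle overflow for pixels near 0 or 255, which this sketch omits.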
Gabor-enhanced histogram of oriented gradients for human presence detection applied in aerial monitoring Anton Louise Pernez De Ocampo; Argel Bandala; Elmer Dadios
International Journal of Advances in Intelligent Informatics Vol 6, No 3 (2020): November 2020
Publisher : Universitas Ahmad Dahlan

DOI: 10.26555/ijain.v6i3.514

Abstract

In UAV-based human detection, the extraction and selection of the feature vector is one of the critical tasks in ensuring optimal performance of the detection system. Although UAV cameras capture high-resolution images, the relative size of human figures renders persons at very low resolution and contrast. Feature descriptors that can adequately discriminate between local symmetrical patterns in a low-contrast image may improve the detection of human figures in vegetative environments. Such a descriptor is proposed in this paper. Initially, the acquired images are fed to a digital processor in a ground station, where the human detection algorithm is performed. Part of the human detection algorithm is the GeHOG feature extraction, in which a bank of Gabor filters generates textured images from the original. The local energy of each cell in the Gabor images is calculated to identify the dominant orientations, and the bins of conventional HOG are enhanced based on the dominant orientation index and the accumulated local energy in the Gabor images. To measure the performance of the proposed features, Gabor-enhanced HOG (GeHOG) and two other recent improvements to HOG, the Histogram of Edge Oriented Gradients (HEOG) and Improved HOG (ImHOG), are used for human detection on the INRIA dataset and on a custom dataset of farmers working in fields, captured via unmanned aerial vehicle. The proposed feature descriptor significantly improved human detection and performed better than the recent improvements to conventional HOG, raising the precision of human detection to 98.23% on the INRIA dataset. The proposed feature can significantly improve human detection in surveillance systems, especially in vegetative environments.
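The conventional HOG step that GeHOG enhances can be sketched for a single cell; the gradient values below are hypothetical, and the Gabor-energy reweighting itself is omitted:

```python
import math

def cell_histogram(gx: list, gy: list, bins: int = 9):
    """Accumulate gradient magnitudes into unsigned orientation bins for one
    HOG cell; the dominant-bin index is what GeHOG reweights with Gabor energy."""
    hist = [0.0] * bins
    for dx, dy in zip(gx, gy):
        mag = math.hypot(dx, dy)
        ang = math.degrees(math.atan2(dy, dx)) % 180        # unsigned orientation
        hist[min(int(ang / (180 / bins)), bins - 1)] += mag
    return hist, hist.index(max(hist))                      # histogram + dominant bin

# Hypothetical gradients for one cell (mostly vertical edges, ~90 degrees):
gx = [0.0, 0.1, 0.0, 1.0]
gy = [2.0, 2.0, 1.5, 0.0]
hist, dominant = cell_histogram(gx, gy)
print(dominant)  # the ~90-degree bin dominates
```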
Constructing decision rules from naive bayes model for robust and low complexity classification Nabeel Hashim Al-Aaraji; Safaa Obayes Al-Mamory; Ali Hashim Al-Shakarchi
International Journal of Advances in Intelligent Informatics Vol 7, No 1 (2021): March 2021
Publisher : Universitas Ahmad Dahlan

DOI: 10.26555/ijain.v7i1.578

Abstract

A large spectrum of classifiers has been described in the literature. One attractive classification technique is Naïve Bayes (NB), which relies on probability theory. NB has two major limitations: first, it requires rescanning the dataset and applying a set of equations each time instances are classified, an expensive step if the dataset is relatively large; second, the inner workings of an NB model may remain challenging for non-statisticians to understand. Rule-based classifiers (RBCs), on the other hand, use IF-THEN rules (henceforth, rule-sets), which are more comprehensible and less complex for classification tasks. To alleviate NB's limitations, this paper presents a method for constructing a rule-set from the NB model, which then serves as an RBC. Experiments on constructing the rule-set were conducted on the Iris, WBC, and Vote datasets. Coverage, accuracy, M-estimate, and Laplace are crucial evaluation metrics that were projected onto the rule-set. On some datasets, the rule-set obtains significant accuracy, reaching 95.33% and 95.17% for the Iris and Vote datasets, respectively. The constructed rule-set can mimic the classification capability of NB, provide a visual representation of the model, and express the rules faithfully with acceptable accuracy, offering an easier way to interpret and adjust the original model. Hence, the rule-set provides a more comprehensible and lightweight model than NB itself.
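The Laplace and M-estimate rule-quality metrics named above have standard closed forms, sketched below; the rule statistics and class prior used are hypothetical:

```python
def laplace(correct: int, covered: int, classes: int = 2) -> float:
    """Laplace accuracy of a rule: (correctly covered + 1) / (covered + k),
    where k is the number of classes; smooths small-coverage rules."""
    return (correct + 1) / (covered + classes)

def m_estimate(correct: int, covered: int, prior: float, m: float = 2.0) -> float:
    """M-estimate: pulls rule accuracy toward the class prior, with m acting
    as the weight of the prior in 'virtual' examples."""
    return (correct + m * prior) / (covered + m)

# Hypothetical statistics for one rule (18 of 20 covered instances correct):
print(laplace(correct=18, covered=20))                 # 19/22
print(m_estimate(correct=18, covered=20, prior=0.3))   # 18.6/22
```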