This paper introduces a texton-based statistical descriptor, the Multi Texton Co-occurrence Histogram (MTCH), for batik image representation. MTCH serves as a robust batik image descriptor that encapsulates a comprehensive range of visual features, capturing the intricate interplay of color, texture, shape, and statistical attributes. The research evaluates the effectiveness of MTCH on two well-established public batik datasets, Batik 300 and Batik Nitik 960, which serve as benchmarks for both classification and image retrieval tasks. In the classification domain, four distinct scenarios were explored, employing the K-Nearest Neighbors (K-NN), Support Vector Machine (SVM), Decision Tree (DT), and Naïve Bayes (NB) classifiers; each was rigorously tested to determine how accurately it identifies batik patterns from the MTCH descriptors. The image retrieval tasks, in turn, were conducted using several distance metrics, namely Euclidean distance, City Block, Bray-Curtis, and Canberra, to gauge the retrieval accuracy and the robustness of the MTCH framework in matching similar batik images. The empirical results underscore the superior performance of the MTCH descriptor across all tested scenarios: the evaluation metrics of accuracy, precision, and recall indicate that MTCH not only achieves high classification performance but also excels at retrieving images highly similar to the query. These findings suggest that MTCH is a highly effective tool for batik image analysis, offering significant potential for applications in cultural heritage preservation, textile pattern recognition, and automated batik classification systems.
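As an illustration of the retrieval step, the four distance metrics named above can be sketched over two feature histograms. This is a minimal sketch using standard definitions of the metrics; the example vectors are hypothetical stand-ins for MTCH descriptors, not values from the Batik datasets:

```python
import numpy as np

def euclidean(x, y):
    # Square root of the sum of squared bin-wise differences.
    return float(np.sqrt(np.sum((x - y) ** 2)))

def city_block(x, y):
    # Sum of absolute bin-wise differences (Manhattan / L1 distance).
    return float(np.sum(np.abs(x - y)))

def bray_curtis(x, y):
    # L1 difference normalized by the total mass of both histograms.
    return float(np.sum(np.abs(x - y)) / np.sum(np.abs(x + y)))

def canberra(x, y):
    # Bin-wise normalized absolute difference; 0/0 terms contribute 0.
    num = np.abs(x - y)
    den = np.abs(x) + np.abs(y)
    terms = np.divide(num, den, out=np.zeros_like(num, dtype=float),
                      where=den != 0)
    return float(np.sum(terms))

# Hypothetical query and database histograms (4 bins for brevity).
query = np.array([0.2, 0.1, 0.0, 0.7])
db_img = np.array([0.1, 0.1, 0.3, 0.5])

for name, fn in [("Euclidean", euclidean), ("City Block", city_block),
                 ("Bray-Curtis", bray_curtis), ("Canberra", canberra)]:
    print(f"{name}: {fn(query, db_img):.4f}")
```

In a retrieval system, the database image with the smallest distance to the query descriptor is returned as the best match; each metric weights bin differences differently (Canberra, for instance, emphasizes differences in low-valued bins).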
Copyright © 2024