This study presents the implementation of Transfer Learning using the AlexNet architecture for classifying Yogyakarta Batik motifs, specifically Kawung, Parang, and Truntum. The dataset consists of 1,110 batik images that underwent preprocessing, augmentation, and data splitting. The research was conducted in two main experimental scenarios: (1) applying Average Pooling or Max Pooling combined with a more complex classifier, and (2) training the model without additional pooling layers using a simpler classifier. Furthermore, the experiments compared the model’s performance across different numbers of frozen layers and two optimizers (Adam and SGD). The results show that the best configuration used the SGD optimizer with Average Pooling and three frozen layers, achieving a test accuracy of 99.10%. In contrast, the Adam optimizer tended to produce lower and less stable performance. Experiments without additional pooling also reached high accuracy but remained below the pooled configurations. This study therefore demonstrates that the choice of pooling technique, classifier complexity, number of frozen layers, and optimizer plays a crucial role in achieving optimal AlexNet performance for Batik classification.
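For context, the sketch below shows one way the best-performing configuration (pretrained AlexNet, three frozen convolutional layers, Average Pooling, SGD) could be set up in PyTorch. The backbone, freezing, pooling, and optimizer choices follow the abstract; the specific layer indices, classifier head sizes, learning rate, and momentum are illustrative assumptions, not values reported in the study.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load an ImageNet-pretrained AlexNet as the transfer-learning backbone.
model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)

# Freeze the first three convolutional layers. In torchvision's AlexNet,
# the Conv2d modules sit at features[0], [3], [6], [8], [10]; mapping
# "three frozen layers" to the first three convs is an assumption.
conv_layers = [m for m in model.features if isinstance(m, nn.Conv2d)]
for conv in conv_layers[:3]:
    for param in conv.parameters():
        param.requires_grad = False

# Use Average Pooling between the feature extractor and the classifier,
# then attach a head for the three Batik classes (Kawung, Parang,
# Truntum). The hidden size and dropout rate are illustrative.
model.avgpool = nn.AdaptiveAvgPool2d((6, 6))
model.classifier = nn.Sequential(
    nn.Dropout(0.5),
    nn.Linear(256 * 6 * 6, 512),
    nn.ReLU(inplace=True),
    nn.Linear(512, 3),
)

# SGD over the trainable (non-frozen) parameters only; the learning rate
# and momentum below are assumed values, not taken from the paper.
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad),
    lr=1e-3,
    momentum=0.9,
)
```

Freezing early convolutional layers keeps the generic low-level filters learned on ImageNet intact, so gradient updates only adapt the later layers and the new classifier head to the Batik motifs, which is the standard rationale behind the frozen-layer comparison described above.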