Navigating complex educational websites poses challenges for users seeking specific information. This research addresses the problem of efficient information search on closed-domain educational platforms, focusing on the Universitas Indonesia website. Leveraging Natural Language Processing (NLP), we explore the effectiveness of transfer learning models for closed-domain question answering (QA). The performance of three BERT-based models, namely IndoBERT, RoBERTa, and XLM-RoBERTa, is compared under transfer learning and non-transfer learning scenarios. Our results reveal that transfer learning significantly improves QA model performance: models trained under the transfer learning scenario achieved up to a 4.91\% improvement in F1 score over those trained under the non-transfer learning scenario, and XLM-RoBERTa base outperformed all other models with an F1 score of 61.72\%. This study provides valuable insights into Indonesian-language NLP tasks, emphasizing the efficacy of transfer learning for closed-domain QA on educational websites, and advances our understanding of effective information retrieval strategies, with implications for improving user experience and efficiency in accessing information from such sites.
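
As an illustration of the transfer learning scenario summarized above, the minimal sketch below loads a pre-trained XLM-RoBERTa base checkpoint with the Hugging Face \texttt{transformers} library and attaches an extractive QA head. The checkpoint name, example question, and decoding steps are illustrative assumptions, not the exact experimental configuration of this study; in practice the QA head would first be fine-tuned on the closed-domain Indonesian dataset.

\begin{verbatim}
# Illustrative sketch only: checkpoint name and example text are assumptions,
# not the exact setup used in this study.
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

# Transfer learning: start from a pre-trained multilingual checkpoint.
# The span-prediction (QA) head is randomly initialized here and must be
# fine-tuned on the domain QA dataset before its answers are meaningful.
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForQuestionAnswering.from_pretrained("xlm-roberta-base")

question = "Kapan Universitas Indonesia didirikan?"
context = "Universitas Indonesia secara resmi didirikan pada tahun 1950."

# Encode the (question, context) pair and predict an answer span.
inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

start = torch.argmax(outputs.start_logits)
end = torch.argmax(outputs.end_logits)
answer = tokenizer.decode(inputs["input_ids"][0][start:end + 1])
print(answer)
\end{verbatim}

In the non-transfer learning scenario, the same architecture would instead be trained without the benefit of the pre-trained weights' general language knowledge, which is the gap the reported F1 improvement quantifies.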