Purpose – The increasing adoption of web-based E-learning platforms has introduced significant performance challenges under high user load, including elevated latency, degraded response times, and service downtime. Although web caching and load balancing are commonly employed as mitigation strategies, empirical guidance on which technique better suits E-learning traffic characteristics remains limited. This study provides a direct, controlled comparison of the two approaches.

Design/methods/approach – An experimental methodology was employed using a Moodle-based E-learning platform deployed in a controlled virtualized environment. Web caching was implemented with Varnish Cache and load balancing with HAProxy, each deployed on separate virtual machines serving the same Moodle installation. Apache JMeter was used to simulate concurrent workloads ranging from 50 to 250 users, measuring latency, throughput, server response time, and failure rate.

Findings – The results show that web caching consistently outperformed load balancing across all metrics. At 250 concurrent users, web caching reduced latency by 9.9%, improved throughput by approximately 4.5 times, decreased server response time by 96.6%, and lowered the failure rate by 80.2% relative to load balancing. These findings indicate that caching more effectively mitigates backend overload in E-learning systems dominated by repetitive content access.

Research implications/limitations – The experiments were conducted in a virtualized environment and focused primarily on static and semi-static content. Hybrid architectures combining both techniques were not evaluated.

Originality/value – This study provides a head-to-head empirical comparison of web caching and load balancing under identical conditions, offering practical architectural guidance for infrastructure planning in academic environments.
Copyright © 2026