Scholarly databases operate as everyday workspaces for discovery, screening, and citation management, so small usability frictions can compound into substantial research overhead. This study compares the user experience (UX) of five widely used platforms: Web of Science, Scopus, Google Scholar, DBLP, and SciProfiles. UX is treated as task success in realistic research workflows and confidence in record quality, rather than surface-level visual design. An integrated evaluation framework combines ISO 9241 usability principles, a ten-heuristic checklist, and the UX honeycomb model, operationalizing them into eight dimensions and 30 criteria (maximum score 120). Unlike unweighted checklists, the framework uses Analytic Hierarchy Process (AHP) weighting to make trade-offs among UX dimensions explicit and to support consistent cross-platform benchmarking. Criteria cover search and relevance cues, metadata and export reliability, interface consistency, mobile responsiveness, accessibility, and credibility signals. Platforms were assessed through task-based testing and a structured review of platform guidance and user feedback, and weighted scores were normalized for comparison. Results show a likely advantage for subscription-based systems in precision search and metadata handling: Web of Science ranks highest, followed by Scopus and Google Scholar, while DBLP and SciProfiles score lower yet remain useful for niche needs such as open metadata access and profile-oriented discovery. The framework can be reused as a rubric for training, platform selection, and periodic UX audits.
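The following is a minimal sketch of the AHP weighting and normalization step described above, assuming each of the 30 criteria is scored 0-4 (30 × 4 = 120) and using a hypothetical 3 × 3 pairwise comparison matrix for three of the eight dimensions; the judgment values, the NumPy-based implementation, and the example subtotals are illustrative, not the study's actual data.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three UX dimensions
# (e.g., search relevance, metadata reliability, accessibility).
# Entries use Saaty's 1-9 scale; the matrix is reciprocal: A[j, i] = 1 / A[i, j].
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# AHP weights: the principal right eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights /= weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1) and CR = CI / RI,
# where RI is Saaty's random index (0.58 for n = 3); CR < 0.10 is conventionally acceptable.
n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)
CR = CI / 0.58
print(f"weights = {np.round(weights, 3)}, CR = {CR:.3f}")

# Weighted, normalized platform score: dimension subtotals (assumed 0-4 scale
# per criterion) are rescaled to [0, 1] before weighting, so platforms with
# different dimension sizes remain comparable.
raw = np.array([3.5, 3.0, 2.0])   # hypothetical dimension subtotals
score = float(weights @ (raw / 4.0))
print(f"normalized weighted score = {score:.3f}")
```

For this illustrative matrix the weights come out roughly (0.65, 0.23, 0.12) with CR ≈ 0.003, i.e., a highly consistent set of judgments; in practice, the pairwise comparisons would come from expert elicitation and any matrix with CR ≥ 0.10 would be revised before weighting.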