Essay exams evaluate a person's understanding and interpretation of the material they have studied, rather than merely testing factual recall, yet they are still commonly graded by hand. The disadvantage of manual assessment is that it is prone to errors due to variation between examiners, and because the number of answers to be graded is often large, it takes significant time. This research therefore developed an application with an automatic answer-correction model based on cosine similarity, which measures how similar or how distant two vectors are in a multidimensional space. Testing the system on essay responses demonstrated its potential in educational contexts. To improve accuracy and usability, future developments could adopt more sophisticated text-scoring algorithms and provide additional features. This study highlights the importance of automated grading systems for optimizing essay scoring in educational settings while maintaining scalability and reliability.

Keywords: essay exams, automated grading, essay checker, cosine similarity
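The abstract describes scoring answers by the cosine similarity of their vector representations. As a minimal illustrative sketch (the paper's actual vectorization and preprocessing are not specified here; simple term-frequency vectors and the example texts below are assumptions), the idea can be expressed as:

```python
import math
from collections import Counter

def cosine_similarity(text_a: str, text_b: str) -> float:
    """Cosine similarity between two texts using simple term-frequency vectors.

    Illustrative only; the study's model may use a different vectorization
    (e.g. TF-IDF or embeddings) and more thorough text preprocessing.
    """
    vec_a = Counter(text_a.lower().split())
    vec_b = Counter(text_b.lower().split())
    # Dot product over the shared vocabulary
    dot = sum(vec_a[t] * vec_b[t] for t in vec_a.keys() & vec_b.keys())
    norm_a = math.sqrt(sum(c * c for c in vec_a.values()))
    norm_b = math.sqrt(sum(c * c for c in vec_b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

# Hypothetical answer key and student response for illustration
key = "photosynthesis converts light energy into chemical energy"
answer = "plants use photosynthesis to convert light into chemical energy"
score = cosine_similarity(key, answer)  # a value between 0 and 1
```

A higher score indicates greater lexical overlap with the answer key; a grading threshold or score mapping would then turn this value into marks.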
Copyright © 2024