With the rapid advancement of scientific knowledge, the way people access information has changed significantly. Information needs are now met not only through print media but also through electronic media such as e-books. To address the large size of digital files such as PDFs, data compression techniques are essential. Compression is the process of encoding data in fewer bits than its original representation, so that large files containing many repeated characters can be stored more efficiently. This research evaluates and compares the performance of compression algorithms by compressing PDF files and measuring the results against a set of performance parameters. The study shows that the Levenstein algorithm generally produces smaller compressed files than the Taboo Codes algorithm.
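As a rough illustration of how the Levenstein code assigns short codewords to small, frequently occurring values, the following minimal sketch implements the standard Levenstein universal code for non-negative integers; it assumes the conventional construction of the code, and the function name and output format are illustrative only, not taken from the paper.

```python
def levenstein_encode(n: int) -> str:
    """Return the Levenstein codeword for a non-negative integer as a bit string."""
    if n == 0:
        return "0"
    pieces = []  # binary fragments, innermost first
    c = 1        # step counter (number of leading 1-bits)
    while n > 0:
        bits = bin(n)[2:]     # binary representation of n
        fragment = bits[1:]   # drop the leading '1'
        pieces.append(fragment)
        n = len(fragment)     # next value to encode is the fragment length
        if n > 0:
            c += 1
    # prefix of c ones and a single zero, then the fragments outermost-first
    return "1" * c + "0" + "".join(reversed(pieces))

if __name__ == "__main__":
    # Smaller values receive shorter codewords, which is what makes the code
    # useful for compressing data dominated by frequently repeated symbols.
    for value in [0, 1, 2, 4, 10, 100]:
        code = levenstein_encode(value)
        print(f"{value:>4} -> {code} ({len(code)} bits)")
```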