Because data volumes are expanding at an increasingly rapid pace, simply adding storage space is not a sustainable long-term solution. The growing need for storage media can instead be addressed with lossless compression, which reduces the size of stored data while allowing complete restoration. Huffman coding remains a potent compression method, functioning as a "back end" process and serving as the foundational algorithm in applications such as PKZIP, WinZip, 7-Zip, and Monkey's Audio. Lossless compression of 16-bit audio requires adjustments to the binary structure to balance speed against the optimal compression ratio. This work uses a 4-ary Huffman tree branching procedure to generate binary codes, inserting at most two dummy symbols with a binary value of 0: if the number of data symbols MOD 3 leaves a remainder of 2, two dummy symbols are added; if the remainder is 0, one dummy symbol is added; and if the remainder is 1, none is required. This padding maintains a high compression ratio while speeding up the 4-ary Huffman algorithm's compression time. Based on calculations, testing, and comparison with other Huffman code variants, the results show an efficiency of 95.94%, a compression ratio of 38%, and a number of tree levels reduced to one-third. The 4-ary algorithm significantly optimizes archived data storage, reducing redundancy to 0.124 and achieving an entropy value of 2.91 across various data types.
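As a minimal illustration of the padding rule described above (a sketch, not the authors' implementation), the following Python code adds zero-frequency dummy symbols according to the MOD 3 condition, builds a 4-ary Huffman tree by merging four nodes at a time, and maps each of the four branches to a 2-bit binary pair; names such as quaternary_huffman_codes and the '_dummy_' marker are illustrative assumptions.

```python
import heapq
from collections import Counter
from itertools import count

def dummy_count(num_symbols, arity=4):
    # A full q-ary Huffman tree needs (n - 1) mod (q - 1) == 0.
    # For q = 4 this matches the rule in the abstract:
    # n MOD 3 = 2 -> 2 dummies, n MOD 3 = 0 -> 1 dummy, n MOD 3 = 1 -> 0 dummies.
    remainder = (num_symbols - 1) % (arity - 1)
    return 0 if remainder == 0 else (arity - 1) - remainder

def quaternary_huffman_codes(data, arity=4):
    freq = Counter(data)
    tiebreak = count()  # stable ordering for equal frequencies
    # Pad with zero-frequency dummy symbols so every merge takes exactly `arity` nodes.
    heap = [(f, next(tiebreak), sym) for sym, f in freq.items()]
    for i in range(dummy_count(len(freq), arity)):
        heap.append((0, next(tiebreak), ('_dummy_', i)))
    heapq.heapify(heap)

    # Repeatedly merge the `arity` least-frequent nodes into one parent node.
    while len(heap) > 1:
        children = [heapq.heappop(heap) for _ in range(arity)]
        merged = (sum(c[0] for c in children), next(tiebreak),
                  [c[2] for c in children])
        heapq.heappush(heap, merged)

    # Walk the tree; each branch index 0..3 becomes a 2-bit binary pair.
    codes = {}
    def assign(node, prefix):
        if isinstance(node, list):
            for i, child in enumerate(node):
                assign(child, prefix + format(i, '02b'))
        elif not (isinstance(node, tuple) and node[0] == '_dummy_'):
            codes[node] = prefix
    assign(heap[0][2], '')
    return codes

# Example: 5 distinct symbols, so 5 MOD 3 = 2 and two dummies are inserted.
print(quaternary_huffman_codes("abracadabra"))
```

Because each 4-way branch carries two bits, the resulting codes remain plain binary strings while the tree itself has fewer levels than a binary Huffman tree over the same symbols.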