Introduction: Digital radiography has become one of the most important diagnostic techniques in radiology, offering advantages in image quality and efficiency. Objective: This study aims to analyze the effect of tube voltage (kV) and milliampere-seconds (mAs) variations on contrast and density in digital radiographic images. Selecting appropriate exposure factors is a fundamental principle for radiographers in producing quality images. Methods: This study uses an experimental method with two variables: tube voltage variation (46 kV, 50 kV, 55 kV, 60 kV, and 66 kV) at a fixed mAs value (8 mAs), and mAs variation (2 mAs, 4 mAs, 6 mAs, 8 mAs, and 10 mAs) at a fixed tube voltage (50 kV). Results: The study showed that tube voltage variation had a significant effect on image density. A tube voltage of 46 kV produced a minimum density of 0.02 and a maximum of 3.22, while 66 kV produced a minimum density of 1.88 and a maximum of 2.68. Density increases as tube voltage increases, but image contrast tends to decrease, following an exponential relationship with a coefficient of determination of R² = 0.90. In contrast, the mAs variations show that although density increases linearly, image contrast remains unaffected. Conclusion: This study emphasizes the importance of understanding how exposure factors are set, since tube voltage has a greater effect on image quality than mAs. The study offers recommendations for optimal exposure settings in radiographic practice to achieve better results.
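As a minimal illustration of how an exponential density-versus-voltage fit and its R² value can be obtained (this is a sketch, not the authors' analysis code), the snippet below uses SciPy's curve_fit on placeholder density readings; the kV values follow the study design, but the density numbers are assumed for demonstration and should be replaced with the actual densitometer measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Tube voltages from the study design (fixed 8 mAs) and illustrative
# placeholder densities; substitute the measured values before use.
kv = np.array([46.0, 50.0, 55.0, 60.0, 66.0])
density = np.array([1.10, 1.45, 1.80, 2.10, 2.40])  # assumed, not the study's data

# Exponential model of density as a function of tube voltage: D = a * exp(b * kV)
def exp_model(x, a, b):
    return a * np.exp(b * x)

params, _ = curve_fit(exp_model, kv, density, p0=(0.1, 0.05))
predicted = exp_model(kv, *params)

# Coefficient of determination (R^2) of the fitted curve
ss_res = np.sum((density - predicted) ** 2)
ss_tot = np.sum((density - np.mean(density)) ** 2)
r_squared = 1.0 - ss_res / ss_tot

print(f"fit: D = {params[0]:.3f} * exp({params[1]:.4f} * kV), R^2 = {r_squared:.2f}")
```

The same fitting procedure applied to density versus mAs (at a fixed 50 kV) would instead use a linear model, reflecting the linear density increase reported in the results.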