Introduction: The high power consumption of computing devices poses both economic and environmental challenges in the digital era. This study aims to optimize power usage with machine learning, maintaining device performance while reducing energy costs and carbon emissions.

Methods: The Random Forest algorithm was selected for its robustness in handling non-linear interactions among features. A dataset containing historical power consumption, workload metrics, environmental conditions, and hardware configurations was collected from sensors and logs. Data pre-processing included cleaning, normalization, and feature selection. The model was trained and evaluated using accuracy, precision, recall, F1-score, MAE, and RMSE. Hyperparameter tuning via grid search, random search, and Bayesian optimization was applied to enhance model performance. The tuned model was deployed on real devices to test energy optimization under varied workloads.

Results: The Random Forest model achieved 92% accuracy and an RMSE of 0.15. Tuning reduced RMSE by 10% and improved the F1-score from 0.875 to 0.905. Deployment on computing devices yielded average power savings of 15–20% across workload scenarios without notable performance degradation (<5%). When scaled to 1,000 servers, the model projected annual reductions of up to 5 tons of CO₂ and operational savings of $50,000.

Conclusions: Machine learning, particularly Random Forest, proves effective in optimizing power consumption on computing devices. The proposed approach not only ensures computational efficiency but also promotes environmental sustainability. These findings support further exploration of ML-based solutions for green technology initiatives in IT infrastructure.
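The train-tune-evaluate loop described in Methods can be sketched as follows. This is a minimal sketch assuming a scikit-learn setup; the feature columns and the synthetic data are hypothetical stand-ins for the study's sensor and log dataset, and only grid search is shown (random search and Bayesian optimization follow the same pattern).

```python
# Hedged sketch of the Methods pipeline: pre-processing, Random Forest
# training, grid-search tuning, and MAE/RMSE evaluation. Feature names
# and data are illustrative, not the study's actual dataset.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(0)

# Stand-in for workload, environmental, and hardware features
# (e.g. cpu_load, ambient_temp, fan_rpm, clock_freq -- hypothetical).
X = rng.normal(size=(500, 4))
# Simulated power draw in watts, driven mostly by the first two features.
y = 50 + 30 * X[:, 0] + 5 * X[:, 1] + rng.normal(scale=2, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Pre-processing: normalization (cleaning and feature selection omitted).
scaler = StandardScaler().fit(X_train)
X_train_s = scaler.transform(X_train)
X_test_s = scaler.transform(X_test)

# Hyperparameter tuning via grid search over a small illustrative grid.
grid = GridSearchCV(
    RandomForestRegressor(random_state=0),
    param_grid={"n_estimators": [100, 200], "max_depth": [None, 10]},
    cv=3,
)
grid.fit(X_train_s, y_train)

# Evaluate the tuned model with the regression metrics named in Methods.
pred = grid.predict(X_test_s)
mae = mean_absolute_error(y_test, pred)
rmse = mean_squared_error(y_test, pred) ** 0.5
print(f"best params: {grid.best_params_}  MAE={mae:.2f}  RMSE={rmse:.2f}")
```

The classification metrics (accuracy, precision, recall, F1) reported in Results would instead come from a classifier variant of the same pipeline, e.g. one predicting discrete power states.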