The rapid advancement of artificial intelligence (AI) technology has triggered a significant surge in computational resource consumption, particularly in the inference phase driven by text prompting. This study evaluates the resource efficiency of various AI prompting techniques based on Green Computing principles, which emphasize minimizing energy consumption and maximizing the computational sustainability of digital systems. The research employs a qualitative descriptive literature review method, collecting data from 15 scientific publications sourced from the Google Scholar, IEEE Xplore, and Semantic Scholar databases and covering the period 2020–2025. The analysis demonstrates that the choice of prompting technique directly influences token count, response latency, and energy consumption. Zero-shot prompting exhibits the lowest energy footprint (0.38 mWh per query), while Chain-of-Thought and ReAct prompting, although yielding higher accuracy (91–93%), consume up to five times more energy. The findings indicate that applying prompt compression, context pruning, and model right-sizing strategies, aligned with Green Computing principles, can reduce energy usage by 40–60% without substantially sacrificing output quality. This research contributes a practical framework for designing energy-aware AI prompting strategies suitable for sustainable computing environments.
Copyright © 2026