A fundamental issue in the educational ecosystem is the assessment of learning outcomes. In preparing teachers for the Teacher Professional Education Competency Test (UKPPPG), tryout tests serve as strategic instruments. For the Marketing Education Program, these test items must reflect the required pedagogical and professional competencies, particularly Pedagogical Content Knowledge (PCK), which integrates content mastery with teaching strategies. This study analyzes the quality of PCK-based tryout test items used to prepare Marketing Education Program teachers for the UKPPPG. Using a quantitative-descriptive approach with 108 participants (36 male, 72 female), data were analyzed with SPSS 31, focusing on item validity (point-biserial correlation), reliability (Cronbach's α), discrimination index (D), difficulty level (p), and distractor effectiveness. Results indicate sufficient reliability (α = 0.760), although item quality varied: items were classified as valid, marginal, or invalid, with particular weakness in the early indicators. Discrimination indices were mostly "fair"; several items rated "good" (D ≥ 0.40) are suitable for retention, while "poor" items (D < 0.20) require revision or replacement. The difficulty distribution (easy 34%, medium 37%, difficult 29%) was unbalanced for summative testing, with a relatively large share of easy items that reduces the instrument's power to discriminate. Distractor analysis revealed an average of 2–3 functioning distractors per item (66%), although some distractors were implausible and required revision. These findings highlight the need for systematic selection and revision of items (stem, key, and distractors), rebalancing of difficulty levels, and repeated pilot testing so that the instrument achieves higher validity, reliability, and representativeness of PCK-based teacher competencies in 21st-century marketing education.
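To make the item-analysis metrics concrete, the sketch below shows how difficulty (p), point-biserial validity, Cronbach's α, and the discrimination index (D) can be computed from a dichotomously scored response matrix in Python. This is an illustrative reconstruction under stated assumptions, not the authors' SPSS procedure: the response matrix is randomly generated, the number of items is arbitrary, and the 27% upper/lower split for D is a common convention assumed here.

```python
# Minimal sketch of classical item analysis on a 0/1 response matrix
# (rows = examinees, columns = items). Data and parameters are placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
X = (rng.random((108, 40)) > 0.4).astype(int)   # placeholder scored responses

total = X.sum(axis=1)
k = X.shape[1]

# Difficulty level p: proportion of examinees answering each item correctly.
p = X.mean(axis=0)

# Validity: (uncorrected) point-biserial correlation of each item with the total score.
r_pbis = np.array([stats.pointbiserialr(X[:, j], total)[0] for j in range(k)])

# Reliability: Cronbach's alpha = k/(k-1) * (1 - sum of item variances / total variance).
alpha = (k / (k - 1)) * (1 - X.var(axis=0, ddof=1).sum() / total.var(ddof=1))

# Discrimination index D: proportion correct in the upper 27% group
# minus proportion correct in the lower 27% group (assumed convention).
g = int(round(0.27 * X.shape[0]))
order = np.argsort(total)
D = X[order[-g:]].mean(axis=0) - X[order[:g]].mean(axis=0)

print(f"alpha = {alpha:.3f}")
for j in range(k):
    print(f"item {j + 1:2d}: p = {p[j]:.2f}  r_pbis = {r_pbis[j]:.2f}  D = {D[j]:.2f}")
```

Distractor effectiveness cannot be derived from a scored 0/1 matrix alone; it requires option-level responses, with a distractor typically considered functional when it is chosen by at least about 5% of examinees.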