This study examines Facebook's role in the Rohingya genocide in Myanmar, highlighting the regulatory gaps that enable social media misuse. Using a qualitative case study approach, it explores how Facebook's engagement-driven algorithm amplifies misinformation and hate speech, fueling ethnic violence. Key regulatory gaps include Facebook's inadequate local content moderation, Myanmar's weak legal oversight, and the absence of robust international regulation. Although civil society plays a watchdog role, its limited resources hinder its effectiveness. The study underscores the urgent need for digital governance reforms to close these gaps, curb hate speech, and protect vulnerable communities from social media's harmful impacts.