This study examines the legal liability of digital platforms in the dissemination of illegal content through a normative legal analysis, with a particular focus on reconstructing a framework that effectively protects citizens’ rights. The rapid growth of digital platforms such as Google, Meta Platforms, and TikTok has intensified challenges related to intermediary responsibility, as these platforms increasingly function not only as passive conduits but also as active amplifiers of information. Existing liability regimes, primarily based on notice-and-takedown mechanisms, are found to be insufficient in addressing the scale and complexity of illegal content dissemination. Using primary, secondary, and tertiary legal materials, this research analyzes current regulatory frameworks and identifies key gaps, particularly in terms of legal clarity, enforcement effectiveness, and the protection of fundamental rights. The study highlights the tension between safeguarding freedom of expression and ensuring protection against harmful content, noting that both under-regulation and over-regulation pose risks to citizens’ rights. Comparative insights, including developments under the Digital Services Act, demonstrate a shift toward more proactive and structured approaches to platform accountability. The study proposes a reconstructed liability framework based on the principles of proportionality, due diligence, transparency, and effective remedy. This model seeks to balance platform accountability with the protection of fundamental rights, offering a more adaptive and rights-oriented approach to digital governance. The findings contribute to the development of legal policies that align technological advancement with the protection of citizens in the digital era.
Copyright © 2026