Content moderation is the process of monitoring, reviewing, and managing user-generated content on platforms to ensure compliance with community guidelines and legal requirements. This tag covers moderation methodologies, ethical dilemmas, and ongoing debates about censorship, bias, and the trade-offs between human moderators and automated systems.