- Meta has outlined its approach to tackling electoral misinformation and addressing new challenges like generative AI in the upcoming elections.
- Key elements of Meta’s updated civic protection approach include requiring political advertisers to disclose the use of AI or digital techniques, blocking new political ads during the final week of the U.S. election campaign, and continuing to combat hate speech and coordinated inauthentic behavior.
- Meta has invested significantly in safety and security measures, with large teams working on fact-checking and moderation, and says its goal is to enable relevant political discussion while preventing its tools from being manipulated.
Meta’s Updated Approach to Tackling Electoral Misinformation
Meta has reiterated its commitment to addressing electoral misinformation and combating new challenges like generative AI in preparation for the upcoming elections. The company’s President of Global Affairs, Nick Clegg, has highlighted three key elements of Meta’s updated approach:
- Political advertisers will be required to disclose the use of AI or other digital techniques in creating or altering political or social issue ads. Failure to comply with this requirement may result in penalties.
- New political, electoral, and social issue ads will be blocked during the final week of the U.S. election campaign, preventing last-minute claims that cannot be contested or fact-checked before voting. The measure is also intended to limit the risk of deepfake content being used to mislead voters.
- Meta’s moderation teams will continue to combat hate speech and coordinated inauthentic behavior, removing the worst examples and labeling posts from state-controlled media to keep political messaging transparent.
Meta has significantly expanded its moderation efforts, particularly around political influence and interference. The company says it has invested more than $20 billion in safety and security measures and has built the largest independent fact-checking network of any platform. Its stated aim is to enable relevant political discussion while preventing its tools from being manipulated.