Recommended actions
- For technology platforms and providers: Prepare systems to comply with the COMELEC's takedown requests and enhance monitoring for violations involving AI, bots and disinformation on your platforms.
- For candidates and political parties: Register all digital campaign platforms (e.g., social media accounts, websites, podcasts) with the COMELEC by 13 December 2024.
- Ensure transparency: Disclose AI use in campaign materials and include clear, visible disclaimers.
- Audit your content to prevent the use of bots, fake accounts or AI-generated media to spread disinformation.
In depth
The COMELEC's Resolution No. 11064 introduces a new framework to regulate digital election campaigns and counter disinformation on social media and other digital platforms. These guidelines affect both political candidates and technology platforms and providers hosting election-related content.
Platform registration
All official digital campaign platforms (e.g., social media accounts, websites, podcasts) must be registered with the COMELEC within 30 calendar days after the filing of the certificates of candidacy, or by 13 December 2024. This requirement applies to candidates, political parties and third-party entities managing digital campaign platforms that promote or oppose candidates.
Failure to register could lead to takedown requests, criminal penalties or disruption of campaign activities.
Technology platforms and providers, such as social media companies, may be required to assist in ensuring that unregistered digital campaign platforms are removed, taken down or blocked upon the COMELEC's request.
Transparency in AI use
Campaign materials that make use of AI technology must include disclosures or disclaimers that are clear, conspicuous and not easily removed or altered, and that remain visible or audible for a sufficient duration both before and after the campaign material is presented. These disclosures must specify the nature of the AI's involvement, including how the content was manipulated, and must include a statement confirming that prior consent was obtained from all individuals depicted in the AI-manipulated campaign material.
Additionally, the COMELEC mandates the use of digital technologies (e.g., watermarks) designed to ensure the authenticity and provenance of the digital content.
Technology platforms and providers are required to ensure proper disclosure once the COMELEC notifies them that election-related content has been manipulated using AI. They may also need to respond to the COMELEC's requests to remove noncompliant content.
Failure to disclose AI use could result in takedowns, fines or legal action against the responsible parties.
Prohibited acts and penalties
The COMELEC has also outlined several actions that will be considered violations of election law, namely:
- Use of "false amplifiers" (e.g., fake accounts, bots) to spread disinformation and misinformation in endorsing or campaigning against a candidate, a political party/coalition or party-list organizations, or to propagate disinformation and misinformation targeting the Philippine election system, the COMELEC and electoral processes during the election and campaign period
- Coordinated inauthentic behavior and utilization of hyperactive users for the abovementioned purposes
- Creation and dissemination of deepfakes, cheapfakes and soft fakes for the abovementioned purposes
- Use of fake and unregistered social media accounts during the election and campaign period for the abovementioned purposes
- Creation and dissemination of fake news in furtherance of the abovementioned purposes
- Use of content produced through AI technology without complying with the transparency and disclosure requirements under the resolution
Key risks for noncompliance (for technology platforms and providers)
- Takedown requests for noncompliant content or digital campaign platforms, which could affect platform operations during the election period
- Legal liabilities for failing to act on the COMELEC's takedown orders or enabling disinformation campaigns
- Reputational damage from hosting or failing to detect disinformation and violations, impacting user trust
Next steps for immediate compliance
- For technology platforms and providers:
- Strengthen content monitoring systems for AI-generated media, bots and coordinated disinformation.
- Prepare for potential takedown requests from the COMELEC by establishing streamlined response protocols.
- Ensure that platform terms of service are updated to align with the COMELEC's guidelines on election-related content transparency.
- For candidates and political parties:
- Register your digital campaign platforms with the COMELEC by 13 December 2024.
- Review and disclose any AI-generated content in campaign materials.
- Ensure that your digital content complies with the COMELEC's disinformation regulations to avoid takedown requests or penalties.
Key takeaways
- Compliance is critical for both candidates and technology platforms and providers. Register digital platforms, disclose AI use and monitor content for disinformation.
- Technology platforms and providers must be prepared to respond to the COMELEC's takedown requests and implement robust monitoring systems for election-related content.
- Avoid legal risks: Both candidates and technology platforms and providers must comply with the COMELEC's new rules to avoid fines, takedown requests or criminal charges.
Ensuring compliance with these new regulations is essential to avoid legal risks and protect the integrity of digital platforms and political campaigns.
For further guidance on these changes and how they may affect your operations, please refer to the Contact Us section to reach out to our team of legal experts at Quisumbing Torres.
* * * * *

Please contact QTInfoDesk@quisumbingtorres.com for inquiries.