Meta's New Political Ad Disclosure Policy May Aid India's Crackdown On Deepfakes, Misinformation
Meta says advertisers must disclose when a social issue, election, or political ad on Facebook or Instagram has been “digitally created or altered,” including with AI.

Earlier this week, Rajeev Chandrasekhar, India’s Minister of State for Electronics and Information Technology, called out social media platforms for their failure to handle deepfake content, and warned them of consequences if they did not remove reported fake information within 36 hours of being notified.

This came after the recent deepfake controversy involving actress Rashmika Mandanna. A deepfake of her was circulated on social media, and several big-name celebrities came out in support of the actress and spoke out against AI-driven deepfakes.

While the timing could be coincidental, the combination of the deepfake controversy, the Indian government’s warning, and upcoming state elections in Rajasthan, Madhya Pradesh, and other states appears to have spurred Meta to act. The company has announced that, starting in the new year, advertisers will be required to disclose when a social issue, election, or political advertisement on Meta platforms such as Facebook or Instagram has been “digitally created or altered,” including with AI. The policy will apply globally.

“Advertisers will have to disclose whenever a social issue, electoral, or political ad contains a photorealistic image or video, or realistic sounding audio, that was digitally created or altered,” Meta said. The policy covers any ad that depicts a real person saying or doing something they did not say or do; depicts a realistic-looking person who does not exist or a realistic-looking event that did not happen; alters footage of a real event; or portrays a realistic event that allegedly occurred but is not a true image, video, or audio recording of that event.

Further, Meta added that advertisers do not need to disclose digital changes in an ad if the modifications are minor and do not significantly alter the ad’s message. Such minor adjustments include resizing an image, cropping it, or tweaking elements like color or brightness. However, if the alterations are substantial and do affect the message, advertisers are required to disclose them.

If an advertiser fails to make the required disclosure, Meta will reject the ad. Repeated failures to disclose may result in penalties.

Indian Government’s ‘Legal’ Reminder

It cannot be ascertained whether these changes are a result of the ongoing AI-driven dissemination of misinformation in India and globally, whether through recent instances of deepfakes or the Israel-Hamas conflict. However, it is evident that the Indian government is enforcing stricter regulations on social media platforms, in accordance with India’s new IT rules introduced in April 2023.

On Monday, Rajeev Chandrasekhar served a reminder to platforms that they must “ensure no misinformation is posted by any user,” and also must “ensure that when reported by any user or govt, misinformation is removed in 36 hrs.” If any platform fails to comply with these mandates, “rule 7 will apply and platforms can be taken to court by aggrieved person under provisions of IPC.”
