Google is going after advertisers who continue to promote apps that can generate nude images. The company has long maintained a strict policy against such ads, but starting this month it is ready to penalise violators more heavily by blocking both the ads and the apps that promote these services. The changes take effect at the end of May, after which ads of this nature will be removed and the offending apps taken off the Play Store.
Google Blocking Ads From May 30: What The Rule Says
Google has tweaked its content and app store policies in recent times, but more effort is required to tackle newer problems. AI-generated images and content are spreading like wildfire, and companies such as Apple and Google are trying to contain the risks posed by manipulated content.
The iPhone maker is also dealing with apps of this nature and has reportedly taken down several apps that can be used to create AI-generated explicit images. Reports claim these apps were able to produce explicit AI images of a person without their consent.
Apps like these flout basic rules while tempting users with the promise of AI-based editing, face swaps and more. Apple discovered some of them on the App Store after they were marketed through ads on Instagram.
Google’s new rules work to the same effect, but the company will rely on a mix of human review and automation to clean things up, an approach it says has already resulted in the removal of 1.8 billion ads that violated its content policies. Apps that generate AI-based explicit images tend to market themselves as clean platforms to avoid scrutiny from Google, while promoting explicit content elsewhere.