Apple kicks Parler out of the App Store


Saturday, January 9, 2021

Apple has removed Parler from the App Store for failing to moderate its community. On January 8th, Apple gave the company, which operates a social media network popular with Donald Trump supporters and the far-right, 24 hours to take a tougher stance on content the company said “encourages illegal activity.”

“We have always supported diverse points of view being represented on the App Store, but there is no place on our platform for threats of violence and illegal activity,” a spokesperson for the company told Engadget. “Parler has not taken adequate measures to address the proliferation of these threats to people’s safety. We have suspended Parler from the App Store until they resolve these issues.”

The move comes after Google suspended the app from the Play Store on Friday. Like Apple, Google cited Parler's failure to stop its users from calling for violence as the reason for its decision. Despite the suspensions, both Android and iOS users can continue to use Parler, provided they already have the app installed on their device.

Following the decision, Apple sent Parler the letter below.

To the developers of the Parler app,

Thank you for your response regarding dangerous and harmful content on Parler. We have determined that the measures you describe are inadequate to address the proliferation of dangerous and objectionable content on your app.

Parler has not upheld its commitment to moderate and remove harmful or dangerous content encouraging violence and illegal activity, and is not in compliance with the App Store Review Guidelines.

In your response, you referenced that Parler has been taking this content “very seriously for weeks.” However, the processes Parler has put in place to moderate or prevent the spread of dangerous and illegal content have proved insufficient. Specifically, we have continued to find direct threats of violence and calls to incite lawless action in violation of Guideline 1.1 - Safety - Objectionable Content.

Your response also references a moderation plan “for the time being,” which does not meet the ongoing requirements in Guideline 1.2 - Safety - User Generated Content. While there is no perfect system to prevent all dangerous or hateful user content, apps are required to have robust content moderation plans in place to proactively and effectively address these issues. A temporary “task force” is not a sufficient response given the widespread proliferation of harmful content.

For these reasons, your app will be removed from the App Store until we receive an update that is compliant with the App Store Review Guidelines and you have demonstrated your ability to effectively moderate and filter the dangerous and harmful content on your service.


App Review Board


January 9, 2021 at 08:54PM, by Khareem Sudlow