Apple follows Google in banning Parler app

Hands holding a smartphone displaying the Parler logo
Olivier Douliery / Getty Images

Apple and Google banned the Parler social networking app from their respective app stores after Wednesday's attack on the US Capitol by Trump supporters. Parler has been full of violent comments since the attack.

Apple pulled the app from the App Store on Saturday, saying Parler did not adequately police content published by its users.

Apple said in a statement that it has "always supported diverse points of view being represented on the App Store, but there is no place on our platform for threats of violence and illegal activity." The company added: "Parler has not taken adequate measures to address the proliferation of these threats to people's safety. We have suspended Parler from the App Store until they resolve these issues."

A day earlier, Google removed Parler's Android app from the Play Store, saying it would remain banned until Parler improved its moderation.

"We're aware of continued posting in the Parler app that seeks to incite ongoing violence in the US," Google said in a statement Friday. "We recognize that there can be reasonable debate about content policies and that it can be difficult for apps to immediately remove all violating content, but for us to distribute an app through Google Play, we do require that apps implement robust moderation for egregious content."


The App Store is the only sanctioned way to distribute apps to iPhones, so the ban poses a serious challenge for the online service. Parler can still be reached through its website, however. In fact, browser makers and web developers are building a technology called progressive web apps (PWAs), designed to give websites the full power of native apps, especially on mobile devices.
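As a sketch of what the PWA route involves: a website declares itself installable with a web app manifest, a small JSON file the browser reads to offer an app-like home-screen install, no app store required. The file name and field values below are illustrative, not Parler's actual configuration:

```json
{
  "name": "Example Social App",
  "short_name": "Example",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#b1121a",
  "icons": [
    { "src": "/icons/icon-192.png", "sizes": "192x192", "type": "image/png" },
    { "src": "/icons/icon-512.png", "sizes": "512x512", "type": "image/png" }
  ]
}
```

The page links the manifest with `<link rel="manifest" href="/manifest.json">`; paired with a registered service worker, browsers such as Chrome can then prompt the user to install the site as an app, bypassing app-store review entirely.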

Google does allow users to "sideload" Android apps without going through the Play Store, though that ability is disabled by default.

The bans are an example of "deplatforming," an attempt to curb disinformation, racist speech, incitements to violence and other problematic communication. The modern internet gives anyone a platform to communicate directly with millions of people, and it has proven difficult to balance the benefits of online discussion with its downsides.

Apple sent Parler a warning on Friday, according to BuzzFeed. "We have received numerous complaints regarding objectionable content in your Parler service, accusations that the Parler app was used to plan, coordinate and facilitate the illegal activities in Washington, DC, on January 6, 2021, that led (among other things) to loss of life, numerous injuries and the destruction of property. The app also appears to continue to be used to plan and facilitate yet further illegal and dangerous activities," Apple told Parler. "If we do not receive an update compliant with the App Store Review Guidelines and the requested moderation improvement plan in writing within 24 hours, your app will be removed from the App Store."

In a follow-up letter to Parler's developers on Saturday, the iPhone maker said it was still seeing unacceptable content on Parler.

“In your response, you stated that Parler has ‘taken this content very seriously for weeks’,” Apple wrote. “However, Parler’s processes to mitigate or prevent the spread of dangerous and illegal content were insufficient. In particular, we continued to find direct threats of violence and calls to encourage unlawful action.”

And a moderation plan Parler put forward did not satisfy Apple.

"Your response also references a moderation plan 'for the time being,' which does not meet the ongoing requirements of the App Store guidelines," Apple said. "While there is no perfect system to prevent all dangerous or hateful user content, apps must have robust content moderation plans in place to proactively and effectively address these issues. A temporary 'task force' is not a sufficient response given the widespread proliferation of harmful content."

Parler did not immediately respond to a request for comment on Apple’s ban on Saturday.

Parler Chief Executive John Matze challenged Apple's position in a Parler post Friday, saying Apple doesn't hold Twitter or Facebook to the same standard. "They seem to believe that Parler is responsible for ALL user-generated content on Parler," he said. "By the same logic, Apple should be responsible for ALL the actions of its phones. Every car bomb, every illegal cell phone call, every illegal crime committed on an iPhone, Apple should also be responsible for."

Apple did not respond to a request for comment on Matze’s words.

Read more: Will Trump be impeached a second time? Or will the 25th Amendment be invoked?

Cracking down on social media content

In the biggest example of deplatforming, Twitter permanently suspended President Donald Trump's account, citing "the risk of further incitement of violence."

Twitter permanently suspended President Donald Trump's Twitter account on Jan. 8, 2021.


Screenshot by Stephen Shankland / CNET

Following the deaths, vandalism and property damage of the uprising at the Capitol Building — not to mention the insult to a national and international symbol of democracy — social media sites are taking a tougher stance against activity they deem dangerous. Facebook and Instagram blocked Trump from making new posts "indefinitely." Reddit banned The_Donald, a large right-wing discussion forum, and Twitter banned several high-profile right-wing accounts linked to the false QAnon conspiracy theory.

In a Friday tweet, Rep. Alexandria Ocasio-Cortez, a prominent New York Democrat, urged Google and Apple to take action after calls for violence were reported on Parler.

Parler’s growing importance

Parler's importance to right-wing activists has grown since Twitter, Facebook and Instagram put the kibosh on Trump's social media accounts after his loyalists stormed the Capitol on Wednesday.

"Our investigation has found that Parler is not effectively moderating and removing content that encourages illegal activity and poses a serious risk to the health and safety of users, in direct violation of your own terms of service," Apple told Parler on Friday, citing a handful of examples of what it called violent threats. "This dangerous and harmful content is not appropriate for the App Store. As you know from prior conversations with App Review, Apple requires apps with user-generated content to effectively moderate to ensure objectionable, potentially harmful content is filtered out. Content that threatens the well-being of others or is intended to incite violence or other lawless acts has never been acceptable on the App Store."
