Google and Apple are cracking down on the Parler social networking app two days after the attack on the US Capitol.
Apple told Parler it’ll ban the social network’s app from its app store if Parler doesn’t start to moderate its content better, BuzzFeed reported Friday. Google removed Parler’s Android app from its Play Store on Friday, saying it’ll remain banished until Parler improves moderation.
“We’re aware of continued posting in the Parler app that seeks to incite ongoing violence in the US,” Google said in a statement Friday. “We recognize that there can be reasonable debate about content policies and that it can be difficult for apps to immediately remove all violative content, but for us to distribute an app through Google Play, we do require that apps implement robust moderation for egregious content.”
Apple sent Parler a warning letter Friday, BuzzFeed reported. “We have received numerous complaints regarding objectionable content in your Parler service, accusations that the Parler app was used to plan, coordinate, and facilitate the illegal activities in Washington D.C. on January 6, 2021 that led (among other things) to loss of life, numerous injuries, and the destruction of property. The app also appears to continue to be used to plan and facilitate yet further illegal and dangerous activities,” Apple reportedly said to Parler. “If we do not receive an update compliant with the App Store Review Guidelines and the requested moderation improvement plan in writing within 24 hours, your app will be removed from the App Store.”
In a Parler post, Chief Executive John Matze challenged Apple’s position and said Apple doesn’t hold Twitter or Facebook to the same standard. “Apparently they believe Parler is responsible for ALL user generated content on Parler,” he said. “By the same logic, Apple must be responsible for ALL actions taken by their phones. Every car bomb, every illegal cell phone conversation, every illegal crime committed on an iPhone, Apple must also be responsible for.”
Apple didn’t immediately respond to a request for comment.
The App Store is the only way to distribute apps to iPhones, so banishment poses a serious challenge to online services. However, they often can still be reached through websites. Indeed, browser makers and web developers have been building technology called progressive web apps (PWAs) designed to give websites all the power of apps, particularly on mobile devices.
Google lets people “sideload” Android apps without going through its Play Store, though the ability is disabled by default.
Banning apps is an example of “deplatforming,” an attempt to curtail disinformation, racist remarks, incitements to violence and other problematic communications. The modern internet provides an abundance of platforms to directly communicate to millions of people, and it’s proved challenging to balance the benefits of online discussion with the drawbacks.
Content crackdown on social media
The biggest example of deplatforming happened Friday when Twitter permanently banned President Donald Trump’s account “due to the risk of further incitement of violence.”
After the insurrection at the Capitol, which led to deaths, vandalism and property damage, not to mention the insult to a national and international symbol of democracy, social media sites have been taking a harder stance against activity they see as dangerous. Facebook suspended Trump’s account “indefinitely.” Other platforms took action against a major right-wing discussion forum and against accounts associated with the right-wing, bogus QAnon conspiracy theory.
In a Friday tweet, Rep. Alexandria Ocasio-Cortez, a prominent New York Democrat, called for Google and Apple to take action after reported calls for violence on Parler.
Parler’s growing importance
Parler is growing in importance to right-wing activists as Twitter, Facebook and Instagram have put the kibosh on Trump’s social media accounts after loyalists stormed the Capitol on Wednesday.
“Our investigation has found that Parler is not effectively moderating and removing content that encourages illegal activity and poses a serious risk to the health and safety of users in direct violation of your own terms of service,” Apple reportedly told Parler, citing a handful of examples purportedly showing violent threats. “Content of this dangerous and harmful nature is not appropriate for the App Store. As you know from prior conversations with App Review, Apple requires apps with user generated content to effectively moderate to ensure objectionable, potentially harmful content is filtered out. Content that threatens the well being of others or is intended to incite violence or other lawless acts has never been acceptable on the App Store.”