Amazon, Apple, Google ban Parler app over violent content around Capitol attack
By Stephen Shankland, Edward Moyer, Ian Sherr
Amazon, Apple and Google have banned the Parler social networking app from their respective services and app stores in the wake of Wednesday's attack on the US Capitol by a mob of Trump supporters. Parler has been rife with violent comments since before the attack on the Capitol, and Apple and Google say they'll restore the app only when Parler moderates its service better.
Parler Chief Executive John Matze posted on his service late Saturday that Amazon had informed him it would no longer host his service on its Amazon Web Services platform. The move followed earlier announcements by Apple and Google that they had removed the app from their respective app stores.
"This was a coordinated attack by the tech giants to kill competition in the marketplace," Matze wrote, adding that his service had become "too successful too fast." He didn't initially address his platform's comparatively lax moderation rules or its use by extremists ahead of the Capitol Hill riot. He also didn't mention increasing concerns that social media apps, including Parler, were being used to organize another attack in the coming weeks.
And on Sunday evening, Matze added in a press statement that the company is working to improve moderation to remove prohibited content such as posts that incite or threaten violence. "Parler strives to bring people together and find common ground, peace and healing. We do not condone or accept violence on our platform and we never will," Matze said.
Amazon didn't respond to a request for comment.
Apple, Google banish Parler app
Earlier on Saturday, Apple said in a statement that it had banned Parler from its App Store because it failed to appropriately police content posted by users.
Apple has "always supported diverse points of view being represented on the App Store, but there is no place on our platform for threats of violence and illegal activity," the company said. "Parler has not taken adequate measures to address the proliferation of these threats to people's safety. We have suspended Parler from the App Store until they resolve these issues."
The App Store is the only way to distribute apps to iPhones, so removal poses a serious challenge to online services, though many can still be reached through their websites.
Apple's move followed Google's decision on Friday to remove Parler's Android app from its Play Store for similar reasons.
"We're aware of continued posting in the Parler app that seeks to incite ongoing violence in the US," Google said. "We recognize that there can be reasonable debate about content policies and that it can be difficult for apps to immediately remove all violative content, but for us to distribute an app through Google Play, we do require that apps implement robust moderation for egregious content."
Google's ban won't affect Parler as much as Apple's, because Android users can "sideload" apps without going through the Play Store, though that ability is disabled by default.
Deplatforming a platform
The modern internet provides an abundance of platforms for communicating directly with millions of people, and balancing the benefits of online discussion with the drawbacks has proved challenging.
Matze had posted warnings that his app might be removed from Amazon Web Services after a group of employees called on the company to act. "We cannot be complicit in more bloodshed and violent attacks on our democracy," Amazon employees wrote in a tweet.
Less than a day later, they declared victory. "We demanded Amazon deplatform white supremacists using tech we work on as a bullhorn to incite violence and attack our democracy," the group said.
In Apple's case, the iPhone maker sent Parler a warning letter on Friday, according to BuzzFeed, demanding that the app improve its moderation.
"We have received numerous complaints regarding objectionable content in your Parler service, accusations that the Parler app was used to plan, coordinate, and facilitate the illegal activities in Washington D.C. on January 6, 2021 that led (among other things) to loss of life, numerous injuries, and the destruction of property. The app also appears to continue to be used to plan and facilitate yet further illegal and dangerous activities," Apple reportedly said to Parler. "If we do not receive an update compliant with the App Store Review Guidelines and the requested moderation improvement plan in writing within 24 hours, your app will be removed from the App Store."
In a follow-up letter Saturday to Parler's developers, Apple said it was still seeing unacceptable content on Parler.
"In your response, you referenced that Parler has been taking this content 'very seriously for weeks,'" Apple wrote. "However, the processes Parler has put in place to moderate or prevent the spread of dangerous and illegal content have proved insufficient. Specifically, we have continued to find direct threats of violence and calls to incite lawless action."
And an apparent plan put forward by Parler didn't satisfy Apple.
"Your response also references a moderation plan 'for the time being,' which does not meet the ongoing requirements" in the App Store's guidelines, Apple wrote. "While there is no perfect system to prevent all dangerous or hateful user content, apps are required to have robust content moderation plans in place to proactively and effectively address these issues. A temporary 'task force' is not a sufficient response given the widespread proliferation of harmful content."
Parler didn't respond to a request for comment on Apple's ban either.
In a Parler post on Friday, Matze challenged Apple's position and said Apple doesn't hold Twitter or Facebook to the same standard. "Apparently they believe Parler is responsible for ALL user generated content on Parler," he said. "By the same logic, Apple must be responsible for ALL actions taken by their phones. Every car bomb, every illegal cell phone conversation, every illegal crime committed on an iPhone, Apple must also be responsible for."
Apple didn't respond to a request for comment on Matze's remarks.
Content crackdown on social media
The biggest example of deplatforming happened Friday, when Twitter permanently suspended Trump's account "due to the risk of further incitement of violence."
After the insurrection at the Capitol, which led to deaths, vandalism and the evacuation of Congress -- not to mention the insult to a national and international symbol of democracy -- social media sites have been taking a harder stance against activity they see as dangerous. Facebook and Instagram blocked Trump from new posts "indefinitely." Reddit cut off The_Donald, a major right-wing discussion forum, and Twitter banned several high-profile accounts associated with the right-wing, bogus QAnon conspiracy theory.
In a Friday tweet, Rep. Alexandria Ocasio-Cortez, a prominent New York Democrat, called for Google and Apple to take action after reported calls for violence on Parler.
Parler's growing importance
Parler has grown in importance to right-wing activists as Twitter, Facebook and Instagram have put the kibosh on Trump's social media accounts after loyalists stormed the Capitol on Wednesday.
"Our investigation has found that Parler is not effectively moderating and removing content that encourages illegal activity and poses a serious risk to the health and safety of users in direct violation of your own terms of service," Apple reportedly told Parler on Friday, citing a handful of examples purportedly showing violent threats. "Content of this dangerous and harmful nature is not appropriate for the App Store. As you know from prior conversations with App Review, Apple requires apps with user generated content to effectively moderate to ensure objectionable, potentially harmful content is filtered out. Content that threatens the well being of others or is intended to incite violence or other lawless acts has never been acceptable on the App Store."