Amazon Web Services is suspending Parler’s access to its hosting services at the end of the weekend, potentially driving the service offline unless it can find a new provider.
“Because Parler cannot comply with our terms of service and poses a very real risk to public safety, we plan to suspend Parler’s account effective Sunday, January 10th, at 11:59PM PST,” Amazon wrote to Parler in an email obtained and first reported by BuzzFeed.
The email from AWS to Parler cites several examples of violent and threatening posts made in recent days, including threats to “systematically assassinate liberal leaders, liberal activists, BLM leaders and supporters,” and others. “Given the unfortunate events that transpired this past week in Washington, D.C., there is serious risk that this type of content will further incite violence,” the message adds.
Parler launched in 2018 as a “free speech” alternative to Twitter and Facebook. Through 2019 and 2020, it drew a number of conservative, right-wing, and far-right fringe users. Usage has dramatically increased in the past few days in the wake of Wednesday’s events at the US Capitol and President Donald Trump’s subsequent total ban from Twitter and other platforms.
That increased traffic has also brought increased threats of violence to the platform, which technology companies across the board seem to be taking more seriously after this week, and no wonder, as the insurrectionists who attacked the Capitol made widespread use of social media to plan, carry out, and brag about their activity.
Parler, however, has not articulated a clear plan for dealing with violent threats on its platform. As Amazon wrote:
It’s clear that Parler does not have an effective process to comply with the AWS terms of service. It also seems that Parler is still trying to determine its position on content moderation. You remove some violent content when contacted by us or others, but not always with urgency. Your CEO recently stated publicly that he doesn’t “feel responsible for any of this, and neither should the platform.” This morning, you shared that you have a plan to more proactively moderate violent content, but plan to do so manually with volunteers. It’s our view that this nascent plan to use volunteers to promptly identify and remove dangerous content will not work in light of the rapidly growing number of violent posts.
Apple also removed Parler from its iOS App Store earlier today, citing similar concerns.
“Parler has not upheld its commitment to moderate and remove harmful or dangerous content encouraging violence and illegal activity, and is not in compliance with the App Store Review Guidelines,” Apple wrote. “Your app will be removed from the App Store until we receive an update that is compliant with the App Store Review Guidelines and you have demonstrated your ability to effectively moderate and filter the dangerous and harmful content on your service.”
Google already booted Parler from its app store on Friday, also citing the prevalence of explicitly violent content left up on the platform.