Stripping the altars, burning the heretics and cleansing the stables are the usual fare of a morally crazed order. The engaged agents think they have found the reason for existence and need to bother everybody else about it. Some of this can be a very dangerous thing indeed – at least historically.
Those who claim to know the truth are the very sort who are happy to fill the morgues, ban the theatres and destroy musical instruments. But when it comes to matters of social media, we are dealing with more mediocre fare. Currently, there is a spurting, childish wonder at the moves by YouTube to excise, cut and move the stuff that might be considered naughty, offensive, indoctrinating and what not.
A burning issue centres on weeding out white supremacist content, merely another part of the recent surge against what might be described as extremist content (these terms remain infuriatingly vague). The Christchurch pledge, an understanding reached by heads of state and Silicon Valley tech giants last month to target such unsavoury content, proved catalytic.
On June 5, YouTube announced that it was “specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, caste, religion, sexual orientation or veteran status.” Videos promoting or glorifying Nazi ideology are furnished as examples that will be removed including content “denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place.”
An email from YouTube acknowledged that this would come with its problems:
“We know this might be disappointing, but it’s important to us that YouTube is a safe place for all. If content breaks our rules, we remove it.”
This measure has already led to the removal or hiding of videos that document hate speech and activities for journalistic and educational purposes.
Scott Allsop, a history teacher based in Romania, had his channel banned for hosting archival footage featuring Nazi propaganda, including newsreel footage of Hitler’s speeches.
Organisations with anti-racist platforms, such as the One People’s Project, have also faced the removal of information videos designed to discuss and combat racism. Never say that makers of digital platforms cannot be ironic.
YouTube did acknowledge in these changes that “some of this content has value to researchers and NGOs looking to understand hate in order to combat it”.
This, as ever, remains the problem when a platform dictates the content of the conversation and what is permitted to circulate in it.
Heidi Beirich, director of the Southern Poverty Law Center’s Intelligence Project, mentions the obvious technical problem:
“It indicates that [YouTube] have not refined well enough the difference between someone who is exploring issues of racism and hatred and someone who’s promoting it.”
The censoring mentality is dangerous not merely to content, but to attitude. It encourages contrived fragility, emboldening the aggrieved to strike back at the disagreeable and unpleasant, not in terms of ideas but in terms of neutralisation.
Carlos Maza of Vox, for instance, is making a case that the platform should be rigorously structured and censored to avoid offence on matters of sexual orientation, race and gender. He naturally claims to have “pretty thick skin when it comes to online harassment” but proceeds to declare himself so bothered that he needs YouTube to take action.
Such skin, it seems, wears thin in battle, peeling off before the silly, the absurd and even the obscene. “Since I started working at Vox,” tweeted Maza, “Steven Crowder has been making video after video ‘debunking’ Strikethrough. Every single video has included repeated, overt attacks on my sexual orientation and ethnicity.”
Crowder is not to be encouraged, but nor should Maza in his thin-skinned, boiled-down courage. Both should be allowed to wallow in the necessary shallowness that YouTube permits, a sort of drain-level behaviour that passes for human interaction on the internet.
To suggest a moral high point is to arrogate higher civilizational (dare one use the word these days?) properties to a quibble, a scatty skirmish.
More problematically, any action removing discomforting content has a broader implication, stifling conversations and discourse that might be seen, subjectively, to be offensive, an essential feature of much human intercourse. The acquisition of genuine knowledge is rarely a comfortable, let alone safe thing.
YouTube was never going to come through with a clear line, though it decided to demonetise Crowder after initially concluding that the offensive videos, while “clearly hurtful”, did not “violate our policies.” Words desperately seeking a spine were mentioned. “Even if a video remains on our site, it doesn’t mean we endorse/support that viewpoint.”
The company’s public relations team is no doubt going through the sleepless routine of placating and pacification. “Today,” the company tweeted on June 5, “has generated a lot of questions and confusion. We know it hasn’t been easy for everyone. Going forward, we’ll be taking a closer look at our own harassment policies, with the aim to update them.”
The point of taking aim at Crowder is a quasi-judicial assessment of his case. No laws were broken, nor rules infringed except a certain understanding of community guidelines, liberally interpreted under pressure. YouTube was keen to give a digital surgeon’s answer to the whole Maza-Crowder mess, one confused by what to do with its scalpel: “Thanks again for taking the time to share all of this information with us. We take allegations of harassment very seriously – we know this is important and impacts a lot of people.”
Identifying the issues behind this cringeworthy spat is not difficult. Grievance will have its day, and Maza is working the identity system like a gibbering pro. Glenn Greenwald of The Intercept, who has a good line in understanding what it is like being harassed and mocked on matters of sexual orientation and politics, suggested that it would never occur to him to “run to social media companies to beg for censorship.”
To Tucker Carlson, he put forth his vision: “I don’t want to live in a world where our discourse is policed and determined by benevolent overlords, who run Silicon Valley companies, you know, who are always going to cater to the most powerful faction.”
Perhaps we are already too late.
Maza is not looking beyond his personal issue, which has, as with others in the offended business, a way of magnifying. The aggrieved want vengeance, seeing a world in a tweet and eternity in a video.
“YouTube rewards engaging content. Hate speech is engaging. So YouTube rewards hate speech.”
The implication in this silly summation by Maza is obvious: avoid the platform or badger the platform into heeding his interests.
Over to you, YouTube.
Originally published on CounterPunch
Binoy Kampmark was a Commonwealth Scholar at Selwyn College, Cambridge. He lectures at RMIT University, Melbourne. Email: [email protected]