    YouTube Demonetization of Extremist Content
    • What do you guys think about the VoxAdpocalypse? Is YouTube right to demonetize people for having an opinion it doesn't agree with? Actually, no, I wouldn't even go that far; it's really journalists pressuring ad providers to in turn pressure YouTube. This is a new era for these online platforms. I think it's important for opinions to be heard from all sides, so we can decide on our own who is dumb and who isn't, as long as no laws are broken in the process.

      Then again, I don't think people should take YouTube revenue for granted; in the end, it is a private business. At the same time, it IS recognized as the modern public square and a platform. I would like it to be a fully open platform, but it ain't. Nor is Facebook or Twitter. I suppose you could argue that in YouTube's case the content is still there, just demonetized, but without that incentive, the content will inevitably be eclipsed.
      "Can't post that on a Christian forum."


    • Crowder wasn't being called out (and subsequently demonetized) for having opinions people disagree with. Unless repeatedly using homophobic and racist language against Carlos Maza can be construed as an opinion rather than harassing hate speech that incited his followers to go after Maza on social media and even dox him.

      By their own stated policies on harassment, Crowder's offending videos should be removed by YouTube, not just demonetized.
    • Galedeep wrote:

      Crowder wasn't being called out (and subsequently demonetized) for having opinions people disagree with. Unless repeatedly using homophobic and racist language against Carlos Maza can be construed as an opinion rather than harassing hate speech that incited his followers to go after Maza on social media and even dox him.

      By their own stated policies on harassment, Crowder's offending videos should be removed by YouTube, not just demonetized.
      I'm not talking about Crowder specifically. I don't really know him, but my understanding is that he was rightfully demonetized, and YouTube even said he will be remonetized if he takes down the hateful content. There are other YouTubers who didn't do anything wrong and got caught up in this shitstorm.
    • Crowder should be removed from YouTube completely, as he has repeatedly violated YouTube's own terms of service, as Galedeep mentioned. Also, the demonetization isn't as impactful on him as it is on other YouTubers, because he makes a lot of money off apparel and other merchandise (which he advertises on his channel).

      When you have rules, you should enforce them. It would be one thing if YouTube let a handful of these cases slip through the cracks, as no one is perfect, but they have a clear pattern of allowing objectionable content that breaks their own rules. This behavior gets reported constantly across social media, and usually you get a response like "thanks for reporting, but we found this doesn't violate our terms of service", when that's so obviously not the case.

      The primary reason these social media giants aren't more proactive in enforcing their terms is that they're afraid of the backlash from conservatives/right-wingers, who threaten physical and legal consequences to strong-arm online platforms into harboring hate speech, in violation of the platforms' freedom of association.




    • Please Understand wrote:

      There are other YouTubers who didn't do anything wrong and got caught up in this shitstorm.
      My understanding (which isn't comprehensive) is that the thing with Crowder is pretty specific in its scope (though obviously demonstrative of a wider, systemic issue with YouTube). I haven't seen any other creators or channels that have faced any sort of repercussions as a result...

      It did happen around the same time that YouTube finally took steps to remove "hateful and supremacist content" from its platform...but anyone who got caught up in that also likely deserved to be deplatformed, since their opinions were, you know, hateful and supremacist. And it's unrelated to the Crowder stuff.
    • You might be right about Crowder, but the timing is at least interesting, because it happened as part of the same wave of demonetizations.

      Galedeep wrote:

      It did happen around the same time that YouTube finally took steps to remove "hateful and supremacist content" from its platform...but anyone who got caught up in that also likely deserved to be deplatformed, since their opinions were, you know, hateful and supremacist. And it's unrelated to the Crowder stuff.
      This, however, is a massive assumption. Plenty of people who are neither hateful nor supremacist got caught up in it because of the changes to YouTube's algorithms for detecting such content. You should check out YouTube's Twitter; the changes affected a ton of channels. But yes, it might be unrelated to the Crowder stuff... well, not completely, as it's still in the same category, kinda, but at least that's a separate, hand-picked incident.


      ich Will wrote:

      Which ones?
      Ford Fischer, for one.


    • Kerest wrote:

      Ford Fischer was a primary target of the crackdown. Our records show he was on the list of names specifically asked to be deplatformed when we heard Google was planning to do so. Threat assessment algorithms determined he was a secondary spreader of extremist views through his coverage of those views and how some people reacted to them.
      Your records?

      I think covering and documenting extremist behaviour (on both sides) is important. Suppressing this content will only turn them into martyrs and push more people towards them. Besides, it matters in which light you cover the material. Just discussing something is never hateful by itself, even if the content can be used in a hateful manner or is itself hateful.
    • Please Understand wrote:

      I think covering and documenting extremist behaviour (on both sides) is important. Suppressing this content will only turn them into martyrs and push more people towards them. Besides, it matters in which light you cover the material. Just discussing something is never hateful by itself, even if the content can be used in a hateful manner or is itself hateful.
      It depends on your goal.

      If your goal is simply to document for historical purposes, then coverage of both sides is necessary.

      If, however, your goal is to remove a movement that has become a massive societal problem, then you need to cut off all sources of that movement's propaganda for a prolonged period of time. This does, unfortunately, mean that people who are acting with good intentions but are unintentionally sources of spread become necessary collateral damage.

      So, which is more important to you: Documenting extremist behavior, or doing away with the extremist movement?
    • Kerest wrote:

      So, which is more important to you: Documenting extremist behavior, or doing away with the extremist movement?
      Demonetizing the documentation and discussion of extremist movements is not going to make them go away, though. The content is not deleted, for one, and even if it were, it still wouldn't be doing society a service. You might argue that it does, but I would argue otherwise. I don't think deplatforming hate is going to make the hate go away. If anything, it can have the opposite effect: it can polarize people even more, us versus them.

      And, in fact, what I think is going to happen is that this whole Big Tech banning/deplatforming/silencing of people is going to bring about government intervention.


    • Please Understand wrote:

      Kerest wrote:

      So, which is more important to you: Documenting extremist behavior, or doing away with the extremist movement?
      Demonetizing the documentation and discussion of extremist movements is not going to make them go away, though. The content is not deleted, for one, and even if it were, it still wouldn't be doing society a service. You might argue that it does, but I would argue otherwise. I don't think deplatforming hate is going to make the hate go away. If anything, it can have the opposite effect: it can polarize people even more, us versus them.
      And, in fact, what I think is going to happen is that this whole Big Tech banning/deplatforming/silencing of people is going to bring about government intervention.
      It depends on how long you do it for and how long they are isolated.

      See, the problem with a lot of modern social justice movements is that they have no sense of scale or time. A hate movement isn't going to go away just because it was demonized in the 1960s. You have to cut it off, isolate it, and starve it for at least a couple of centuries. Give the movement time to starve across multiple generations. That doesn't mean just demonetizing them now; it means demonetizing them for the next four or five generations.

      Of course, that's the slow way. No one is comfortable with the fast way, and for good reason.

      As for government intervention, I have two questions for you:

      1. What makes you think various parts of the U.S. government are not already involved?

      2. What makes you think government intervention, with the hope that the government will not permanently codify it, isn't the end goal?
    • Kerest wrote:

      1. What makes you think various parts of the U.S. government are not already involved?

      2. What makes you think government intervention, with the hope that the government will not permanently codify it, isn't the end goal?
      I'm talking about government intervention on the platforms themselves. The platforms are the modern public square, and they are suppressing individuals' right to free speech based on untransparent criteria. I think it's naive to believe this will continue forever in the US, as long as the First Amendment remains as it is.
    • How many times do people have to explain this?

      YouTube is a private entity. It is not breaking A N Y L A W S by restricting who has the ability to make and monetize their content. They could announce tomorrow that they will only allow content that espouses the belief that pizza is an overrated food, and it would be perfectly okay because they aren't breaking any laws by restricting what is allowed on their platform.

      All together now:

      THEY AREN'T BREAKING ANY LAWS.
    • @Please Understand

      What's untransparent about their criteria? Anyone can look up YouTube's or Facebook's terms of service for themselves; it's not like they're secret.

      If the US government were to eventually compel social media sites to host extremist content I think you'd see a lot of sites migrate more of their operations to other jurisdictions, like the EU. And where do you draw the line on being in the "modern public square"? If this were to be applied to ZU I'm sure the servers would be migrated out of the US. It would be a massive breach of freedom of association under the 1st amendment.




    • Viajero de la Galaxia wrote:

      What's untransparent about their criteria?
      What they consider hateful content. It is not obvious at all.


      Viajero de la Galaxia wrote:

      If the US government were to eventually compel social media sites to host extremist content I think you'd see a lot of sites migrate more of their operations to other jurisdictions, like the EU.
      It's not about compelling them to host extremist content; it's about allowing people to discuss subjects that are not necessarily pleasant. First off, it doesn't matter if the sites move to Europe: they will still need to comply with US laws if they want to operate in the US. Secondly, Europe is not happy with the big tech companies either; if anything, it is much less so than the US, perhaps in different ways, but big tech getting bigger is, I think, seen as a global enemy.