If Trump uses Facebook to incite violence and someone dies because of it, is FB complicit?

May 30, 2020 26 Comments

It's a bit of a thought experiment. In the extreme case, take ISIS: they used social media platforms for recruitment. Twitter put forth its best effort to shut down this behavior. But consider an alternate reality: if Twitter had done nothing at all despite knowing it was a problem, I would have said they were complicit, or at least should have been held partially liable.

In today's world, where Trump is using social media as a platform to spread misinformation (information that is by all interpretations literally untrue) and incite violence, and FB's stance is that they won't do anything about it, aren't they at least morally, partially responsible for what can transpire?

And, for all of those working at FB and not speaking up, aren't they the "moderate majority" who are also partially complicit and morally responsible?

#floyd #fb #twitter


TOP 26 Comments
  • AZEK (aIPc56)
    FB is an easy target for everyone at this point. Fox News' nightly segments across decades have been far more impactful in shaping viewpoints than Twitter and FB will ever be. Without Twitter and FB, we wouldn't know the extent to which racism is prevalent in our immediate social circles.
    May 30, 2020 3
    • AZEK (aIPc56)
      I believe that FB realizes this opens a whole new can of worms, hence their policy to limit content only for direct misinformation and direct threats of violence. It would be almost impossible to police and monitor posts across languages for implied misinformation and implied threats of violence. If a person with 50 million followers posts misinformation, should that be treated differently from someone with 10 followers? They would need tens of thousands of humans manually vetting posts across the platform, and the vetting is only as good as the internal biases of the vetters.
      May 30, 2020
    • OP
      Yeah. I acknowledge that it's tough for sure. There might be no solution here.

      Social media is so incredibly powerful. It can both oppress and liberate people. It can even influence the fate of countries through elections. I would say it's the nuclear-warhead equivalent of the modern era, and it's crazy to think that the keys to this tool are in the hands of private companies... although having that tool controlled by governments is actually way scarier.
      May 30, 2020
  • If a drunk driver runs someone over, does the car stand trial?
    May 30, 2020 7
    • OP
      Well how about this.

      Let's say the F-150 can be modified in such a way that you can attach a heavy machine gun. The heavy machine gun can be used to kill a lot of people.

      I believe this actually happened: ISIS was using Toyota trucks and literally mounted machine guns on them.

      If Toyota knew that their trucks could be easily modified into weapons of war, or that their supply chains could be easily compromised to steal their trucks, and they did not at least try to do something about it, I think they are partially morally complicit, yeah.

      I believe the end result is that Toyota complied with a US inquiry and put in effort to stop ISIS from getting these trucks.
      May 30, 2020
    • Oracle (alwzangry)
      Then I suppose you're to blame too, because you exist in the same universe.
      May 30, 2020
  • VMware (fqAa27)
    Just to be clear, this is not really a "thought experiment"; it has already happened. The El Paso shooter in 2019 killed 23 people in Texas and left a manifesto saying he was targeting Hispanic people because they were "invading his country"... and he more or less quoted the Trump posts/tweets he liked verbatim. https://slate.com/news-and-politics/2019/08/el-paso-suspect-shooter-trump-racist-manifesto.html

    It will surely happen many more times, especially if Trump is reelected.
    May 30, 2020 0
  • Uber (Embe)
    It's not the platform's fault that something is said; it's the person's. If someone dies by suicide because another person bullied them online, is it Facebook's fault or the fault of the person who posted?
    May 30, 2020 2
    • OP
      I think this argument works when the tool is completely agnostic and benign, but that's not the case in the world today.

      Today, Twitter is used a lot to dox people, which leads to bullying and sometimes even legitimate threats against life. Let's say that in this alternate reality, Twitter was just 80% doxing and bullying, causing a mass increase in suicides.

      For the sake of argument, let's say that Twitter was responsible for at least thousands of suicides per year, in the sense that without Twitter there would not have been a platform for bullies to openly dox an individual. Is this behavior/technology OK? Should it be culled? Or, in your view, because it is just a platform, should it exist as it is?
      May 30, 2020
    • VMware (fqAa27)
      It depends on the legal liability... e.g., if terrorists are publicly (or even privately) recruiting on the platform, Facebook can't say "it's not our fault, it's the person's fault, we are just a platform"; they must take action to shut it down or risk problems.

      Bullying is of course a more subtle case, but I'm sure that in extreme cases there could be liability for the platform if no action were taken.
      May 30, 2020
  • Oracle (ImInsideMe)
    I think of social networks that host celebrities and brands as billboard companies: owners of display space. If someone publishes something inciting on a billboard, is the billboard owner complicit?
    May 30, 2020 1
    • VMware (fqAa27)
      Well, it depends on what they publish. If a billboard owner accepts and publishes an ISIS recruiting poster, for example, do you think that would be fine? Would it be legal? Would they be complicit in anything?

      This is an extreme case, but it's why social media companies even have content policies to begin with. It's not "censorship".
      May 30, 2020