If Trump uses Facebook to incite violence and someone dies because of it, is FB complicit?
It's a bit of a thought experiment. In the extreme case, take ISIS. They were using social media platforms for recruitment, and Twitter put forth its best effort to shut down this behavior. But consider an alternate reality: if Twitter had done nothing at all despite knowing it was a problem, I would have said they were complicit, or at least should have been held partially liable.
In today's world, where Trump is using social media as a platform to spread misinformation (information that is, by any interpretation, literally untrue) and incite violence, and FB's stance is that it won't do anything about it, isn't FB, at least morally, partially responsible for what can transpire?
And, for all of those working at FB and not speaking up, aren't they the "moderate majority" who are also partially complicit and morally responsible?
#floyd #fb #twitter
comments
Social media is incredibly powerful. It can both oppress and liberate people, and it can even influence the fate of countries through elections. I'd call it the nuclear warhead of the modern era, and it's crazy to think that the keys to that tool are in the hands of private companies... although having that tool controlled by governments is actually far scarier.
Let's say the F-150 can be modified in such a way that you can attach a heavy machine gun. The heavy machine gun can be used to kill a lot of people.
I believe this actually happened: ISIS was using Toyota trucks and literally mounted machine guns on them.
If Toyota knew that their trucks could be easily modified into weapons of war, or that their supply chains could be easily compromised to steal their trucks, and they did not at least try to do something about it, then yeah, I think they would be partially morally complicit.
I believe the end result was that Toyota complied with a US inquiry and put in effort to stop ISIS from getting these trucks.
It will surely happen many more times, especially if Trump is reelected.
Today, Twitter is used a lot to dox people, which leads to bullying and sometimes even legitimate threats against lives. Let's say that in an alternate reality, Twitter was 80% doxing and bullying, causing a massive increase in suicides.
For the sake of argument, let's say Twitter was responsible for thousands of suicides per year, in the sense that, without Twitter, there would have been no platform for bullies to openly dox an individual. Is this behavior / technology okay? Should it be culled? Or in your view, because it is just a platform, should it exist as it is?
Bullying is of course a subtler case, but I'm sure that in extreme cases the platform might face liability if it took no action.
This is an extreme case, but it's why social media companies have content policies to begin with. It's not "censorship."