Deleted member 365
We already regulate free speech. Who decides what is misinformation and what isn't? Who decides what content is dangerous to mental health and what isn't?
Ponder that for a second, and hopefully you see how terrifying that can quickly become. The First Amendment is a godsend, and yes, you can point at problems created by freedom of speech, but the alternative is far worse, and history is littered with examples of why.
- Can you yell "fire" in a theater?
- Have you ever tried hosting a concert in your neighborhood at 3:00 am?
- We have laws against revenge porn.
- Do you find restrictions against child porn to infringe on your free speech?
Now, let's focus for a bit on social media. It's no secret that social media has dethroned the printed word as the primary source of news and information for most Americans. Elon Musk and Mark Zuckerberg have claimed that their social media platforms are the new "town square." Yet social media platforms have resisted the same regulations and restrictions that held print journalism accountable. The new "town square" has become a sewer of disinformation, hate communities, and other harmful products detrimental to our society. Listen, if social media wants all the power that print journalism had 50 years ago, then it needs to abide by certain standards. Right now, it abides by none, and it's hurting our society. Observe:
1. American Enemies Exploit Social Media: Just recently, social media was used once again by Russia to promote the Kremlin's talking points. With a few million bucks, it paid several right-wing propagandists to derail the American-led effort to aid Ukraine. This is a national security risk. We know from both the Mueller Report and the Republican-led Senate report in 2020 that America's greatest adversaries, China, Iran, North Korea, and Russia, are exploiting social media to divide the West. We need to raise our defenses here.
2. Social Media Hurts Mental Health: Social media companies know that their algorithms hurt people, especially young people. Because those algorithms prioritize keeping users on the platform rather than accurately informing them, the companies have no incentive to protect mental health. Like the tobacco companies of the 1990s, they're fighting regulation so they can continue to exploit young people, even if it kills them. Why are we tolerating this?
3. Disinformation Leads to Death: We are currently seeing how Facebook has inspired conspiracies against Haitians living in Ohio. We have seen it spread disinformation about elections, disease and vaccines, and even genocide. A few years back, it helped fuel the genocide against the Rohingya in Myanmar. Quite honestly, if Facebook knew it could be held accountable for the hate and death that it inspires, it would be compelled to change, or at the very least to do away with its algorithms. If these companies want to be the "new town square," then they need to meet some sort of standard. We also saw, during the Trump administration, people who innocently liked a community or post about "reopening schools" and then, a few clicks later, were led by the algorithm to extremist communities and right-wing hate groups. This isn't how a democracy is sustained. I recommend reading this article. It's alarming.
4. Regulation of Toxic Social Media for Better Outcomes is a Net Good for Society: Having standards, improving transparency, and making social media healthier isn't a bad thing. I liken this to the formation of the FDA roughly a century ago. Back then, food producers didn't need to provide expiration dates for their products or accurately label their drugs. As with social media today, there was a clear need, and there was pushback from "free speech absolutists" who felt that the government forcing food and drug producers to meet certain standards was a step too far. Over time, the FDA has proven its worth. Protecting consumers from rancid food, inaccurately labeled drugs, and medicines tainted with glass shards has saved countless lives. I believe the same must be done with social media companies.
As usual, I'll provide a few ideas supported by experts to demonstrate the seriousness of this topic. Here are a few ideas to help make social media more accountable and responsible to make it safer for Americans:
1. Transparency: If social media companies continue to use their algorithms, then they must be transparent about why they show what they show in a user's newsfeed.
2. Create a New Non-Partisan Agency: Much like the FDA, create a new government agency to establish rules and regulate social media companies. This agency could team up with fact-checkers to ensure that social media platforms aren't merely being used by malicious foreign powers or bad faith actors.
3. Break Up Monopolies: Currently, Facebook connects over 2 billion people to its network. So a disinformation campaign or a hate community will find plenty of consumers, especially if a few clicks of the algorithm connect innocent people to these groups. There comes a point where one company is just too powerful. Break it up. Encourage competition so that the free market can thrive.
4. Enable Social Media Companies to Be Held Liable: If the disinformation that social media companies refuse to take down or correct leads to harm, they need to be held liable just as any newspaper would be. They shouldn't be given special privileges.
Lastly, this was a time-consuming post. This isn't directed at any particular user, but if people want to join this conversation, I ask that you refrain from the usual ad hominem attacks and silly posts. Such posts will be ignored and I will block you. Frankly, I just don't have the time to waste on stupid people on here. So if you want to have a serious conversation about this topic, back your opinion up with some evidence. Otherwise, don't post. No one is forcing you to reply to anything I post. This is your only warning.
References:
https://podcasts.apple.com/us/podcast/al-franken-madiba-dennie-steven-brill/id1645614328?i=1000657366955
We Can Regulate Social Media Without Censorship. Here's How (RAND): https://www.rand.org/pubs/commentary/2023/10/protecting-free-speech-compels-some-form-of-social.html
I think this is worth a watch: https://www.ted.com/talks/eli_pariser_what_obligation_do_social_media_platforms_have_to_the_greater_good?subtitle=en