I agree, but how do you combat propaganda from Putin? Do you match him dollar for dollar? I am sure YouTube would like that, but who has deep enough pockets to counter the disinformation campaigns?
Similar issue with Covid... when you are in the middle of a pandemic, and dead bodies are piling up, and hospitals are running out of room, how do you handle misinformation spreading on social media?
What are your suggestions for accomplishing this while also being compatible with the idea that government and big tech should not control ideas and speech?
> What are your suggestions for accomplishing this while also being compatible with the idea that government and big tech should not control ideas and speech?
Time delay. No content-based restrictions. Just, like, a 2- to 24-hour delay between when a post or comment is submitted and when it becomes visible, with the user free to delete or change their content (in which case the timer resets).
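A minimal sketch of that delayed-visibility idea, assuming a 2-hour delay and an edit that restarts the countdown (the `Post` class and `DELAY_SECONDS` are illustrative names, not any real platform API):

```python
import time

# Illustrative delay; the comment above suggests anywhere from 2 to 24 hours.
DELAY_SECONDS = 2 * 60 * 60

class Post:
    def __init__(self, body, now=None):
        self.body = body
        self.submitted_at = now if now is not None else time.time()
        self.deleted = False

    def edit(self, new_body, now=None):
        # Changing the content resets the visibility timer.
        self.body = new_body
        self.submitted_at = now if now is not None else time.time()

    def delete(self):
        self.deleted = True

    def is_visible(self, now=None):
        # Visible only once the full delay has elapsed since the last edit.
        now = now if now is not None else time.time()
        return (not self.deleted) and (now - self.submitted_at >= DELAY_SECONDS)
```

The point of the mechanism is that nothing is judged on content; the only lever is time.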
I’d also argue for demonetising political content, but idk if that would fly.
Ok, but how does that get implemented? Not technically, but who makes it happen and enforces the rules? For all content or just “political”? Who decides what’s “political”? Information about the disease behind a worldwide pandemic isn’t inherently “political”, but somehow it became so.
Who decides what falls in this bucket? The government? That seems to go against the idea that government shouldn't restrict speech and ideas.
Congress for the first. Either the FCC or, my preference, private litigants for the second. (Treble damages for stupid suits, though.)
> For all content or just “political”?
The courts can already distinguish political speech from non-political speech. But I don’t trust a regulator to.
I’d borrow from the French. All content within N weeks of an election in the jurisdiction. (I was going to also say any content that mentions an elected official by name, but then we’ll just get meme names, and nobody needs that.)
Bonus: electeds get constituent pressure to consolidate elections.
Alternative: these platforms already track trending topics, so an easy fix is to slow down trending topics. It doesn’t even need to be by much; what we want is for people to stop and think, to have a chance to reflect on what they do, maybe take a step away from their device while they're at it.
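One way to sketch "slow down trending topics" is to cap how fast a topic's displayed ranking score may rise, while letting it fall freely when engagement drops. Everything here (`damped_score`, `MAX_RISE_PER_HOUR`, the scoring units) is an assumption for illustration, not how any actual platform ranks trends:

```python
# Illustrative cap on how many score points a topic may gain per hour.
MAX_RISE_PER_HOUR = 10.0

def damped_score(prev_shown, raw_score, hours_elapsed):
    """Return the score to display for a trending topic.

    The shown score may rise no faster than MAX_RISE_PER_HOUR per hour
    above what was previously shown, but may drop instantly when the
    raw engagement score falls.
    """
    ceiling = prev_shown + MAX_RISE_PER_HOUR * hours_elapsed
    return min(raw_score, ceiling)
```

For example, a topic whose raw engagement spikes from 0 to 100 would surface gradually over several hours instead of instantly, which is exactly the pause-and-reflect window the comment argues for.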
May I suggest only repealing it for companies that generate more than a certain amount of revenue from advertising, or who have more than N users and have algorithmic content elevation?
If the government asks private companies to do that, then that's a violation of 1st amendment, isn't it?
This is the conundrum social media has created. In the past only the press, who were at least semi-responsible, had the ability to spread information on a massive scale. Social media changed that. Now anyone can spread information instantly on a massive scale, and often it is the conspiracy theories and incorrect information that people seek out.
"We were a bit naive: we thought the internet, with the availability of information, would make us all a lot more factual. The fact that people would seek out—kind of a niche of misinformation—we were a bit naive." -- Bill Gates to Oprah, on "AI and the Future of us".
> If the government asks private companies to do that, then that's a violation of 1st amendment, isn't it?
Yes. An unfortunate conclusion I’m approaching (but have not reached, and frankly don’t want to reach) is the First Amendment doesn’t work in a country that’s increasingly illiterate and addicted to ad-powered algorithmic social media.
Voltaire was right when he said "Those who can make you believe absurdities can make you commit atrocities." And social media lets authoritarians push absurdities to millions of people.
Freedom of speech is not freedom of reach. Sadly, there will always be racists, misogynists, anti-Semites, and child abusers. We should not be giving bigots and pedophiles a free platform to amplify their views and target their victims.
Zuckerberg says people should decide what's credible, not tech companies. When two-thirds of millennials have not heard of Auschwitz, how are they supposed to know what's true? There is such a thing as objective truth. Facts do exist.
Social media platforms in the United States rely heavily on Section 230 of the Communications Decency Act, which provides them immunity from liability for most user-generated content.
This would cause widespread censorship of anything remotely controversial, including the truth. We'd be in a "censor first, ask questions later" society. Somehow that doesn't seem healthy either.
Have you visited nytimes.com in recent months? Just this morning the top headline was about the lies Trump told at the UN. That's pretty controversial: the newspaper of record calling the sitting president a liar. That's not allowed in many or most countries, but it is allowed in the US. And Trump is suing the New York Times for $15 billion for defamation. That didn't intimidate the NYT. They are willing to stand behind the articles they publish. If you can't stand behind what you publish, don't publish it.