Entrusting our speech to multiple different corporate actors is always risky. Yet given how most of the internet is currently structured, our online expression largely depends on a set of private companies ranging from our direct Internet service providers and platforms, to upstream ISPs (sometimes...
The problem is that your definitions are incredibly vague.
What is a “platform” and what is a “host”?
A host, in the technical sense, could mean a hosting company that you "host" a website with. If it's a private website, how would the hosting company moderate that content?
And that's setting aside the legality and ethics of one private company policing another private company, let alone one that's its own client.
Fair point about hosts. I'm talking about platforms as if we held them to the standards we hold publishers to. Publishing is protected speech so long as it isn't defamatory, and the only reason we don't hold social media platforms to that kind of standard is that they demanded (and received) blanket immunity for what their users post. That seemed a reasonable choice to let social media survive as a new form of online media, but the result is that for-profit social media, as the de facto public square, have all the influence they want over speech and no responsibility to use that influence in ways that aren't corrosive to democracy or the public interest.
Big social media platforms already censor content they don't like; I'm not calling for censorship in an environment that has none. What I'm calling for is some sort of accountability that nudges them toward not looking the other way when offshore troll farms and botnets spread division and disinformation.