[...] Section 230 of the Communications Decency Act, part of the 1996 Telecommunications Act, is a piece of deregulation that says a platform isn't liable for the content its users post.
I'd like to see it amended in one specific way. There are a lot of people posting content for which you can't necessarily make the platform responsible. But if the platforms were to make the curatorial choice to promote that content in a recommendation engine, and it reached a certain number of people, they would then have to be responsible for it. I think that would have the effect of curtailing, for example, YouTube's pushing of conspiracy theories by placing certain videos in their 'top trending' boxes. [...]
it's an algorithm, sure, but it's an algorithm with at least some human curation, and it could probably have more. have simple guidelines: don't promote content that could be construed as hate speech or something similar under those guidelines. it might not be perfect but it would sure as hell be better than what we have now
and of course the human curators should be paid very well, given real job security, authority, and clear guidelines, and they should report to someone high up
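The mechanism described above can be sketched in a few lines. This is a toy illustration, not a real moderation system: all names (`Post`, `may_promote`, `platform_responsible`) and the reach cutoff are hypothetical, and "curator_approved" stands in for the human review step the notes call for.

```python
from dataclasses import dataclass

REACH_THRESHOLD = 10_000  # illustrative cutoff, not taken from any statute or platform


@dataclass
class Post:
    post_id: str
    curator_approved: bool = False  # set by a human reviewer applying the guidelines
    promoted_reach: int = 0         # views delivered via the recommendation engine


def may_promote(post: Post) -> bool:
    """A post enters the recommendation engine only after human curation."""
    return post.curator_approved


def platform_responsible(post: Post) -> bool:
    """Liability attaches once the platform's own promotion drove enough reach."""
    return post.promoted_reach >= REACH_THRESHOLD
```

The point of splitting the two checks is the amendment's logic: hosting content stays protected, but the curatorial act of promoting it past a threshold is what triggers responsibility.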
something to think about: the dream is to get rid of human action entirely, but maybe that's a dumb dream, or at least an impractical one. human curation will be needed for as long as human society, culture, and interaction can't be described or generated by a finite algorithm