If what we encounter on Facebook, OkCupid, and other online platforms is generally “safe for work,” it is not because algorithms have sorted through the mess and hidden some of it from view. Rather, we take non-nauseating dips in the digital stream thanks to the labor of real-live human beings who sit before their own screens day and night, tagging content as vulgar, violent, and offensive. According to Chen, more people work in the shadow mines of content moderation than are officially employed by Facebook or Google. Fauxtomatons make the internet a habitable place, cleaning virtual public squares of the sort of trash that would chase most of us offline and into the relative safety of face-to-face interaction.

Today many, though not all, of the people employed as content moderators live abroad, in places like the Philippines or India, where wages are comparatively low. The darkest tasks that sustain our digital world are outsourced to poor people living in poorer nations, from the environmentally destructive mining of precious minerals and the disposal of toxic electronic waste to the psychologically damaging effects of content moderation. As with all labor relations, race, gender, and geography play a role, determining which workers receive fair compensation for their labor or are even deemed real workers worthy of a wage at all. Automation, whether real or fake, hasn’t undone these disturbing dynamics, and may well intensify them.

—p.158, The Automation Charade by Astra Taylor