
The role of these systems is not just to reproduce inequalities, but to naturalize them. Capitalist difference-making has always required a substantial amount of ideological labor to sustain it. For hundreds of years, philosophers and priests and scientists and statesmen have had to keep saying, over and over, that some people really are less human than others — that some people deserve to have their land taken, or their freedom, or their bodies ruled over or used up, or their lives or labor devalued. These ideas do not sprout and spread spontaneously. They must be very deliberately transmitted over time and space, across generations and continents. They must be taught in schools and churches, embodied in laws and practices, enforced in the home and on the street.

It takes a lot of work. Machine learning systems help automate that work. They leverage the supposed authority and neutrality of computers to make the differences generated by capitalism look like differences generated by nature. Because a computer is saying that Black people commit more crime or that women can’t be software engineers, it must be true. To paraphrase one right-wing commentator, algorithms are just math, and math can’t be racist. Thus machine learning comes to automate not only the production of inequality but its rationalization.

—p.97, From Manchester to Barcelona (83), by Ben Tarnoff