The challenge of the “knowledge problem” is just one example of a general truth: What we do and don’t know about the social (as opposed to the natural) world is not inherent in its nature, but is itself a function of social constructs. Much of what we can find out about companies, governments, or even one another, is governed by law. Laws of privacy, trade secrecy, the so-called Freedom of Information Act—all set limits to inquiry. They rule certain investigations out of the question before they can even begin. We need to ask: To whose benefit?
More benignly, perhaps, these companies influence the choices we make ourselves. Recommendation engines at Amazon and YouTube affect an automated familiarity, gently suggesting offerings they think we’ll like. But don’t discount the significance of that “perhaps.” The economic, political, and cultural agendas behind their suggestions are hard to unravel. As middlemen, they specialize in shifting alliances, sometimes advancing the interests of customers, sometimes suppliers: all to orchestrate an online world that maximizes their own profits.
So why does this all matter? It matters because authority is increasingly expressed algorithmically. Decisions that used to be based on human reflection are now made automatically. [...]
[...] In their race for the most profitable methods of mapping social reality, the data scientists of Silicon Valley and Wall Street tend to treat recommendations as purely technical problems. The values and prerogatives that the encoded rules enact are hidden within black boxes.
While neoliberals were vitiating the regulatory state’s ability to expose (or even understand) rapidly changing business practices,