In part, the motivation for this book comes from observing the ahistorical nature of discussions about technology. This has, at best, led to a benign yet thoughtless form of technological optimism. “When you give everyone a voice and give people power, the system usually ends up in a really good place,” declared Mark Zuckerberg back in the early days of Facebook, with an impressive combination of naiveté and disingenuousness. At worst, and dismayingly, this sees revolutionary moments recast as cultural shifts generated by disruptive thought leaders: history understood as the march of great entrepreneurial CEOs. This kind of thinking sees the future as defined by universal progress—rather than by a messy, contradictory struggle between different interests and forces—and never driven by the aspirations of those from below. It reduces the value of human agency to entrepreneurialism and empty consumerism.
History has a role in telling us about the present, but not if we use a frame that valorizes those who currently hold positions of power. We need to reclaim the present as a cause of a different future, using history as our guide.
[...] As the planet slides further toward a potential future of catastrophic climate change, and as society glorifies billionaires while billions languish in poverty, digital technology could be a tool for arresting capitalism’s death drive and radically transforming the prospects of humanity. But this requires that we politically organize to demand something different.
These methodologies for predicting and shaping our behavior have grown more sophisticated over the first two decades of the twenty-first century. Collection and analysis of big data about people is a well-established industry. It includes the companies collecting data (miners), those trading it (brokers) and those using it to generate advertising messages (marketers), often with overlap between all three. It can be hard to obtain reliable estimates of the size of the industry, given its complexity, but one study says that by 2012 it was worth around $156 billion in the United States and accounted for 675,000 jobs. It has undoubtedly grown since then. Like slum landlords who rent out dilapidated apartments, or greedy hotshot developers who take advantage of legal loopholes to build luxury condos, companies that trade in personal data represent the sleazy side of how digital technology is impacting the real estate of our minds. This industry uses the faux luxuries of choice and convenience to entice us to part with our data, but often what they are really selling is overpriced and dodgy.
interesting analogy
The most valuable consumer platforms have both the capacity to collect highly valuable personal data and the opportunity to use it to market to users at the most lucrative moments of their daily lives. These are the places in which the invisible hand of what I call technology capitalism is at work—between data miners and advertisers, with data on users as the commodity being traded.
the fckin gateways!!
There is much still to be won and lost in the battle for our online autonomy in the future. As the next generation of web technology improves the integration of all our digital activities, allowing machines to organize even more of our lives, others will continue to learn more about us than we even know ourselves. In this context, focusing on our power over this process as consumers is a mistake: the power being exercised over us is precisely based on our being socialized as consumers. This prepares us to accept a city where every park is paved over to build freeways and every sports field and roller-skating rink is demolished to build a shopping mall. Such a city would not be a functional, let alone enjoyable, place to live. But it would be a place where data traders and retailers made a lot of money.
love this
This is not an attempt to pin blame on evil engineers or designers. The people who made these cars were working in a specific corporate climate. Their organizations were led by ruthless executives. The leadership of companies like Ford and GM ignored safety concerns in competition with other companies that did the same. It was not even a problem confined to the auto industry; there have been many other similar scandals involving corporate indifference to the human consequences of poorly designed consumer products. These scandals are not aberrant; they occur in a context, and to avoid them happening again requires a political strategy to attack the logic that produces them.
on the ford pinto
Computer code itself functions as a form of law. It is written by humans and it regulates their behavior, like other systems of power distribution. It is not an objective process or force of nature. It expresses a power relation between coder and user, and it will reflect the system in which coders work. “Code is never found,” Lawrence Lessig reminds us. “It is only ever made, and only ever made by us.” Letting the free market determine these matters means that digital technology risks reproducing discrimination under the cover of an inscrutable process. [...]
The implication was that these modifications to the standard testing process were done in response to public outrage, generated by Dowie’s article and the lawsuits. Ford was held to standards that other companies were not—standards it had no way of knowing that it was required to meet. Ford has never admitted that it did anything wrong.
Yet the lesson is this: as we learn more about an industry and what is possible technologically, we need to update our expectations in terms of safety and accountability. We need to organize activists, lawyers and journalists to highlight the human consequences of badly designed technology, and force the industry to adopt a design culture that values safety and works to mitigate bias. We need to demand that governments intervene in this industry to establish publicly determined standards and methods for holding companies accountable when they are breached. The standards must be constantly updated and responsive to changing circumstances as we learn more about the problems and experiment with solutions. Just as we would not let a car onto the road without crash tests, the parameters of which are subject to public scrutiny and influence, algorithms and products should not be inflicted on the public if they have not met certain standards or been tested for certain biases before shipping. We need to create a feedback loop for good design that allows lessons learned in the field to inform improvements to a product.
Digital technology, at least as much as any innovation over the last two centuries, offers us the opportunity to create a society that can meet the needs of every human being and allow them to explore their potential. But at present, too much power over the development of technology rests in the hands of technology capitalists and political elites who do their bidding. These people are good at what they do and also at convincing us that they are the best people to do it. They talk about egalitarianism and social connection in their public relations pitches and marketing campaigns, but what they hope to gain from digital technology is different: they aspire to wealth and power.
what if some of them actually believe in it tho. that's the scariest thing to me
These are serious and frightening design problems, but they also result in an immense waste of human potential. It is not just that companies do not invite feedback on their software or input from users on their design. Their objective, the purpose of their software, is not to service the user. Their primary goal is to retain control of that software. They want to control who uses it (that is, only paying customers). Proprietary software design makes a fetish of creativity—turning it into something abstract and commodified, geared to the purpose of money-making, rather than a collective or public good.