Until relatively recently, the regulation of technology was largely discussed through the prism of contract. Technology is often sold or accessed as a proprietary product, licensed through contract to the user. Clickwrap terms of service on major service platforms, for example, allow companies to operate broadly on a take-it-or-leave-it basis.
The law has traditionally respected the right of private parties to make agreements on their own terms, and courts have been reluctant to intervene and set aside freely bargained arrangements except in the most extreme cases. But many of the contracts we enter into almost daily have little in common with the context in which the law of contract developed, not least because some of the central foundations that have traditionally supported the relevant jurisprudence are absent. Modern digital contracts are characterized by grossly unequal bargaining power and the absence of any meeting of minds: there is no genuine consent, and no real understanding among users of the rights and obligations of each party. The result is the formalized exploitation of our digital lives for profit.
Moreover, treating consent as something the individual is empowered to offer is something of a category error. As service platform companies collect data, they come to know not only what users have disclosed but also what users have not: companies can make inferences about data they have not collected from data they have. If a company has sufficient intelligence about a certain class of people, it can draw conclusions about anyone who fits that demographic, on the basis that they belong to a "lookalike audience." It is not possible to opt out of this; we all end up bound by decisions made by others to consent to invasive data collection practices. In some ways it is like buying a car with faulty brakes: a consumer choice that puts not just the buyer at risk but makes the road less safe for all users.
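To make the mechanics of lookalike inference concrete, the sketch below shows, in deliberately simplified form, how a trait disclosed by consenting users can be predicted for a user who never shared it. It is a hypothetical illustration, not any platform's actual system: the behavioural features, the data, and the choice of a scikit-learn logistic regression model are all assumptions made for the example.

```python
# A minimal sketch of lookalike inference, assuming illustrative data:
# a trait disclosed by consenting users is predicted for a user who
# never disclosed it. Features, values, and model choice are hypothetical.
from sklearn.linear_model import LogisticRegression

# Behavioural signals collected from consenting users, e.g.
# [hours online per day, fitness-app sessions per week].
consented_features = [
    [1.0, 0.0],
    [2.5, 1.0],
    [6.0, 5.0],
    [7.5, 6.0],
]
# A sensitive trait those users chose to disclose (0 = no, 1 = yes).
disclosed_trait = [0, 0, 1, 1]

# Train on the consenting users' data.
model = LogisticRegression()
model.fit(consented_features, disclosed_trait)

# A non-consenting user's observable behaviour alone is enough to place
# them in the "lookalike audience" and infer the undisclosed trait.
non_consenting_user = [[7.0, 5.5]]
print(model.predict(non_consenting_user))        # inferred trait
print(model.predict_proba(non_consenting_user))  # confidence of inference
```

The point of the sketch is structural: the non-consenting user never agreed to anything, yet the inferences made possible by others' consent reach them all the same.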