
Interview


Aza Raskin

Brach, K. (2018). Interview. Offscreen, Issue 20, pp. 92-109.


A lot of your work has to do with design as a solution to human and technical problems. Can better design solve inherently unethical business practices?

I don't think so. I've seen this happening in major companies: you'll work on a design that will be better for users or communities. But then it bubbles up the chain of command and someone higher up sees that it's negatively impacting their KPI. And so it gets deprioritised.

Corporations are a kind of AI, in the sense that no one individual directs them, and they just operate on their objective function of maximising share price through the proxy of engagement metrics. And it's hard to push up against that.

One of the original sins of the internet was not having a good business model in mind, which meant that it defaulted to the advertising business model, which has led to surveillance capitalism, and now persuasion capitalism.

So I think that we as designers can no longer get away with living in a sort of Eden where we get to ignore business models. If we do so, our designs will always be subverted.

good answer, tho i would argue that it would have been hard to imagine an alternative trajectory given the larger political/economic trends

—p.101 by Kai Brach 4 years, 8 months ago


As designers, how can we convince business minds that what works isn't always what's right?

[...]

[...] Companies are made out of people and we are their most valuable assets. They can't operate without us. So, because Uber was no longer a cool place to work and people were leaving, that changed who ran that company.

As humans, we feel small sometimes against something the size of Google. By voting with our feet, and refusing to work for companies that are exploitative, we can make really big differences. You refuse to do things that you believe have gone too far down the path of giving up what's right for what works.

cool. though, tbh, idk if it's possible to refuse to work for all companies that are exploitative ...

—p.102 by Kai Brach 4 years, 8 months ago


[...] Section 230 of the 1996 Communications Decency Act is a piece of deregulation that says that a platform isn't responsible for the content that a user posts.

I'd like to see it amended in one specific way. There are a lot of people posting content for which you can't necessarily make the platform responsible. But if the platforms were to make the curatorial choice to promote that content in a recommendation engine, and it reached a certain number of people, they would then have to be responsible for it. I think that would have the effect of curtailing, for example, YouTube's pushing of conspiracy theories by placing certain videos in their 'top trending' boxes. [...]

it's an algorithm, sure, but it's an algorithm that has at least some human curation, and could probably have more. have simple guidelines: don't promote it if it could be construed as hate speech or sth similar acc to some guidelines. it might not be perfect but it would sure as hell be better than now

and ofc the human curators should be paid very well, given lots of employment security and authority and guidelines and report to someone high up

something to think about: the dream is to get rid of human action entirely, but maybe that's a dumb dream? an impractical one? there'll always be human curation needed as long as human society and culture and interaction cannot be described/generated by a finite algorithm

—p.103 by Kai Brach 4 years, 8 months ago
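
The amendment Raskin proposes is essentially a two-part rule: a curatorial gate before promotion, and a reach threshold beyond which promotion makes the platform answerable. As a minimal sketch, here's what that rule might look like in TypeScript; the interface, function names, and the 100,000-person threshold are illustrative assumptions, not anything from the interview.

```typescript
// Hypothetical sketch of the proposed amendment: a platform becomes
// responsible for content only once its own recommender promoted it
// AND it reached a threshold audience. Names/numbers are illustrative.

interface ContentItem {
  id: string;
  promotedByRecommender: boolean;  // did the platform's algorithm push it?
  reach: number;                   // how many people were shown it
  flaggedByHumanCuration: boolean; // did a human reviewer flag it?
}

const REACH_THRESHOLD = 100_000; // illustrative cutoff, not from the text

// The amendment's core rule: amplification past the threshold is a
// curatorial choice the platform has to answer for.
function platformIsResponsible(item: ContentItem): boolean {
  return item.promotedByRecommender && item.reach >= REACH_THRESHOLD;
}

// The note's suggested guardrail: a human-curation gate that keeps
// flagged content out of 'top trending' style promotion entirely.
function mayPromote(item: ContentItem): boolean {
  return !item.flaggedByHumanCuration;
}
```

The point of targeting the threshold is that it regulates amplification rather than hosting: ordinary user posts stay protected, but promotion at scale carries liability.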


Is there any way to give ourselves antibodies versus targeted persuasion?

To me this is about using everything we know about design and human psychology to fight back. For example, Amazon famously found that for every additional hundred milliseconds of page load time, they lose one percent of revenue. So we know that attention is directly correlated with how fast apps or pages load. Why not turn that around: design it so that the more you use something you know you don't want to use, the longer the delay that gets inserted, and soon you'll just ask 'why am I here anyway?' It would give your brain the chance to catch up with its impulses.

[...]

—p.104 by Kai Brach 4 years, 8 months ago
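
Raskin's progressive-delay idea is concrete enough to sketch. Here is a minimal TypeScript illustration, assuming a launcher that tracks how many times you've opened an app today; the constants and function names are made up for illustration, not taken from the interview.

```typescript
// Illustrative sketch of the progressive-delay idea: friction grows
// with each open of an app you've said you don't want to overuse,
// giving your brain a chance to catch up with its impulses.

const BASE_DELAY_MS = 200;   // barely noticeable on the first open
const GROWTH_FACTOR = 1.5;   // each subsequent open waits 50% longer
const MAX_DELAY_MS = 15_000; // cap so the app never becomes unusable

// opensToday: how many times the app has already been opened today.
function frictionDelayMs(opensToday: number): number {
  const delay = BASE_DELAY_MS * Math.pow(GROWTH_FACTOR, opensToday);
  return Math.min(delay, MAX_DELAY_MS);
}

// A launcher might await this delay before rendering the feed.
async function openWithFriction(opensToday: number, render: () => void) {
  await new Promise((resolve) =>
    setTimeout(resolve, frictionDelayMs(opensToday))
  );
  render();
}
```

The exponential growth mirrors the Amazon observation in reverse: if a hundred milliseconds of delay measurably bleeds engagement, deliberately stacking delays on an unwanted habit should bleed that habit too.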
