What if we let users opt out of accepting our cookies altogether? I liked that idea, but Marissa raised an interesting point. We would clearly want to set the default as "accept Google's cookies." If we fully explained what that meant to most users, however, they would probably prefer not to accept our cookie. So our default setting would go against users' wishes. Some people might call that evil, and evil made Marissa uncomfortable. She was disturbed that our current cookie-setting practices made the argument a reasonable one. She agreed that at the very least we should have a page telling users how they could delete their cookies, whether set by Google or by some other website.
the ringing of the evil detector is a symptom of a much earlier mistake
In mid-2003, Susan put some product plans and strategic documents on MOMA that required a password to access. She was concerned that the sales team might accidentally spill too much to clients. As head of product management, Jonathan told her to make the documents accessible because Google so strongly valued the free flow of information among staff members. Only performance appraisals and compensation were off limits. "This is extremely unusual for a company to do," Eric Schmidt often reminded us at our weekly TGIF meetings, "but we will continue trusting everyone with sensitive information unless it becomes a problem."
In September 2003, it became a problem. Information about our revenue numbers and Larry and Sergey's stock holdings started showing up in news reports. Eric immediately clamped down, telling Omid and me to stop including revenue numbers in TGIF presentations. Passwords on MOMA were no longer forbidden. It was a shame, Eric observed, that reality had finally come to Google.
The source for the stories turned out to be a low-level administrator feeding information to an outsider. She was asked to leave. In January 2004, though, long after that first small leak had been plugged, a much bigger crack appeared in our wall of secrecy. The same month we hired our first corporate security manager, John Markoff from the New York Times wrote a series of articles in which he reported details of products in development and the results of an internal audit conducted in preparation for a possible IPO. The information had been extremely confidential and closely held. The leak was ultimately traced to a senior manager who had known Markoff for years. He left the company as well, though the true reason for his departure was not made public, leading to much speculation.
From that point on, I had to ask for access to the project information I needed to do my job. It felt odd, as if with each ironclad, password-protected gateway the company installed, it locked out a little more of its original corporate culture.
Shortly before going public, Google clamped down completely. According to SEC rules, every employee who had access to intimate knowledge about the state of the business would be restricted from freely buying and selling the company's stock. I, and most others, gladly traded ignorance about our bottom line for the bliss of being able to cash out whenever we were ready to do so. The days of innocence in the garden of data had officially come to an end.
or just like make it all public idk
Google's obsession with metrics was forcing me to take stock of my own capabilities. What did I bring to the table? What were my limits? How did I compare? Insecurity was a game all Googlers could play, especially about intellectual inferiority. Everyone but a handful felt they were bringing down the curve. I began to realize how closely self-doubt was linked to ambition and how adeptly Google leveraged the latter to inflate the former—urging us to pull ever harder to advance not just ourselves but the company as a whole.
Toward the end of my Google run, a newly hired senior manager put into words what I had discovered long before. "Let's face it, Doug," he confided, "Google hires really bright, insecure people and then applies sufficient pressure that no matter how hard they work, they're never able to consider themselves successful. Look at all the kids in my group who work absurd hours and still feel they're not keeping up with everyone else."
Quietly, I say, "You know, that's what the Nazis did."
They all look at me in disgust. It's the look boys give a girl who has interrupted a burping contest. One says, "This is something my wife would say."
When he says "wife," there is no love, warmth, or goodness in it. In this engineer's mouth, "wife" means wet diapers and dirty dishes. It means someone angry with you for losing track of time and missing dinner. Someone sentimental. In his mind (for the moment), "wife" signifies all programming-party-pooping, illogical things in the universe.
Still, I persist. "It started as just an idea for the Nazis, too, you know."
The engineer makes a reply that sounds like a retch. "This is how I know you're not a real techie," he says.
ooof
To build such a crash-resistant system, the designer must be able to imagine - and disallow - the dumbest action. He or she cannot simply rely on the user's intelligence: who knows who will be on the other side of the program? Besides, the user's intelligence is not quantifiable; it's not programmable; it cannot protect the system. The real task is to forget about the intelligent person on the other side and think of every single stupid thing anyone might possibly do.
In the designer's mind, gradually, over months and years, there is created a vision of the user as imbecile. The imbecile vision is mandatory. No good, crash-resistant system can be built except if it's done for an idiot. The prettier the user interface, and the fewer odd replies the system allows you to make, the dumber you once appeared in the mind of the designer.
The designer's contempt for your intelligence is mostly hidden deep in the code. But, now and then, the disdain surfaces. Here's a small example: You're trying to do something simple, like back up files on your Mac. The program proceeds for a while, then encounters an error. Your disk is defective, says a message, and below the message is a single button. You absolutely must click this button. If you don't click it, the program hangs there indefinitely. [...] You must say, "OK."
relevant to PEBKAC
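a tiny sketch of the "imbecile vision" in code - my own hypothetical, not from the book. the function name and the specific checks are made up; the point is just the shape of it: assume the input is a dumb action until proven otherwise, and let the caller decide what failure means instead of hanging on a single mandatory "OK" button.

```python
def parse_age(raw: str) -> int:
    """Parse a user-supplied age, disallowing every dumb input we can imagine."""
    cleaned = raw.strip()
    if not cleaned:
        raise ValueError("empty input")
    try:
        age = int(cleaned)
    except ValueError:
        raise ValueError(f"not a number: {cleaned!r}")
    if not 0 <= age <= 150:  # negative ages and 200-year-olds are "dumb actions" too
        raise ValueError(f"implausible age: {age}")
    return age

# The caller, not the user, decides what happens on failure.
for attempt in ["42", "  7 ", "", "banana", "-3"]:
    try:
        print(attempt, "->", parse_age(attempt))
    except ValueError as err:
        print(attempt, "-> rejected:", err)
```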
Ironically, those of us who most believe in physical, operational eloquence are the very ones most cut off from the body. To build the working thing that is a program, we perform "labor" that is sedentary to the point of near immobility, and we must give ourselves up almost entirely to language. Believers in the functional, nonverbal worth of things, we live in a world where waving one's arms accomplishes nothing, and where we must write, write, write in odd programming languages and email. Software engineering is an oxymoron. We are engineers but we don't build anything in the physical sense of the word. We think. We type. It's all grammar.
Cut off from real working things, we construct a substitute object: the program. We treat it as if it could be specified like machinery and assembled out of standard parts. We say we "engineered" it; when we put the pieces of code together, we call it "a build." And, cut off from the real body, we construct a substitute body: ourselves online. We treat it as if it were our actual self, our real life. Over time, it does indeed become our life.
A storm was coming in off the Pacific. The air was almost palpable, about to burst with rain. The wind had whipped up the ocean, and breakers were glowing far out from the beach. The world was conspiring around us. All things physical insisted we pay attention. The steady rush of the ocean. The damp sand, the tide pushing in to make us scuttle up from the advancing edge. The birds pecking for dinners on the uncovered sand. The smell of salt, of air that had traveled across the water all the way from Japan. The feel of continent's end, a gritty beach at the western edge of the city.
I feared for the health of my ENTER key. I looked for manuals: found none. Searched for help disks: hiding somewhere in the mass of CDs Microsoft had relentlessly sent me. Two hours of pawing through stacks of disks. Horns of rush-hour traffic. Light fading from the sky. Disks tumbling to the floor.
[...] The mere impulse toward Linux had led me into an act of desktop archaeology. And down under all those piles of stuff, the secret was written: we build our computers the way we build our cities - over time, without a plan, on top of ruins.
pretty
An immense calm settled over the room. We were reminded that software engineering was not about right and wrong but only better and worse, solutions that solved some problems while ignoring or exacerbating others. That the machine the world wants to see as possessing some supreme power and intelligence was indeed intelligent, but only as we humans are: full of hedge and error, brilliance and backtrack and compromise.
from a Linus Torvalds talk about Linux design tradeoffs