diego's weblog

there and back again

remembering

“The movie will begin in five moments,” the mindless voice announced. “All those unseated will await the next show.” We filed slowly and languidly into the hall. The auditorium was vast and silent. As we were seated and were darkened, the voice continued.

“The program for this evening is not new. You’ve seen this entertainment through and through. You’ve seen your birth, your life and death. You might recall all the rest. Did you have a good world when you died?… Enough to base a movie on?”

ghost stories

You think it will never happen to you, that it cannot happen to you, that you are the only person in the world to whom none of these things will ever happen, and then, one by one, they all begin to happen to you, in the same way they happen to everyone else.

 Paul Auster, Winter Journal

Solipsistic. That’s the word I’m looking for.

We all experience solipsism at times, and consciously applied it can be a refuge, even if not quite a philosophy. A natural reaction, probably, to the attack on the ego that shared experiences can be. After all, if nothing is real, nothing can hurt you. Right? Biographies can bridge the gap, help us climb out of the hole, at least briefly. They’re also about ghosts, beyond the more classical definition of the word: the ghosts of who you yourself and others once were.

Ghost stories. The ethereal presence of past selves that hang around us, unbidden, unshakable.

I’ve been re-reading three this week. Two autobiographies, neither of which, perhaps appropriately, was written in the first person, and one biography.

The first is Winter Journal by Paul Auster. Maybe not his best work, but still worthwhile. Subdued, fragmentary. Nowhere near the power of The Invention of Solitude. Written in the second person, it feels disembodied at times even as he describes the physical in detail: “a catalog of sensory data,” he says at some point, and after all this is much of what consumes life, living, death, dying. The narrative nudges, rather than pushes, forward. It ends up feeling like a meditative exercise.

Joseph Anton: A Memoir, the second book, is on the other hand like being thrust into the edge of a tornado. You can see the calm center of the storm, integral to it but out of reach, as you spin wildly on its edges. This may also not be Rushdie’s best, but to place it against works of fiction, however autobiographically informed they may be, is a disservice in my mind. “Life and death” feels real in these pages, and I doubt any one of us could have done better at navigating the choices he faced. Fear is palpable, and so is anger: he could have easily borrowed the title from his 2001 book: Fury. Writing it in third person as he did may have been the only way to frame these experiences.

Rushdie’s celebrity status is responsible for a lot of the negative reaction towards this book, but it’s an important work, and I tend to ignore the celebrity obsession within the book (wives, girlfriends, meeting Bono…) and focus instead on the struggle around the fatwa and The Satanic Verses. Self-publishing is revolutionary, what is happening in this area is important, and in any case it would happen no matter what. But disintermediation can have the effect of, um, disintermediating, and therefore laying an artist bare, leaving them without a support structure. What would happen today, I wonder, if instead of principled editors and publishers all that stood between an artist and a murder proclamation was… the complaints department at Amazon?

I wonder.

This applies more broadly. The very force that gives everyone a voice may also be empowering those who want nothing but to take our voices away (think China, or Iran, or Syria, or…).

Irony.

Which brings me to Every Love Story Is A Ghost Story: A Life of David Foster Wallace, by D. T. Max, perhaps the best book I read last year. Here, finally, a biography written in third person about a third person. It could also have been subtitled “DFW’s Battles With Irony And Addiction,” although it doesn’t deal exclusively with that, of course, and I use the word “with” carefully here, since it doesn’t univocally mean against. What follows is a brief passage that not only illustrates some of these ideas well, but also makes visible, to different degrees, strands that are woven throughout the book, the story, and DFW’s life.

America was, Wallace now knew, a nation of addicts, unable to see that what looked like love freely given was really need neurotically and chronically unsatisfied. The effect of Leyner’s fictional approach to life—mutated, roving, uncommitted—like that of Letterman and Saturday Night Live—was to make our addiction seem clever, deliberate, entered into voluntarily. Wallace knew better. And now he was far clearer on why we were all so hooked. It was not TV as a medium that had rendered us addicts, powerful though it was. It was, far more dangerously, an attitude toward life that TV had learned from fiction, especially from postmodern fiction, and then had reinforced among its viewers, and that attitude was irony. Irony, as Wallace defined it, was not in and of itself bad. Indeed, irony was the traditional stance of the weak against the strong; there was power in implying what was too dangerous to say. Postmodern fiction’s original ironists—writers like Pynchon and sometimes Barth—were telling important truths that could only be told obliquely, he felt. But irony got dangerous when it became a habit. Wallace quoted Lewis Hyde, whose pamphlet on John Berryman and alcohol he had read in his early months at Granada House: “Irony has only emergency use. Carried over time, it is the voice of the trapped who have come to enjoy the cage.” Then he continued: “This is because irony, entertaining as it is, serves an almost exclusively negative function. It’s critical and destructive, a ground-clearing….[I]rony’s singularly unuseful when it comes to constructing anything to replace the hypocrisies it debunks. That was it exactly—irony was defeatist, timid, the telltale of a generation too afraid to say what it meant, and so in danger of forgetting it had anything to say.”

D. T. Max, Every Love Story Is A Ghost Story

Life. Addiction. Irony. Death. There are no simple, clean, tidy answers, and fragmentary is an appropriately recurring idea.

DFW, commenting on Infinite Jest, once said that the novel was “[…] sort of what it’s like to be alive […] really a very pretty pane of glass that had been dropped off the twentieth story of a building.”

Indeed.

snatched away

Knowing it’s coming doesn’t help. “Too soon,” you think, still. Faced with loss, your vocabulary feels incomplete; you are left grasping at metaphors. Even those don’t feel quite right. You discard them, reluctantly, all but one.

A life unfairly snatched away. Then again, when is it fair?

Too soon.

Yi. I’ll miss you.

We turn away 
to face the cold, enduring chill

As the day begs the night for mercy love
The sun so bright it leaves no shadows
Only scars
Carved into stone
On the face of earth
The moon is up and over One Tree Hill
We see the sun go down in your eyes

You run like a river, on like a sea
You run like a river runs to the sea

And in the world a heart of darkness
A fire zone
Where poets speak their heart
Then bleed for it
Jara sang – his song a weapon
In the hands of one
though his blood still cries
From the ground

It runs like a river runs to the sea
It runs like a river to the sea

I don’t believe in painted roses
Or bleeding hearts
While bullets rape the night of the merciful
I’ll see you again
When the stars fall from the sky
And the moon has turned red
Over One Tree Hill

We run like a river
Runs to the sea
We run like a river to the sea
And when it’s raining
Raining hard
That’s when the rain will
Break your heart

Raining…raining in your heart
Raining into your heart
Raining…
Raining your heart into the sea

Oh great ocean
Oh great sea
Run to the ocean
Run to the sea

assume good intentions

A good friend once told me: “Assume good intentions.” Those three words have been hugely influential in my world view in the last few years. Once you make this idea explicit it can shape how you think about what others do in significant ways.

I was reading today about some of the brouhaha surrounding Lean In and the whole why-is-a-billionaire-woman-telling-women-everywhere-what-to-do thing and there was a reference to the launch of Circles.

Gina & Team: congratulations on the launch, it must have been a crazy effort and it looks great.

It seems it’s been building up for a while (the controversy around the book, that is) but I had not seen it until today when I read this article in The New Yorker.

The reason I bring this up is that what keeps coming back to me in all of this is how our perspective in the Valley is sometimes clouded by second-hand opinions, innuendo, and gossip, for example around who got funded by whom or which idea is “in”. Yes, this is not unique to the Valley, but it happens frequently here and so I can attest to it, in my own backyard (so to speak… the actual inhabitants of my shared backyard are bluebirds and squirrels).

Putting yourself out there, through a book, art, or even, yes, software, is a hard thing to do. People misunderstand and misinterpret your intentions and motivations constantly, and the schadenfreude that is sadly all too common makes things even harder. But we are all just people, trying to do the best we can. The number of significant zeros in your bank account doesn’t change that in most cases. And I say that having very few significant zeros left in my own bank account.

But, funny thing (not ha-ha funny), most of the people that have such strong opinions on these things have never done them. They “talk about the book” without having “read the book.” (You really need to read The New Yorker article to get this reference). Some of my brothers-in-arms work at Evernote, but do they get press and coverage when they “just” keep an awesome service/app running? No. They get press when someone breaks into their systems.

Controversy sells.

Don’t get me wrong: critics are good. But it’s a matter of degrees. I’m not saying you need to write a book to be able to critique a book, or that you need to start a company to give your opinion on how it should be run, but at the very least spend a moment and consider the effort involved. Avoid ad hominems. Forget about money for a second. Consider how much of their lives these people are sacrificing trying to do something.

Assume good intentions.

I bet that if you did that you’d find yourself a bit more forgiving of missteps, a bit more understanding, a bit more willing to believe.

And for those who are doing it, regardless of the scope or (apparent) size of your project, here’s something I could not say out loud because it would sound terrible given my accent… but I can write it: Gina, Sheryl, and all of you out there who are putting yourselves, your sanity, on the line for an idea: Give ‘em hell.

:-)

honestly, let’s unpack this: it’s like, you know…very unique?

I am fascinated by (obsessed with?) slang, colloquialisms, jargon, argot, and of course language use and misuse in general. Perhaps most entertaining are slang and colloquialisms that pop up and become widespread in the space of a few years.

“Honestly…,” “Let’s unpack this,” and a few notable others have become more frequent (at least from my point of view) and I wanted to dissect them a bit and think about what could be behind them.

New terms or ways of communicating can be hard to see “appear” sometimes, since they enter everyday language incrementally. The best part is that some of them may not be new at all (“new” defined here as “having popped up in the last 10 years”); they may simply be new to me as they become common or even pervasive in the conversations I have and the information landscape that I inhabit.

There’s more than pure nerdish entertainment to this. For one thing, it can be used as a lens through which to look at society and culture, but more specifically at organizations and what makes them tick. Religions, in particular, are an interesting subtype of organization, since some of them maintain their high-level structures for hundreds or thousands of years. For example, Scientology’s obsession with redefining language is notable: it sits at the extreme end of the spectrum, combining both jargon and the repurposing of common language, which naturally affects how you communicate and therefore how you relate to, and to some degree perceive, reality.

Startups go through a similar (even if simultaneously more overt and less structured) process in this regard. Most of us have seen how companies have their own terminology for everything. In engineering, in particular, you could literally sit through an entire conversation about infrastructure between two engineers from the same company and never know what they’re talking about, while in marketing or sales they don’t so much invent terminology as repurpose it freely, leading to an overloading of commonly used terms that can sometimes create confusion (e.g. “active users” or even “pageviews”).

I’m not saying that startups, tech companies, or even non-tech companies are cults (Apple’s perception as such notwithstanding…), but there are similarities that I think speak to the need of a group, of whatever kind, to define itself as separate from everyone else; of the mechanisms necessary for that to happen, language is one of the easier starting points.

But back to more widely shared colloquialisms and/or slang: here are a few personal favorites that I’ve observed becoming more common in recent years, and some of my own musings on what’s behind them.

Some of these trigger “old man yells at cloud” syndrome in me, since (apparently) I have a hard time handling the cognitive dissonance, sheer nonsense, or just plain lack of meaning involved.

“Like, you know…” and the invisible question mark that follows

This one is fairly established, dating I think back to the mid-90s. And it hasn’t just endured, it has become so widespread and entrenched that it’s definitely worth mentioning.

It’s one of the most fascinating colloquialisms in my opinion. It’s a simile in which the structure that follows “like” is not explicit, but rather vaguely points to some idea that perhaps, maybe, hopefully, the other person shares in some indeterminate way in the statement we’re about to make, while expressing that we really don’t care too much one way or the other.

It is maddening to me to be in a conversation in which the other person constantly trails off, attaching “like, you know”s and question marks at the end of sentences. We are, apparently, not supposed to have conviction anymore, and language tinted with this construct communicates that clearly. It says: I have nothing invested in this statement.

All too often, in fact, “Like, you know…?” has no follow up at all and it just trails off, the question mark implicit in the inflection of our voice, the interrogative tone, the you know parenthetical. It’s filler, pretending that you’re saying something when you really aren’t, a statement without content, a commitment to nothing in particular that nevertheless creates the impression that we’re communicating. Whatever is said gets turned into a question, something to be challenged on the receiving end. But when the receiver also answers with similar lack of definition, then it’s just a bunch of words strung together, isn’t it? A charade: because, actually, we don’t want to have a real conversation.

Declarative language, the straight-up statement of beliefs, of facts, of what we know to be true even if it is subjective, has been appropriated by the extremes, the Glenn Becks of the world. The alternative, nuance and complexity of thought, is too often replaced in everyone else by a quivering indecision.

The flip side of this indecision is how we pretend to counteract it with an earnest declaration: “Honestly…”

“Honestly”

This type of preface or clarification instantly triggers, at least for me, the thought that the rest of what the other person’s been saying has not been “honest.” Not “dishonest” necessarily, but the addition raises the level of whatever comes after over what came before. And, when it’s used constantly it just makes me question everything.

Aside from combining it with “like, you know…”, to give the appearance of weight while simultaneously reducing the importance of what we’re saying, “honestly” is also used in many other cases. Why are we suddenly using this modifier so frequently? Is it that in a world where The Onion‘s headlines can appear as serious as those in The New York Times we have suddenly decided that, by default, everything is suspect? Or is it perhaps that PR, marketing and advertising are so pervasive that we look at everything with skepticism and some degree of mistrust, requiring the additional emphasis of “honestly” to separate what we say from what we’re supposed to say? Maybe a bit of both. Ironically, advertising continues to be pretty effective. Instead of applying these filters to ads, we look at everything with suspicion.

“Let’s unpack this”

This one seems to have become more common in the last couple of years. I don’t know if it’s been traced back to its origins, but it seems to me that it’s a byproduct of technology –both explicitly and implicitly, partially around lack of trust, but also increased (real and perceived) complexity.

Explicitly: software, first, where so many things are “packaged” and have to be “unpacked” to look at them. More importantly, thanks to e-commerce, there’s the relatively new phenomenon of boxes everywhere. We all get packages at home or the office that have to be unpacked. Think back, pre-e-commerce: how common was it to get a package? For most people, not very. Now, unpacking is a frequent action in our daily lives, a common occurrence.

Implicitly: everything around us now has layers within layers, a Matryoshka doll of seemingly neverending complexity. The phrase “let’s take a look under the hood” used to be applicable beyond cars — the world generally had one level. You’d open the hood and there was the engine. Done. Now, “under the hood” is just the first of many layers, even in cars (batteries, microprocessors, software…). A phone is no longer just a phone, and you can even have a phone built into your car, nevermind connected to it. A car contains maps. The maps contain reviews. The reviews link to social media. And on and on it goes. The ongoing merging of cyberspace and meatspace often leads us down rabbit holes in everything we touch.

Which also relates to “Honestly” since “Unpacking” is often used for discussing statements by public officials, and even facts. The only way you would need to “unpack” a statement is if its true meaning, or different interpretations, were “packed” under the “wrapping” of its surface. Orwell’s doublespeak (or maybe n-speak?) ingrained to the degree that the default assumption becomes that there’s hidden meaning, or inherent obfuscation. Hence, “Honestly” may be functioning as a vaccine for “Unpacking” — something that communicates “Unpacking not required.”

“Very unique”

Once more, I chalk this one up to trying to counteract the lack of trust we have come to assume in what’s communicated. It is more commonly used by marketing types, but recently I’ve heard it with alarming frequency in other contexts.

Something is either unique, or it isn’t. It can’t be “very unique,” or “incredibly unique.” Period. But I suppose that when words like “unique” have become overused, we start to add adjectives in the hopes of making it clear that, yes, this is unique, as opposed to all those other things that we say are unique even if they’re not.

This is the most egregious misuse of an adjective, but there are others. I typically use words like “beautiful,” “love,” “hate,” and others sparingly, because their weight is diminished by attaching them to everything. I like rain, in itself but also because I appreciate sunny days more when they’re juxtaposed with the alternative, and vice versa.

If everything is beautiful, if beautiful is the norm, then how do we talk about something that is special, that touches us beyond that? We start adding superlatives: “incredibly beautiful,” “profoundly beautiful” and so forth (“profound” is another overused term these days, now that I think about it). Until that becomes the way we refer to even the menu transition of an iPhone app, or some icon, or the color of a couch, at which point we are left with a situation in which our depiction of it leaves us little room to enjoy the occasional good thing, because we have done away with contrasts by turning everything into positive happy feelings.

Most of the time, nothing remarkable happens, our lives are routine, and that should be just fine. Also, a lot of things just suck. And that’s a good thing, because if they didn’t we wouldn’t be able to tell when they don’t.

On Borges and Languages, or, On rigor in translations

In the process of writing something else I wanted to use a quote from Jorge Luis Borges’ short story Del rigor en la ciencia. I ended up doing my own translation of it, and it seemed worthwhile to document why. (Note: I will use italics for Spanish words throughout the text, for clarity).

This short story (quite short actually, less than 130 words) was first collected in Historia universal de la infamia (“A Universal History of Infamy”) and later in El Hacedor (“The Maker”). It is of the “recovered text” genre, supposedly dating to the year 1658.

The English translation quoted most frequently is by Andrew Hurley (Collected Fictions, Penguin, 1998). Hurley translates the title as “On Exactitude in Science” and that’s where my disagreements with his version begin.

First, the word “Rigor” from the Spanish title is translated by Hurley as “Exactitude.” However, “Rigor” (which is spelled the same in English and Spanish) is more than just “Exactitude.” The Oxford English Dictionary defines rigor as “the quality of being extremely thorough and careful; severity or strictness; (rigors) harsh and demanding conditions,” which is roughly equivalent to the definition of the Spanish word by the Real Academia Española (although the Spanish word includes other meanings that are not exactly the same, but closely related to the ones used by the OED).

Second, the word “Exactitude” exists in Spanish: “Exactitud.” Borges would have used it if that’s what he wanted to convey.  The structure “Rigor en…” is frequently used in Spanish, and in this case it actually conveys accurately the Latin cultural perception of Science as being not just exact but also strict, even severe, a perception that is far more muted, if at all present, in Anglo-Saxon cultures. The argument using “Exactitude” could be that “rigor in science” would be a somewhat archaic phrasing, but this is actually something that works to our advantage given the supposed origin of the text in the 17th century.

Third, Hurley also capitalizes the words “Exactitude” and “Science” in the title, whereas the original Spanish text does not. This matters because in this particular story Borges actually turned several words into their “proper” form (e.g. Nouns into Proper Nouns), using the effect of capitalization to expand the importance of those words. Critically, this use of capitalization places the text in a historical context — use of capitalization was not generally properly codified prior to the 18th century in either English or Spanish (for just one example of archaic use of capitalization in English, see George Washington’s “Rules of Civility,” starting with Rule 1: “Every Action done in Company ought to be with Some Sign of Respect, to those that are Present.”).

Within the text, other differences in tone and depth of meaning become visible:

Spanish Original: “Con el tiempo, estos Mapas Desmesurados […]”

Hurley’s translation: “In time, those Unconscionable Maps […]”

My translation: “In time, these Excessive Maps […]”

To start, “estos Mapas” is “these Maps,” not “those Maps.” While I can see why Hurley would choose “those” here, I have no doubt that Borges would have used “esos” or “aquellos” if his intention was to say “those.” Then there’s the translation of “Desmesurados” as “Unconscionable,” which captures some of the feeling of the Spanish word but not all of it. Lacking context, “Desmesurados” means “Without Measure,” but in this context I’d actually say that “Unconscionably Excessive” or “Unmeasurably Excessive” is probably the most accurate reflection of what Borges was going for. I ended up using only “Excessive,” trading some verve and depth for brevity.

Another example can be found in the following partial sentence:

Spanish Original: “Menos Adictas al Estudio de la Cartografía, las Generaciones Siguientes entendieron que ese dilatado Mapa era Inútil […]”

Hurley’s translation: “The following Generations, who were not so fond of the Study of Cartography as their Forebears had been, saw that that vast map was Useless […]”

My translation: “Less Addicted to the Study of Cartography, the Following Generations understood that that dilated Map was Useless […]”

Hurley here introduces another proper noun (“Forebears”) which isn’t even included in the original, and demotes “Following,” by reversing the capitalization of the word. In the process he affects, in my view, the weight implied by “Generaciones Siguientes” and not in a good way (he makes this mistake in the opposite direction with the word “Relic” in the last sentence).

Hurley translates “entendieron” as “saw” but I think there’s no reason to avoid the direct translation “understood” since it maintains the implication of understanding not just as the cognitive process but also as seeing or realizing something. Hurley’s addition of a comma before “saw” also affects the pace of the sentence for no good reason.

He also changes “Adictas” (“Addicted”) to a much more mellow “not so fond of” from the original, much harsher implication of addiction (with a capital “A” no less!). If we recontextualize the change into a more common setting we can see the damage this causes to the text. Compare “Joe was less Addicted to heroin after that” to “Joe was not so fond of heroin after that.”

I did have some qualms about using “dilated Map” here instead of Hurley’s “vast Map” but once more I defer to Borges on this. Using “dilated” for a map (“ese dilatado Mapa”) is pure literary license and not the way in which you’d ascribe vastness to a map either in English or in Spanish, so there’s really no reason not to use the English word (“dilated”) that is the exact translation of the Spanish text to maintain the mental image that Borges was going for.

There are other specific changes I made, but I need to work on other things. So, without further ado, here are the original, my translation, and Hurley’s for comparison.

Spanish Original

“Del rigor en la ciencia”, by Jorge Luis Borges

. . . En aquel Imperio, el Arte de la Cartografía logró tal Perfección que el Mapa de una sola Provincia ocupaba toda una Ciudad, y el Mapa del Imperio, toda una Provincia. Con el tiempo, estos Mapas Desmesurados no satisficieron y los Colegios de Cartógrafos levantaron un Mapa del Imperio, que tenía el Tamaño del Imperio y coincidía puntualmente con él. Menos Adictas al Estudio de la Cartografía, las Generaciones Siguientes entendieron que ese dilatado Mapa era Inútil y no sin Impiedad lo entregaron a las Inclemencias del Sol y los Inviernos. En los Desiertos del Oeste perduran despedazadas Ruinas del Mapa, habitadas por Animales y por Mendigos; en todo el País no hay otra reliquia de las Disciplinas Geográficas.

Suárez Miranda: Viajes de varones prudentes, libro cuarto, cap. XLV, Lérida, 1658.

My English Translation

On rigor in science

. . . In that Empire, the Art of Cartography attained such Perfection that the map of a single Province occupied an entire City, and the map of the Empire, an entire Province. In time, these Excessive Maps did not satisfy and the Schools of Cartographers built a Map of the Empire, that was of the Size of the Empire, and which coincided point for point with it. Less Addicted to the Study of Cartography, the Following Generations understood that that dilated Map was Useless and not without Pitilessness they delivered it to the Inclemencies of the Sun and the Winters. In the Deserts of the West endure broken Ruins of the Map, inhabited by Animals and Beggars; in the whole country there is no other relic of the Disciplines of Geography.

Suárez Miranda: Viajes de varones prudentes, libro cuarto, cap. XLV, Lérida, 1658.

Andrew Hurley’s English Translation, in Collected Fictions, Penguin, 1998.

On Exactitude in Science

. . . In that Empire, the Art of Cartography attained such Perfection that the map of a single Province occupied the entirety of a City, and the map of the Empire, the entirety of a Province. In time, those Unconscionable Maps no longer satisfied, and the Cartographers Guilds struck a Map of the Empire whose size was that of the Empire, and which coincided point for point with it. The following Generations, who were not so fond of the Study of Cartography as their Forebears had been, saw that that vast Map was Useless, and not without some Pitilessness was it, that they delivered it up to the Inclemencies of Sun and Winters. In the Deserts of the West, still today, there are Tattered Ruins of that Map, inhabited by Animals and Beggars; in all the Land there is no other Relic of the Disciplines of Geography.

- Suárez Miranda, Viajes de varones prudentes, Libro IV, Cap. XLV, Lérida, 1658

the pliability of our perception of time

Me: Happy New Year!

World: It’s February.

Me: Says you.

World: Says me, the Gregorian Calendar, and over six billion others on this planet. I’d add “Q.E.D.” but it seems redundant.

Me: Do I get points for trying?

World: Not really, no.

Me: Fine. I accept your apology.

World: I didn’t apologize.

Me: I know you are, but what am I?

World: I’d rather not say.

Me: I win!

kindle paperwhite: good device, but beware the glow

For all fellow book nerds out there, we close this year’s trilogy of Kindle reviews with a look at the Kindle Paperwhite, adding to the plain Kindle review and the Kindle Fire HD.

This device has gotten the most positive reviews we’ve seen this side of an Apple launch. I don’t think I’ve read a single negative review, and most of them are positively glowing with praise. A lot of it is well deserved. The device is light, fast, and the screen is quite good. The addition of light to the screen, which everyone seems bananas about, is also welcome, but there are issues with it that could be a problem depending on your preference (more on that in a bit).

A TOUCH BETTER

Touch response is better than on the Kindle Touch as well. There are enough minor issues with it that it’s not transparent as an interface — while reading, it’s still too easy to do something you didn’t intend to do (e.g. tap twice and skip ahead more than one page, or swipe improperly on the homescreen and end up opening a book instead of browsing, etc.), but it doesn’t happen so often that it gets in the way. Small annoyance.

Something I do often when reading books is highlight text and –occasionally– add notes for later collection/analysis/etc. Notes are a problem in both Kindles for different reasons (no keyboard in the first, slow-response touch keyboard in the second) but the Paperwhite gets the edge I think. The Paperwhite is also better than the regular Kindle for selection in most cases (faster, by a mile), with two exceptions: at the end of paragraphs it’s harder than it should be to avoid selecting part of the beginning of the next one, and once you highlight, the text gets block-highlighted as opposed to underlined, which not only gets in the way of reading but also results in an ugly flash when the display refreshes as you flip pages. Small annoyances #2 and #3.

Overall though, during actual long-form reading sessions I’d say it works quite well. Its quirks appear to be of the kind that you can get used to, rather than those that you potentially can’t stand.

THE GLOW THAT THE GLOWING REVIEWS DIDN’T SPEND MUCH TIME ON

Speaking of things you potentially can’t stand, the Paperwhite has a flaw, minor to be sure, but visible: the light at the bottom of the screen generates weird negative glow, “hotspots” or a kind of blooming effect on the lower-screen area that can be, depending on lighting conditions, brightness, and your own preference, fairly annoying. Now, don’t get me wrong — sans light, this is the best eink screen I’ve ever seen, but the light is on by default and in part this is a big selling point of the device, so this deserves a bit more attention.

Some of the other reviews mention this either in passing or not at all, with the exception of Engadget where they focused on it (just slightly) beyond a cursory mention.

Pogue over at the NYT:

“At top brightness, it’s much brighter. More usefully, its lighting is far more even than the Nook’s, whose edge-mounted lamps can create subtle “hot spots” at the top and bottom of the page, sometimes spilling out from there. How much unevenness depends on how high you’ve turned up the light. But in the hot spots, the black letters of the text show less contrast.

The Kindle Paperwhite has hot spots, too, but only at the bottom edge, where the four low-power LED bulbs sit. (Amazon says that from there, the light is pumped out across the screen through a flattened fiber optic cable.) In the middle of the page, where the text is, the lighting is perfectly even: no low-contrast text areas.”

The Verge:

“There are some minor discrepancies towards the bottom of the screen (especially at lower light settings), but they weren’t nearly as distracting as what competitors offer.”

Engadget:

“Just in case you’re still unsure, give the Nook a tilt and you’ll see it clearly coming from beneath the bezel. Amazon, on the other hand, has managed to significantly reduce the gap between the bezel and the display. If you look for it, you can see the light source, but unless you peer closely, the light appears to be coming from all sides. Look carefully and you’ll also see spots at the bottom of the display — when on a white page, with the light turned up to full blast. Under those conditions, you might notice some unevenness toward to bottom. On the whole, however, the light distribution is far, far more even than on the GlowLight.”

So it seems clear that the Nook is worse (I haven’t tried it), but Engadget was the only one to show clear shots of the differences between them, although I don’t think their screenshots clearly show what’s going on. Let me add my own to that. Here are three images:


The first is the screen in a relatively low-light environment at 75% screen brightness (photo taken with an iPhone 5, click on them to see them at higher res). The second two are the same image with different Photoshop filters applied to show more clearly what you can perhaps already see in the first image — those black blooming areas at the bottom of the screen, inching upwards.

The effect is slightly more visible with max brightness settings:

What is perhaps most disconcerting is that what is more visible is not the light but the lack of it — the black areas are what’s not as illuminated as the rest before the full effect of light distribution across the display takes place.

Being used to the previous Kindles, when I first turned it on my immediate reaction was to think that I’d gotten a bad unit, especially because this issue was something reviews either hadn’t put much emphasis on or seemed to dismiss altogether, but it seems that’s just how it is. Maybe it is one of those things that you usually don’t notice but, when you do, you can’t help but notice.

So the question is — does it get in the way? After reading on it for hours I think it’s fair to say that it fades into the background and you don’t really notice it much, but I still kept seeing it, every once in a while, and when I did it would bother me. I don’t know if over time the annoyance –or the effect– will fade, but I’d definitely recommend you try to see it in a store if you can.

THE REST

Weight-wise, while heavier than the regular Kindle, the Paperwhite seems to strike a good balance. You can hold it comfortably in one hand for extended periods of time, and immerse yourself in whatever you’re reading. Speaking of holding it — the material of the bezel is more of a fingerprint magnet than on previous Kindles, for some reason, and I find myself cleaning it more often than I’ve done with the others.

The original Kindle Touch was OK, but I still ended up using the lower-end Kindle for regular reading. If I can get over the screen issue, the Paperwhite may be the touch e-reader that breaks that cycle. Time will tell.

short answer yes with an if, long answer, no, with a but…

Part 3 of a series (Part 1, Part 2)

HERE WE GO AGAIN

I will look at this from one more angle and then I will let it rest here for future reference, since pretty much everyone else seems, not surprisingly, to have moved on. With the aside on how we go about discussing this topic out of the way (and various other digressions) in my post last Sunday, I wanted to focus a bit on what is perhaps the center of the argument used in the Times article. At the very least, elaborating on the flaws at that center should get us very close to exposing the feebleness of the rest of the argument’s construction.

At the core of the argument is the following paragraph:

Energy efficiency varies widely from company to company. But at the request of The Times, the consulting firm McKinsey & Company analyzed energy use by data centers and found that, on average, they were using only 6 percent to 12 percent of the electricity powering their servers to perform computations. The rest was essentially used to keep servers idling and ready in case of a surge in activity that could slow or crash their operations.

In my response I took issue with this paragraph in two specific areas: 1) that an average is meaningless without more information (e.g. the standard deviation, for starters) and 2) that the measure of utilization they imply, and I say imply because they never make it clear –another flaw–, was one of “performing computations.” I elaborated a bit on the various types of tasks servers may be performing and noted that you couldn’t amalgamate them all into a single value. This is true, but I want to step back a bit into what utilization means and the unmentioned factor that is on the other side of it: efficiency.

WHERE IT BECOMES CLEAR THAT TERMINOLOGY MATTERS (A LOT)

Semantics time: we need to define some terms. Let’s say, just for a moment, to simplify the discussion a bit, that we’re OK talking about utilization as some kind of aggregate. Let’s say that we ignore the specifics of what the servers are doing, and we further assume that we will express utilization of the “system” as some percentage between 0 and 100, with zero meaning the system is doing nothing but its own minimal housekeeping tasks (so zero isn’t really zero, but we’re simplifying) and 100 meaning the system is fully utilized doing something specifically related to the application at hand. I should start, though, with some terminology housekeeping by defining what “a system” is.

Definition 0: I will use the words system and machine interchangeably to mean a particular piece of hardware or virtual instance, typically a server, running a particular piece of software. Mixing up virtual systems and machines with actual hardware is a bit of a shortcut, but since in practice virtual systems must eventually map to a real one, it’s one that I think we can live with. (Another shortcut lies in “typically a server” since, say, network switches should also be part of the equation.)

Definition 1: Utilization: a percentage between 0 and 100 of system load related to the specific application tasks at hand.

I shudder at the oversimplification, but I’ll get over it. Probably.

Now, related to utilization is efficiency. While utilization can be said to be an objective concept that is measurable, efficiency can only really be understood relative to something else, and in the case of a piece of software, relative to previous versions of itself. So for example we can’t really say anything reasonable about the efficiency of a V1 piece of software except with respect to imagined possible changes in the future. Conversely, the efficiency of V2, V3, etc. will be defined in relation to whatever version or versions preceded them. Now that we have the definition of utilization, though, we can talk about efficiency in terms of that. So, for example, if V2 uses half the systems that V1 used, then it’s twice as efficient (2x). Or V2 could require 20 machines while V1 required 10, in which case V2 is half as efficient.

Definition 2: Efficiency: the change in utilization between two versions of the system.
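To make these two definitions concrete, here is a minimal sketch in Python, purely illustrative, using the made-up V1/V2 machine counts from above (nothing in it comes from the Times article):

    # Definition 1: utilization is a 0-100 percentage of application-related load.
    # Definition 2: efficiency is relative -- how V2 compares to V1 for the same work.

    def relative_efficiency(machines_v1, machines_v2):
        """Efficiency of V2 relative to V1, for the same workload."""
        return machines_v1 / machines_v2

    print(relative_efficiency(10, 5))   # 2.0 -- V2 uses half the machines, twice as efficient
    print(relative_efficiency(10, 20))  # 0.5 -- V2 needs twice the machines, half as efficient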

Once again — for all data center people out there, I’m oversimplifying for the sake of this particular argument.

In the paragraph quoted from the Times above there’s a bit of a jumble of terms. It starts by talking about “energy efficiency” (which, in our definition, would be 100%-Utilization%) and then it says the servers “were using only 6 percent to 12 percent,” which is straight utilization%. I think playing fast and loose with terminology like that can get in the way of really knowing what we’re talking about, which is why I’ve spent some time defining, at least, what I’m talking about.

INSERT OBVIOUS TRANSITIONAL SECTION TITLE HERE

Ok, fine, the squirrel in charge of coming up with section titles is on a break and I can’t think of a good one, so let’s just keep going now that we’re armed with these terms. I’ve repeatedly stated that the average utilization, by itself, is meaningless. Allow me to elaborate on one key reason why. Suppose you measure ten systems and get the following utilization values: 10, 10, 10, 10, 10, 90, 90, 90, 90, 90. This gives you an average utilization of 50% — already something to note, since the average clearly isn’t telling the whole story. Assuming for a moment that these values are comparable across systems (a big if, but again: simplify!), the average is really hiding something important: five systems have “bad” utilization (10%) while the other five have “good” utilization (90%).
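Just to make the point painfully obvious, here is the same example as a quick Python sketch; the spread, which the article never reports, is what gives the game away:

    import statistics

    utilization = [10, 10, 10, 10, 10, 90, 90, 90, 90, 90]

    print(statistics.mean(utilization))    # 50 -- the headline "average utilization"
    print(statistics.pstdev(utilization))  # 40 -- the spread that the average hides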

But wait, why am I using quotes around “good” and “bad”? Because this is something the article implies: that more utilization is better, but this is exactly why talking about efficiency matters. Maybe, just maybe, the five systems at 10% are actually systems with software that do the same thing but more efficiently, and from that perspective our assessment of good or bad can get inverted.

Suppose the five systems at 90% are part of a cluster. Suppose one of the five systems crashes. Suddenly the load that went to that system has to be distributed to all the others, and it quickly puts the remaining four systems at over 100%, and the entire cluster could crash. The load would never get distributed perfectly across systems, among other problems — in reality we’d probably be looking at a cascade of failures as each individual remaining system crosses the 100% threshold, putting more load on the others, and so forth, ending with the final state where the entire cluster is down and everyone, from the CEO to your users, is screaming bloody murder.

Suddenly we’re looking at those systems at 90% with some suspicion, no? In fact, if you keep on the simplification vibe, you could argue that for a five-system cluster the only way to protect it from any one machine going down (assuming a crash at 100% load, also not a given…) is to maintain the average utilization at around 75% or so, which in the event of one system crashing would leave the remaining four at 93.75%. An 80% average would mean one system crashing leaves everyone at 100%. So the difference between 75% and 80%, which seems minimal, is the difference between life and death in this scenario.
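The arithmetic behind those numbers is trivial, but worth sketching (Python again, same simplified model, where the survivors absorb the failed system’s load evenly):

    def load_after_failure(n_systems, avg_utilization, failures=1):
        """Average utilization on the survivors after some systems drop out,
        assuming the cluster's total load is redistributed evenly."""
        total_load = n_systems * avg_utilization
        return total_load / (n_systems - failures)

    print(load_after_failure(5, 75))  # 93.75 -- survivable, barely
    print(load_after_failure(5, 80))  # 100.0 -- every survivor pegged
    print(load_after_failure(5, 90))  # 112.5 -- cascading-failure territory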

But I said can get inverted because there are other factors at play. Take just one: uptime requirements. Suppose you’re somehow OK with the idea that one system crashing takes everything down, and you’re willing to trade uptime for costs (i.e., using only five machines). Then you’d be OK, probably. Everyone I know, however, would want to protect against this eventuality.

This scenario isn’t contrived to get the answer to come up the way I want. It is typical, certainly far, far more common than a simple and straightforward DC setup where components get swapped instantly, you’re always at maximum efficiency, and every system does the same thing. Reality isn’t like that. Speaking of which…

A (SMALL) DOSE OF REALITY

Let’s complicate things a bit further, or rather, make them just slightly more realistic. Let’s say that your 90% utilization was measured during a time when you had 100 simultaneous users on average. Tomorrow, though, Pando Daily posts a glowing review of your website and usage doubles. Doesn’t sound too far-fetched. Now what? The 90% utilization, if there’s a strong correlation between simultaneous users and load (which is typical), is suddenly guaranteed to bring you down. Suddenly you’d need a much lower baseline utilization to be able to handle that spike, and the 90% looks like a bad idea once again.
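Same sketch, one more variable: if load scales roughly linearly with simultaneous users (the assumption in the scenario above, and a simplification), a traffic spike tells you how much baseline headroom you actually needed:

    def utilization_after_spike(baseline_utilization, traffic_multiplier):
        """Projected utilization if load scales linearly with users."""
        return baseline_utilization * traffic_multiplier

    print(utilization_after_spike(90, 2))  # 180 -- the glowing review takes you down
    print(utilization_after_spike(45, 2))  # 90  -- uncomfortable, but still standing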

Going a bit further, imagine the two clusters of five systems are actually performing the same function, but one of them is running V1 (90%) and the other one is running the more efficient version you just deployed (10%). This is something that happens all the time. As you instrument a piece of software running under real-world load, you understand better what leads to that load, you can optimize your software, and massive jumps in efficiency are not uncommon. So when the 90% is deemed a problem, the team gets to work, and they come up with a V2 that is around an order of magnitude more efficient.

Which leaves you with two possible versions of the software doing the same work, but one runs its systems at 10% utilization and the other at 90%. Which one is better? I think everyone would agree that the newer, more efficient system is better, even though deploying it would suddenly make your utilization plummet across the board, which, according to the article, is “bad”. Oops.

Hold on, I hear you say. Now that you’ve got software that is more efficient why don’t you just decommission the extra systems you don’t need? Then you’d be using less power overall and you could cut down, say, from 10 machines at 90% to 5 machines at 20%, which should give you a nice margin for error.

Aha! This surely sounds true, but this is also where the oversimplifications I keep bashing get tricky, since the real world gets in the way. Two interrelated points. First, there aren’t that many (if any) tidy switchovers from V1 to V2. In increasing efficiency you may have introduced bugs. To protect against that, you start having to test (more machines for that!), and even when testing says it’s OK to deploy you will start small — deploy to a few machines only, wait, verify. Deploy a bit more. Wait, verify.

THE MISSING VARIABLE: TIME

The process we just described has an important variable that we haven’t looked at so far: time. As in, in the process of doing this, time passes.

Sounds obvious right? But if this is true, it’s also true that as time passes, requirements change. Requirements, here, encapsulating all the factors, external and internal, that go into delivering your service or product. Whatever you’re doing, you’re not dealing with a static entity, but something that is constantly evolving, both because you’re changing it from inside as you update the software, fix bugs, evolve architecture, and deploy new hardware, but also because the external factors are constantly in flux. Likely, you will now have more people using the systems (or, for things that are just APIs, maybe more machines). Or load may have changed due to a feature.

The passage of time here is the critical element missing, and it gets in the way of the ideal scenario in which we allow the system to truly “contract” and use fewer resources in terms of power. By the time you’re sure you can decommission those extra machines, it’s quite possible that you now have other uses for them, and even if you did find a sliver of time to do this you may not have the option, since system growth, or feature changes, or whatever, may be telling you that you will need those extra machines in a week or a month, and therefore taking them down would be simply a waste of time. At Ning, for example, our overall hardware footprint remained largely stable in terms of number of machines, and therefore total power consumed, even as we went from 1 MM registered users to 50 MM and beyond. Setting aside the fact that this took an enormous amount of work and constant vigilance, it also meant that the system that could handle 50 MM users with the same amount of hardware as 1 MM was very different from the original. And throughout that process, many machines would oscillate in their degree of utilization, from the low teens to the high 70s. Over a period of a few years you could take measurements at different points in time that would make us appear either like geniuses or criminally stupid — if all you looked at was utilization, and if “less utilization is bad” was your only guiding principle.

If we can agree that low or high utilization alone is meaningless, and that your utilization will fluctuate, maybe drastically, as you optimize, we can start to ask more appropriate questions. For example: can we do better in terms of releasing idle capacity as efficiency increases? Absolutely, and the widespread use of virtualization in recent years also means that it’s now far easier to have capacity fluctuate along with load. The APIs for system management popularized by public cloud infrastructure companies (e.g. EC2, Rackspace cloud, etc.) have led in the last couple of years to more and more services where capacity is instantiated on demand according to load, leading to a more efficient use of those virtualized resources.
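As a purely hypothetical sketch of what “capacity instantiated on demand according to load” can look like, here is a naive sizing rule; the target number and the rule itself are made up for illustration, not any particular provider’s API:

    TARGET_UTILIZATION = 60.0  # percent; made-up target leaving headroom for spikes and failures

    def desired_fleet_size(current_instances, measured_utilization):
        """Naive autoscaling step: size the fleet so the current load would land
        near the target utilization."""
        return max(1, round(current_instances * measured_utilization / TARGET_UTILIZATION))

    print(desired_fleet_size(5, 90))   # 8 -- scale out under heavy load
    print(desired_fleet_size(10, 12))  # 2 -- release idle capacity when efficiency improves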

Even there, though, we have a problem, for if EC2 allows you to instantiate and tear down a hundred AMIs without giving it a second thought, it’s also necessarily true that there must be an actual hardware footprint doing nothing but waiting around for that to happen. Having people constantly disconnect and reconnect machines is not only not feasible, but in almost every case will involve as much waste as taking a system down, since in the world we live in the capacity we need to run the ever-increasing complexity of Internet infrastructure keeps going up, which means that whatever you stop using today, you’re guaranteed to need tomorrow. With 60% or more of the humans on the planet still not online, there’s clearly still a lot of growth left. For the biggest services, it’s common to have to constantly be deploying new capacity just to account for growth.

This leads us to yet another question, perhaps the last one in this particular chain of thought: Should we in general accept less reliability from online services given that it has real environmental impact, and real cost?

My own answer to this would be a clear NO. I don’t think you can have these systems simultaneously be part of the fabric of society (a point I’ve made before) and have them be “partially reliable,” just like there’s no way to be “partially pregnant.” Reliability is intrinsically tied to the usefulness of these services. Perhaps there are ways in which we can bake in more asynchronous behavior in some cases, but when a lot of what systems do is real-time, 24/7/365 and worldwide, this isn’t something we’ll be able to exploit frequently. We have crossed the Rubicon, so to speak, and have to see this through.

THE CONCLUSION, OR, BELATEDLY MAKING SENSE OF THE TITLE OF THIS POST

Utilization in data centers is an important issue, but talking about it bereft of context is not really that useful. In particular, talking about it without also talking about efficiency, and about all the parameters that go into it (what kinds of applications are running, what the goals are, what the requirements are, etc.), is going to leave us with nothing but incomplete answers, and here incomplete will leave us way too close to incorrect for comfort.

And this isn’t just about the Internet. Is it a valid question to talk about utilization in, say, TV stations? Or other major sources of media, for that matter? Pre-digital music distribution? Sure it is, but it’s not something we focus on, because the use of energy in other media is one or two steps removed from what we see, so it’s easier to ignore even if it’s there all the same. When was the last time you remember a TV station couldn’t broadcast? Are we to believe that they never had a power failure? No. They have backup systems. Those evil-sounding lead batteries or diesel generators.

Context.

By looking at context and simply shifting to the assumption that Internet infrastructure is actually run fairly well, given all the requirements and its rapid evolution, we realize that what we really should be wondering about is what makes us build systems this way, rather than assuming they are not built properly. Should we talk about utilization? Or is this really about what drives it, and therefore utilization is part of the discussion but not the central point?

Channeling Reverend Lovejoy for a moment, we could say, then: “Short answer yes with an if, long answer, no, with a but…”

santa claus conquers the martians

Part 2 of a series (Part 1, Part 3)

PRELUDE

I’ve had a busy week, and have been trying to sit down and put together a followup to my response to the NYT’s article on data centers.

I write the title, and as soon as I do, my mind goes blank. I read the title again. What the hell was I thinking? I am looking at the screen, white space extends below the blinking cursor, mirrored by something somehow stuck in my head, alternating on/off, rumbling lowly like an idling engine: I swear I had a point.

So naturally I start to think that this, perhaps, should be the new title. Which, in the expected recursion path that would follow, naturally ends up in another meta-commentary paragraph (also with a simile close to its ending), which I decide not to write. Recursion upwards, probably to conform with an implicit image of happiness we may or may not feel (or in this case is really quite unwarranted and even more, even worse: unnecessary) but we should generally imply anyway, because these days if you’re not explicitly happy something must be wrong, and therefore it must be fixed. Neutral has become a bad state to be in, apparently, long after being “with us or against us” became a common way to think about nearly everything. No, recursion has no direction except, perhaps, into itself, but it now occurs to me that years of looking at function call stacks have trained me (hopelessly comes to mind, but that’s also not happy) to think of recursion as up or down, rather than, say, horizontally from right to left.

Fascinating, I know.

– oOo –

I will eventually get to Santa Claus and the Martians, but for the moment, back to the article.

The series was titled “The Cloud Factories”, and right there it broadcast, ever so subtly, that it was intended to be something to get worked up about.

“Factory” can mean “the seat of some kind of production” but in this case the weight of the word is in the manufacturing angle. This doesn’t quite feel right, though. A factory is where things are built, sequentially, or at least mostly sequentially, and a cloud is anything but built, and the process is anything but sequential. A cloud emerges, and if we switch to the definite article and the proper noun with all its implications and uppercaseness, it’s also true that The Cloud is an emergent phenomenon. Metaphors are often misapplied, can be incorrect, but it’s not that often that a metaphor involving an overloaded term (“cloud”) is both misapplied and incorrect in the exact same way for nearly all the meanings of the term. This takes some skill.

So, yeah, the point of the title of the series was not to be accurate as an analogy, but to evoke. Specifically, an image. Much like the factory in which they make Itchy & Scratchy cartoons in The Simpsons has chimneys and dark dense smoke coming out of them, as does every factory in The Simpsons, regardless of what it’s for. The “factories” in “The Cloud Factories” seem, intentionally or not (but can this really be unintentional?), to transmit the idea of dirt we associate at a reptilian level with “factory”. Dirt. Pollution. Guilt by association. Then — the title of the article, the first of two so far, drops the subtle imagery: “Power, Pollution and the Internet.” Strangely enough, beyond the title the word “pollution” appears exactly once in the entire article.

Pollution and the Internet. How could one not react to that? What I wrote a week ago was pure reaction, if nothing else to the reactionary tone of the article, but by now I have accumulated enough in my head to maybe add something else to this topic, which, perhaps predictably, has a bit less to do with the contents of the article itself (not that that topic is exhausted by any means) and more to do with one possible way to look at its main thrust through the lens of how we discuss technology nowadays: how we use metaphors and analogies to convey something that we haven't yet internalized, and the factors at play in sustaining a reasonable and reasonably deep conversation in an environment that doesn't lend itself to that. And if all of this in retrospect looks obvious, consider this the admittedly convoluted way in which I am creating a reminder, a mental note: something to pay more attention to.

On to it, then.

ACTION REACTION RETRACTION

Action — argument (paraphrasing, summarizing): “That which powers our online services and more generally the Internet is really a hidden pollution machine run by people fearful of reducing waste, even though the means to do so are readily available.”

Reaction — counterargument (now really summarizing): “Not true.”

That the argument isn’t true may be indeed true, and yet to not just agree with the counterargument because, for example, you respect whoever made it but to understand it requires a degree of experience and training and knowledge that is well beyond what most people could get to because, quite simply, they have their own jobs and lives. Indeed, if it’s not your job and it’s not your life (and for most of the people for whom this is a job, it’s also our life), you really shouldn’t bother. The modern world, and to some degree the very basis of our progress is that we use things that we can’t build, and in many cases can’t even understand. We travel by plane even though many people have no idea how it works, let alone are able to build one.

And that is just fine.

We trust the plane, though, don't we? Well, now we do, but 150 years ago the thought that you could pack tons and tons of baggage and instruments and hundreds of people into a tin can and, by pushing air at unimaginable speed through smaller tin cans bolted to the larger one, get the thing to fly was unpopular indeed.

Bear with me for a minute here. I’m getting somewhere. Promise.

As I was writing a week ago I was typing frantically, and in the process of switching windows I entered "action reaction retraction" into Google. The last result visible before I had to scroll said "Robert H. Goddard. The New York Times.", which seemed intriguing enough, and following it there were notes on a retraction that seemed almost too appropriate. Really? was the thought, so I went to the Times archives and found the quote, but in the process lost the bizarre way in which I had stumbled onto it. I spent almost an hour yesterday, I kid you not, going through the browser's history to see what I'd done, and I still can't remember why I was typing that, except to think that I must have read this before; further googling just for the quote shows that it's been mentioned a few times in the last several years. Sarah Lacy included the quote in her followup, along with her own thoughts regarding an earlier Times story on Tesla Motors, which shows, if not a pattern, at least some concordance of mistakes all going in the same direction, or misdirection.

The quote was a retraction from the Times in which it acknowledged:

“Further investigation and experimentation have confirmed the findings of Isaac Newton in the 17th century and it is now definitely established that a rocket can function in a vacuum as well as in an atmosphere. The Times regrets the error.”

This was triggered by Apollo 11’s flight, when, one presumes, a 50-year-old takedown of rocket pioneer Robert Goddard on the very pages of the Times might have come to their attention:

“That Professor Goddard, with his ‘chair’ in Clark College and the countenancing of the Smithsonian Institution, does not know the relation of action to reaction, and the need to have something better than a vacuum against which to react — to say that would be absurd. Of course he only seems to lack the knowledge ladled out daily in high schools.”

The Times regrets the error. This reminded me of what we could call the case of Catholic Church v. Galileo. At least the Vatican actually apologized to Galileo directly, although in fairness to the Times, it took the Vatican closer to 400 years to get to that point.

The reason I bring up the quote again is that there's a certain tone of mischief detectable in it, since no one can possibly believe that they a) are seriously realizing just now that rockets actually work in a vacuum, and b) think that the way to correct for this is to say that "this confirms the findings of Isaac Newton." Points for whoever wrote it: it was funny.

And just to be clear: this isn't about giving the Times a pass, but about trying to figure out why this seems to be a recurring problem from which the Times is far from exempt, even when we may be inclined to think otherwise.

The question is, then, why would they (the nebulous "they" that is nevertheless actually people, talented as they may be) have originally thought that trashing Goddard, someone with enough credentials to deserve at the very least the benefit of the doubt, was a good idea?

Perhaps because in doing that they were reflecting, ahem, the times — the prevailing sense of what was or wasn’t possible in the age. The “truth” as they saw it, because truth and facts are two different things. To top it off, in this particular case a giant rocket traveling at some 11,000 meters per second was, as an undeniable fact, still very much in the future, but when the rocket was actually up there, actually carrying three people and countless gizmos and measuring devices and chemicals of all kinds, you didn’t have to know anything about physics to realize that there was something to this seemingly crazy idea of rockets in space after all.

Back to trusting airplanes at last: we trust the plane because we see it. We feel, down to our bones, the effort of the engines as it takes off and lands. If someone started to argue that the typical turbine was somewhat wasteful, I don't think I'd be alone in thinking Well, while I'm inside the plane and in the air, I'd prefer a little waste to not being, you know, alive.

So is there something to the idea that, in the popular imagination, not seeing is disbelieving, to invert the well-known dictum?

More importantly, given the complexity and sheer scale of the systems involved in running the Internet, what would it take to “see” when what we’re talking about can’t, ever, actually be seen?

…AND YOU ALWAYS FEAR… WHAT YOU DON’T UNDERSTAND…

That’s a line from Batman Begins, uttered by mob boss Carmine Falcone while he is explaining to a young Bruce Wayne why he should just stop acting all flustered about crime and go home. It’s a critical line not only in the film but in the overall story arc of the trilogy, since within it we find Bruce Wayne’s drive to become Batman. Bruce agrees with Falcone’s thesis but not his solution, decides to understand, disappears into the underworld, and returns, seven years later, as Batman.

Understanding — not fearing — takes knowledge, and knowledge takes a long time and effort to develop.

Convincing people that rockets could fly in space “only” required that we actually fly a rocket in space. What would be the equivalent for getting people to accept that how data centers are run is not some perennial exercise in waste, where secret gerbils run mindlessly within wheels, most of the time doing nothing at all, wasting energy and in the process laying waste to the planet as well? Well, one way would surely be getting everyone to spend the equivalent of Wayne’s “seven years in the underworld,” which in this case would mean not only getting a degree in computer science but also spending a good amount of time down in the trenches, seeing firsthand how these things are actually run.

That this is an impractical solution, since we can’t have the whole planet get a CS degree or work in a data center, is obvious. It leaves us with the alternative of using analogies and metaphors to express what people still haven’t internalized, and probably will never be able to internalize, in the way that they have the concept of a rocket or an airplane. Before planes flew, the idea of them also had to be wrapped in analogies and metaphors, usually involving birds. The concept of a factory would have undoubtedly required some heavy analogies to be explained to people in, say, the 16th century. We grasp at something that is known to make the unknown intelligible.

The analogies we choose matter, however. A lot. Which is why I keep talking about planes not factories. A modern commercial jet is a much more apt analogy for the type of “waste” involved in running a modern data center.

There is waste and pollution involved in running a jet, as anyone can plainly see. Sometimes the waste is obvious (empty seats), sometimes it’s not (unnecessary circuitry), but generally people don’t doubt that the good people at Boeing et al. are always doing their damnedest to make the plane as efficient, safe, and effective as possible. The same is true of Internet infrastructure.

WHERE WE FINALLY GET BACK TO SANTA CLAUS CONQUERING MARTIANS

You may or may not agree with the plane analogy; there may be better ones; there are more things to discuss; and there is certainly a need for us in the industry to engage more broadly and try to explain what’s going on, at least for as long as not everyone in the world has a CS degree (a man can dream).

So for all the faults I could find with the article, I think it was good that it triggered the conversation, and herein lies our second conundrum.

This “conversation” — it will require effort to be carried out.

A brief detour: I was reading “Days of Rage,” a commentary in the latest issue of The New Yorker, which references Santa Claus Conquers the Martians while talking about the “Muslim Rage” of recent days over a YouTube video no one had actually seen, certainly not before the protests. I agree with a lot of the article, except on one point:

“The uproar over “Innocence of Muslims” matters not because of the deep pathologies it has supposedly laid bare but because of the way the film went viral.”

Psy’s “Gangnam Style” was viral. This video wasn’t. If anything, from what we know, it seems to be quite the opposite of viral, since apparently it was simply an excuse used by people in power to rile up the unhappy (there’s that word again) masses so they could have something to do: “Angry? Unemployed? Bored? Feel you have no future? Here, go burn an embassy.” And how irrationally angry you have to be to somehow find that looting and burning and killing either solves a problem, or makes up for anything, or is even, just, a remotely justified way to react. How displaced you have to be from yourself, and how disconnected from what surrounds you. I can only hypothesize. At points in my life I’ve had little or no money but never felt in a way that would ever lead me to react like that. Not that this is about money, I know; it’s just one of the factors (probably), but one I can try to relate through. But I digress.

SCCTM is indeed an actual movie, and the reason I bring it up is that I had seen it years ago in an MST3K episode. Remembering that, it occurred to me that what happened in the Middle East was a more extreme, perhaps the most extreme, version of a pervasive phenomenon: that of reacting to our perception of something rather than to the thing itself.

Mind you, this isn’t one of those “things were better in my time” type of arguments. While there was a time decades ago when in-depth roundtables in media were more common fare, this happened in an environment in which the amount of raw data to process was far, far less than it is now. We are overwhelmed by data but lacking in information. This isn’t a matter of access to technology, either. I’d bet a lot of the people doing the burning and killing in Benghazi had cellphones. We all do.

This, deep in the weeds of this post (essay?), is what triggered the topic in my head. The end of the chain of associations: that what we’re often doing these days to handle all the information we’re exposed to is tantamount to MST3K dispensing with the actual viewing of the movie and simply skipping to the part where they make fun of it. It wouldn’t be the same, would it? Context is critical, but we react in soundbites and generate storms of controversy over a few words which can’t possibly have context attached, because there’s simply no space for it, anywhere.

Twitter and to some degree Facebook are often blamed, unfairly I think, for a supposed devolution of our society into people trapping their thoughts in contextless cages 140 characters in size. I don’t think there’s any question, though, that we humans are and have always been lazy when we can get away with it, and that the deluge of information leaves us with little time to reflect on it, so the mind recoils and defends itself with quips and short bursts, and Twitter (and Facebook) are a good mechanism for that. It just so happens that this constant superficial jumping around topics is both a) effective as a dopamine release mechanism (read: addictive) and b) the perfect way of thinking of yourself as informed and on top of everything while truly involved in nothing. Why isn’t Twitter or Facebook to blame, then? Let me give you a Twitterless example: a sad advertisement on TV, people starving, a catastrophe somewhere. Text a number and give $3. Done. Back to watching Jersey Shore, or 60 Minutes, or whatever.

Twitter, Facebook, all of them, are not the proximate cause. They are an effect. A reaction.

The environment we live in has fundamentally changed because there is readily available, quite simply, more data about everything, a large part of which is a barrage of trivia and gossip — which is to be expected since they are, ahem, trivial to generate. If Lindsay Lohan having a traffic accident is enough to generate massive news coverage and the cascade of reaction that follows, topics that are deeper and more complex and are more difficult to grasp will find it hard to compete.

It’s something new, or relatively new in historical terms, and I don’t think we know how to handle this deluge yet. We are drinking from a seemingly limitless flood of information but we haven’t yet figured out how to close the faucet every once in a while. We don’t necessarily drown in it but this flood that is constantly rushing around us leaves us with no time to reflect on any one point.

Information overload! Pfft. This isn’t a new idea! I bring it up because I think we are increasingly using (and creating) media suited to how we are trying to deal with it, and the edifice we construct with all of it is not well optimized to transmit complex ideas (this, also, is not at all original), so it seems critical that we work hard at finding the right metaphors and analogies, the right tools, to talk about how the machinery of the Internet works. Tools and machinery, here, somewhat ironically encapsulating the point.

AND NOW FOR THE SURPRISINGLY SUCCINCT CONCLUSION

Analogies matter, metaphors matter, and we need to find better ones to talk about what the Internet is (for example, a “global village” it is not, and this term has luckily fallen by the wayside, but the many reasons why will have to wait for another time). We also have to contend with a shifting media environment in which a conversation like this can all too easily get lost in the noise, not because, as a cynical interpretation would have it, people only care about Snooki or the Kardashians or whatever, but because until we figure out how to live and engage with complexity while soaking in data there will be only surface and precious little depth.

And if there’s an additional meta-point to “Power, Pollution and the Internet,” something else that is important beyond the specifics in the article, it is that we as an industry have left a void that can be filled with anything, and if we don’t engage and try to make what we do more comprehensible for everyone who, rightly, doesn’t have the time to understand it because they’re busy running the rest of the world, then we in the industry will have no one to answer for it but ourselves.

Part 2 of a series (Part 1, Part 3)
