diego's weblog

there and back again

Monthly Archives: September 2012

santa claus conquers the martians

Part 2 of a series (Part 1, Part 3)

PRELUDE

I’ve had a busy week, and have been trying to sit down and put together a followup to my response to the NYT’s article on data centers.

I write the title, and as soon as I do, my mind goes blank. I read the title again. What the hell was I thinking? I am looking at the screen, white space extends below the blinking cursor, mirrored by something somehow stuck in my head, alternating on/off, rumbling lowly like an idling engine: I swear I had a point.

So naturally I start to think that this, perhaps, should be the new title. Which, in the expected recursion path that would follow, naturally ends up in another meta-commentary paragraph (also with a simile close to its ending), which I decide not to write. Recursion upwards, probably to conform with an implicit image of happiness we may or may not feel (or in this case is really quite unwarranted and even more, even worse: unnecessary) but we should generally imply anyway, because these days if you’re not explicitly happy something must be wrong, and therefore it must be fixed. Neutral has become a bad state to be in, apparently, long after being “with us or against us” became a common way to think about nearly everything. No, recursion has no direction except, perhaps, into itself, but it now occurs to me that years of looking at function call stacks have trained me (hopelessly comes to mind, but that’s also not happy) to think of recursion as up or down, rather than, say, horizontally from right to left.

Fascinating, I know.

– oOo –

I will eventually get to Santa Claus and the Martians, but for the moment, back to the article.

The series was titled “The Cloud Factories”, and right there it broadcast ever-so-subtly that it was intended to be something to get worked up about.

“Factory” can mean “the seat of some kind of production” but in this case the weight of the word is in the manufacturing angle. This doesn’t quite feel right, though. A factory is where things are built, sequentially, or at least mostly sequentially, and a cloud is anything but built, and the process is anything but sequential. A cloud emerges, and if we switch to the definite article and the proper noun with all its implications and uppercaseness, it’s also true that The Cloud is an emergent phenomenon. Metaphors are often misapplied, can be incorrect, but it’s not that often that a metaphor involving an overloaded term (“cloud”) is both misapplied and incorrect in the exact same way for nearly all the meanings of the term. This takes some skill.

So, yeah, the point of the title of the series was not to be accurate as an analogy, but to evoke. Specifically, an image. Much like the factory in which they make Itchy & Scratchy cartoons in The Simpsons has chimneys and dark dense smoke coming out of them, as does every factory in The Simpsons, regardless of what it’s for. The “factories” in “The Cloud Factories” seem, intentionally or not (but can this really be unintentional?), to transmit the idea of dirt we associate at a reptilian level with “factory”. Dirt. Pollution. Guilt by association. Then — the title of the article, the first of two so far, drops the subtle imagery: “Power, Pollution and the Internet.” Strangely enough, beyond the title the word “pollution” appears exactly once in the entire article.

Pollution and the Internet. How could one not react to that? What I wrote a week ago was pure reaction, if nothing else to the reactionary tone of the article, but by now I have accumulated enough in my head to maybe add something else to this topic, which, perhaps predictably, has a bit less to do with the contents of the article itself (not that that topic is exhausted by any means) and more to do with one possible way to look at its main thrust: through the lens of discourse on technology nowadays, how we use metaphors and analogies to convey something that we haven’t yet internalized, and the factors at play in sustaining a reasonable and reasonably deep conversation in an environment that doesn’t lend itself to that. And if all of this in retrospect looks obvious, consider this the admittedly convoluted way in which I am creating a reminder, a mental note: something to pay more attention to.

On to it, then.

ACTION REACTION RETRACTION

Action — argument (paraphrasing, summarizing): “That which powers our online services and more generally the Internet is really a hidden pollution machine run by people fearful of reducing waste, even though the means to do so are readily available.”

Reaction — counterargument (now really summarizing): “Not true.”

That the argument isn’t true may indeed be true, and yet to not just agree with the counterargument because, for example, you respect whoever made it, but to actually understand it, requires a degree of experience and training and knowledge that is well beyond what most people could get to because, quite simply, they have their own jobs and lives. Indeed, if it’s not your job and it’s not your life (and for most of the people for whom this is a job, it’s also our life), you really shouldn’t bother. The modern world, and to some degree the very basis of our progress, is that we use things that we can’t build, and in many cases can’t even understand. We travel by plane even though many people have no idea how it works, let alone are able to build one.

And that is just fine.

We trust the plane, though, don’t we? Well, now we do, but 150 years ago the thought that you could pack tons and tons of baggage and instruments and hundreds of people into a tin can and by pushing air at unimaginable speed through smaller tin cans attached to the larger tin can with bolts you would get the thing to fly was unpopular indeed.

Bear with me for a minute here. I’m getting somewhere. Promise.

As I was writing a week ago I was typing frantically, and in the process of switching windows I entered “action reaction retraction” into Google, and the last result visible before I had to scroll said “Robert H. Goddard. The New York Times.”, which seemed intriguing enough, and following it there were notes on a retraction that seemed almost too appropriate. Really? was the thought, so I went to the Times archives and found the quote, but in the process lost the bizarre way in which I stumbled on to it. I spent almost an hour yesterday, I kid you not, going through the browser’s history to see what I’d done, and I still can’t remember why I was typing that except to think that I must have read this before, and further googling just for the quote shows that it’s been mentioned a few times in the last several years. Sarah Lacy included the quote in her followup, along with her own thoughts regarding an earlier Times story on Tesla Motors, which shows if not a pattern at least some concordance of mistakes all going in the same direction, or misdirection.

The quote was a retraction from the Times in which it acknowledges:

“Further investigation and experimentation have confirmed the findings of Isaac Newton in the 17th century and it is now definitely established that a rocket can function in a vacuum as well as in an atmosphere. The Times regrets the error.”

This was triggered by Apollo 11’s flight, when, one presumes, a 50-year-old takedown of rocket pioneer Robert Goddard on the very pages of the Times might have come to their attention:

“That Professor Goddard, with his ‘chair’ in Clark College and the countenancing of the Smithsonian Institution, does not know the relation of action to reaction, and the need to have something better than a vacuum against which to react — to say that would be absurd. Of course he only seems to lack the knowledge ladled out daily in high schools.”

The Times regrets the error. This reminded me of what we could call the case of Catholic Church v. Galileo. At least the Vatican actually apologized to Galileo directly, although in fairness to the Times, it took the Vatican closer to 400 years to get to that point.

The reason I bring up the quote again is that there’s a certain tone of mischief detectable in it, since no one can possibly believe that they are seriously a) realizing just now that rockets actually work in a vacuum and b) that the way to correct for this is to say that “this confirms the findings of Isaac Newton.” Points for whoever wrote it: it was funny.

And just to be clear: this isn’t about giving the Times a pass, but about trying to figure out why this seems to be a recurring problem, one from which the Times is far from exempt even when we may be inclined to think otherwise.

The question is, then: why would they, the nebulous they that is nevertheless actually people, talented as they may be, have originally thought that trashing Goddard, someone with enough credentials to presumably merit at the very least the benefit of the doubt, was a good idea?

Perhaps because in doing that they were reflecting, ahem, the times — the prevailing sense of what was or wasn’t possible in the age. The “truth” as they saw it, because truth and facts are two different things. To top it off, in this particular case a giant rocket traveling at some 11,000 meters per second was, as an undeniable fact, still very much in the future, but when the rocket was actually up there, actually carrying three people and countless gizmos and measuring devices and chemicals of all kinds, you didn’t have to know anything about physics to realize that there was something to this seemingly crazy idea of rockets in space after all.

Back to trusting airplanes at last: We trust the plane because we see it. We feel, down to our bones, the effort of the engines as it takes off and lands. If someone started to argue that the typical turbine was somewhat wasteful, I don’t think I’d be alone in thinking Well, while I’m inside the plane and in the air, I’d prefer a little waste to not being, you know, alive.

So is there something to the idea that, in the popular imagination, not seeing is disbelieving, to invert the well-known dictum?

More importantly, given the complexity and sheer scale of the systems involved in running the Internet, what would it take to “see” when what we’re talking about can’t, ever, actually be seen?

…AND YOU ALWAYS FEAR… WHAT YOU DON’T UNDERSTAND…

That’s a line from Batman Begins, uttered by mob boss Carmine Falcone while he is explaining to a young Bruce Wayne why he should just stop acting all flustered about crime and go home. It’s a critical line not only in the film but in the overall story arc of the trilogy, since within it we find Bruce Wayne’s drive to become Batman. Bruce agrees with Falcone’s thesis but not his solution, decides to understand, disappears into the underworld, then returns, seven years later, as Batman.

Understanding — not fearing — takes knowledge, and knowledge takes a long time and effort to develop.

Convincing people that rockets could fly in space “only” required that we actually fly a rocket in space. What would be the equivalent for getting people to accept that how data centers work is not some perennial waste, where secret gerbils run mindlessly within wheels, most of the time doing nothing at all, wasting energy and in the process laying waste to the planet as well? Well, one way would surely be getting everyone to spend the equivalent of Wayne’s “seven years in the underworld,” which in this case would mean not only getting a degree in computer science but also spending a good amount of time down in the trenches, seeing firsthand how these things are actually run.

That this is an impractical solution, since we can’t have the whole planet get a CS degree or work in a data center, is obvious. It leaves us with the alternative of using analogies and metaphors to express what people still haven’t internalized, and probably will never be able to internalize, in the way that they have the concept of a rocket or an airplane. Before planes flew, the idea of them also had to be wrapped in analogies and metaphors, usually involving birds. The concept of a factory would have undoubtedly required some heavy analogies to be explained to people in, say, the 16th century. We grasp at something that is known to make the unknown intelligible.

The analogies we choose matter, however. A lot. Which is why I keep talking about planes not factories. A modern commercial jet is a much more apt analogy for the type of “waste” involved in running a modern data center.

There is waste and pollution involved in running a jet, as anyone can plainly see. Sometimes the waste is obvious (empty seats), sometimes it’s not (unnecessary circuitry), but generally people don’t doubt that the good people at Boeing et al. are always doing their damnedest to make the plane as efficient, safe, and effective as possible. The same is true of Internet infrastructure.

WHERE WE FINALLY GET BACK TO SANTA CLAUS CONQUERING MARTIANS

You may or may not agree with the plane analogy, there may be better ones, there are more things to discuss and there certainly is a need for us in the industry to engage more broadly and try to explain what’s going on as long as everyone in the world doesn’t have a CS degree (a man can dream).

So for all the faults I could find with the article, I think it was good that it triggered the conversation, and herein lies our second conundrum.

This “conversation” — it will require effort to be carried out.

A brief detour: reading “Days of Rage,” a commentary in the latest issue of The New Yorker, which references Santa Claus Conquers The Martians while talking about the “Muslim Rage” of recent days over a YouTube video no one had actually seen, certainly not before the protests. I agree with a lot of the article, except on one point:

“The uproar over “Innocence of Muslims” matters not because of the deep pathologies it has supposedly laid bare but because of the way the film went viral.”

Psy’s “Gangnam Style” was viral. This video wasn’t. If anything, from what we know, it seems to be quite the opposite of viral, since apparently it was simply an excuse used by people in power to rile up the unhappy (there’s that word again) masses so they could have something to do: “Angry? Unemployed? Bored? Feel you have no future? Here, go burn an embassy.” And how irrationally angry you have to be to somehow find that looting and burning and killing either solves a problem or makes up for anything or is even, just, a remotely justified way to react. How displaced you have to be from yourself and disconnected from what surrounds you. I can hypothesize, only. At points in my life I’ve had little or no money but never felt in a way that would ever lead me to react like that. Not that this is about money, I know, it’s just one of the factors (probably), but one that I can try to relate through. But I digress.

SCCTM is indeed an actual movie, and the reason I bring it up is that I had seen it years ago in an MST3K episode, and remembering that, it occurred to me that what happened in the Middle East was a more extreme, perhaps the most extreme, version of a pervasive phenomenon: that of reacting to our perception of something rather than to the thing itself.

Mind you, this isn’t one of those “things were better in my time” type of arguments. While there was a time decades ago when in-depth roundtables in media were more common fare, this happened in an environment in which the amount of raw data to process was far, far less than it is now. We are overwhelmed by data but lacking in information. This isn’t a matter of access to technology, either. I’d bet a lot of the people doing the burning and killing in Benghazi had cellphones. We all do.

This, deep in the weeds of this post (essay?), is what triggered the topic in my head. The end of the chain of associations: that what we’re often doing these days to handle all the information that we’re exposed to would be tantamount to MST3K dispensing with the actual viewing of the movie and simply skipping to the part where we make fun of it. It wouldn’t be the same, would it? Context is critical, but we react in soundbites and generate storms of controversy over a few words which can’t possibly have context attached, because there’s simply no space for it, anywhere.

Twitter and to some degree Facebook are often blamed, unfairly I think, for a supposed devolution of our society into people trapping their thoughts into contextless cages 140 characters in size. I don’t think there’s any question, though, that we humans are and have always been lazy if we can get away with it, and that the deluge of information leaves us with little time to reflect on it, so the mind recoils and defends itself with quips and short bursts, and Twitter (and Facebook) are a good mechanism for that. It just so happens that this constant superficial jumping around topics is both a) effective as a dopamine release mechanism (read: addictive) and b) the perfect way of thinking of yourself as informed and on top of everything while truly involved in nothing. Why isn’t Twitter or Facebook to blame, then? Let me give you a Twitterless example: sad advertisement on TV, people starving, a catastrophe somewhere. Text a number and give $3. Done. Back to watching Jersey Shore, or 60 Minutes, or whatever.

Twitter, Facebook, all of them, are not the proximate cause. They are an effect. A reaction.

The environment we live in has fundamentally changed because there is readily available, quite simply, more data about everything, a large part of which is a barrage of trivia and gossip — which is to be expected since they are, ahem, trivial to generate. If Lindsay Lohan having a traffic accident is enough to generate massive news coverage and the cascade of reaction that follows, topics that are deeper and more complex and are more difficult to grasp will find it hard to compete.

It’s something new, or relatively new in historical terms, and I don’t think we know how to handle this deluge yet. We are drinking from a seemingly limitless flood of information but we haven’t yet figured out how to close the faucet every once in a while. We don’t necessarily drown in it but this flood that is constantly rushing around us leaves us with no time to reflect on any one point.

Information overload! Pfft. This isn’t a new idea! I bring it up because I think that we are increasingly using (creating) media suited to how we are trying to deal with the deluge, and the edifice we construct with all of it is not well optimized to transmit complex ideas (this, also, is not at all original), so it seems critical that we work hard at finding the right metaphors and analogies, the right tools to talk about how the machinery of the Internet works. Tools and machinery, here, somewhat ironically encapsulating the point.

AND NOW FOR THE SURPRISINGLY SUCCINCT CONCLUSION

Analogies matter, metaphors matter, and we need to find better ones to talk about what the Internet is (for example, a “global village” it is not, and this term has luckily fallen by the wayside, but the many reasons why will have to wait for another time). We also have to contend with a shifting media environment in which a conversation like this can get all too easily lost in the noise, not because, as a cynical interpretation would have it, people only care about Snooki or the Kardashians or whatever, but because until we figure out how to live and engage with complexity while soaking in data there will be only surface and precious little depth.

And if there’s an additional meta-point to “Power, Pollution and the Internet,” something else that is important beyond the specifics in the article, it is that we as an industry have left a void that can be filled with anything, and if we don’t engage and try to make what we do more comprehensible for everyone who, rightly, doesn’t have the time to understand it because they’re busy running the rest of the world, then we in the industry have no one to answer for it but ourselves.

Part 2 of a series (Part 1, Part 3)

a shocking new way to get google maps on iOS 6

  • Step 1: Visit maps.google.com.
  • Step 2: (optional) save a shortcut to the home screen.

Hm.

PS: Yes, defaults matter, but the native app was never that much better than the web app.

a lot of lead bullets: a response to the new york times article on data center efficiency

Part 1 of a series (Part 2, Part 3)

Note: This is a 5,000-word post (!) in response to a 5,000-word article, since I thought it necessary to go beyond the usual “that’s wrong”. A detailed argument requires a detailed counter-argument, but, still, apologies for the length.

As I was reading this New York Times article on data centers and power use, there was mainly one word stretching, Doppler-like, through my head: “Nooooooo!”

Not because the article exposed some secret that everyone who’s worked on websites at scale knows, and this intrepid reporter was blowing the lid off our quasi-masonic-illuminati conspiracy. Not because there was information in it that was in any way shocking.

The reason I was yelling in my head was that I could see, clear as day, how people who don’t know what’s involved in running large scale websites would take this article. Just look at the comments section.

The assertions made in it essentially paint our engineers and operations people as a bunch of idiots who are putting together rows and rows of boxes in data centers and not caring what this costs their businesses, nay, the planet.

And nothing could be further from the truth.

There is one thing that the article covers that is absolutely true: data centers consume a hell of a lot of power. Sadly, the rest is a mix of half-guesses, contradictions, and flat-out incorrect information that creates all the wrong impressions, misinforms, and misrepresents the efforts and challenges that the people running these systems face every day. In the process, the article manages to talk to precious few people who are really in a position to know and explain what’s going on. In fact, when I say precious few, I mean one: Jeff Rothschild, from Facebook, and instead of asking him questions on data centers the article just uses one amusing anecdote from Facebook’s early days that makes eng/ops look like a bunch of monkeys running around with fans.

This isn’t just an incredibly inaccurate representation of the dedication and hard work of eng/ops everywhere in the computer industry; I know for a fact it’s also inaccurate with regard to Facebook itself. I imagine Facebook engineers (and those of any other website, really) reading this article, thinking about the times they’ve been woken up in the middle of the night to solve problems that no one has ever faced before, for which no one has trained them, because no university course and no amount of research prepares you for the challenges of running a service at high scale, and having to solve all that as fast as possible, regardless of whether it’s about making sure that someone can run their business, do their taxes, or that a kid halfway around the world can upload their video of a cat playing the piano.

Before I continue, let me say that, even if I am (clearly) a bit miffed, I respect the efforts of the reporter, even with the inadequate sources that the story quotes, and there is an important story here, but it’s not the one he focused on. The question is not whether there’s inefficiency, or rather, under-utilization of power. There’s some, but not as much as claimed (the main figure the article quotes, that data centers, or DCs for short, run at 6-12% utilization, is just completely made up by consultants, and I can’t possibly imagine how they arrived at it).

The question is why.

Sure, there’s some inefficiency. But why? There are many reasons, but before I get to them, let me spend some time on the problems in the article itself beyond this central issue.

The problems in the article

Let me go through the biggest issues in the article and debunk them. To begin:

Energy efficiency varies widely from company to company. But at the request of The Times, the consulting firm McKinsey & Company analyzed energy use by data centers and found that, on average, they were using only 6 percent to 12 percent of the electricity powering their servers to perform computations. The rest was essentially used to keep servers idling and ready in case of a surge in activity that could slow or crash their operations.

First off, an “average,” as any statistician will tell you, is a fairly meaningless number if you don’t include other properties of the population (starting with the standard deviation). Not to mention that this kind of “explosive” claim should be backed up with a description of how the study was made. The only thing mentioned about the methodology is that they “sampled about 20,000 servers in about 70 large data centers spanning the commercial gamut: drug companies, military contractors, banks, media companies and government agencies.” Here’s the thing: Google alone has more than a million servers. Facebook, too, probably. Amazon, as well. They all do wildly different things with their servers, so extrapolating from “drug companies, military contractors, banks, media companies, and government agencies” to Google, or Facebook, or Amazon, is just not possible on the basis of 20,000 servers in 70 data centers.
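To make the statistical point concrete, here’s a minimal sketch in Python (the utilization numbers are invented, purely for illustration): two server fleets can report the exact same “average utilization” while looking nothing alike.

    import statistics

    # Hypothetical utilization percentages for two fleets of servers.
    # Both average out to the same number but describe very different realities.
    steady_fleet = [9, 10, 8, 9, 10, 8, 9, 9]  # everything hovering near 9%
    spiky_fleet = [1, 1, 2, 1, 35, 1, 2, 29]   # mostly idle, a few very hot

    for name, fleet in [("steady", steady_fleet), ("spiky", spiky_fleet)]:
        print(name,
              "mean:", round(statistics.mean(fleet), 1),
              "stdev:", round(statistics.stdev(fleet), 1))

Both fleets come out to a mean of 9%, but the spiky fleet’s standard deviation is more than an order of magnitude larger; quoting the mean alone, as the article does, erases exactly the difference that matters.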

Not possible, that’s right. It would have been impossible (and people that know me know that I don’t use this word lightly) for McKinsey & Co. to do even a remotely accurate analysis of data center usage for the industry to create any kind of meaningful “average”. Why? Because gathering this data and analyzing it would have required many of the top minds in data center scaling (and they are not working at McKinsey); because Google, Facebook, Amazon, and Apple would not have given McKinsey this information; and because the information, even if it had been given to McKinsey, would have been in wildly different scales and contexts, which is an important point.

Even if you get past all of these seemingly insurmountable problems through an act of sheer magic, you end up with another problem altogether: server power is not just about “performing computations”. If you want to simplify a bit, there are at least four main axes you could consider for scaling: computation proper (e.g. adding 2+2), storage (e.g. saving “4” to disk, or reading it from disk), networking (e.g. sending the “4” from one computer to the next) and memory usage (e.g. storing the “4” in RAM). This is an over-simplification because today you could, for example, split up “storage” into “flash-based” and “magnetic” storage since they are so different in their characteristics and power consumption, just like we separate RAM from persistent storage, but we’ll leave it at four. Anyway, these four parameters lead to different load profiles for different systems.

The load profile of a system can tell you its primary function in abstract terms. Machines in data centers are not used homogeneously. Clusters of them may be primarily used for computation, other clusters for storage, others have a mixed load. For example, a SQL database cluster will generally use all four heavily, while a cluster that serves as a memory cache would only use RAM and network heavily. As an aside, network is important in terms of power consumption not in and of itself, since running a network card is a rounding error in terms of power consumption, but because running a big network infrastructure requires switches in heavily redundant configurations that actually do count, not to mention the fact that a bigger network equals more complexity that has to be managed, but we’ll get to that in more detail later.
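To make “load profile” a bit more tangible, here’s a minimal sketch (the cluster names and numbers are invented for illustration): think of each cluster as a vector over those four axes.

    from dataclasses import dataclass

    @dataclass
    class LoadProfile:
        """Relative load on each scaling axis, 0.0 (idle) to 1.0 (saturated)."""
        cpu: float      # computation proper
        storage: float  # disk reads and writes
        network: float  # bytes in and out
        memory: float   # RAM footprint

    # Invented example profiles matching the descriptions above:
    profiles = {
        "sql-database": LoadProfile(cpu=0.8, storage=0.9, network=0.7, memory=0.8),
        "memory-cache": LoadProfile(cpu=0.1, storage=0.0, network=0.8, memory=0.95),
    }

    for name, p in profiles.items():
        # Judging by CPU alone (i.e. "performing computations") misses most
        # of what some clusters are actually doing.
        print(f"{name}: cpu={p.cpu:.0%}, memory={p.memory:.0%}")

A memory cache running exactly as intended barely touches CPU or disk, so any measure built around “electricity used to perform computations” will score it as nearly idle even though it’s doing precisely its job.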

I don’t doubt that McKinsey got a lot of numbers from a lot of companies and then mashed them together to create an average. I’m just saying that a) it’s impossible to have the numbers measure the exact same thing across so many different companies that are not coordinating with each other, and b) that the average is therefore of limited value since the load profile of all of these services is so wildly different that if you don’t have accurate data for, say, Google, your result is meaningless.

1. Right from the start, one of the primary legs on which the article stands is either incorrect, or meaningless, or both.

Moving on.

A server is a sort of bulked-up desktop computer, minus a screen and keyboard, that contains chips to process data.

No, that’s not what a server is. It is not a “bulked-up desktop”. In fact, the vast majority of servers these days are probably as powerful as a typical laptop, minus battery, display, graphics card and such. And in many cases the physical “server” doesn’t even exist since everyone doing web at scale makes extensive use of virtualization, either by virtualizing at the OS level and running multiple virtual machines (in which case, yes, perhaps that one machine is bigger than a desktop, but it runs several actual server processes in it) or distributing the processing and storage at a more fine-grained level (MapReduce would be an example of this). There’s no longer a 1-1 correlation between “server” and “machine,” and, increasingly, “servers” are being replaced by services.

The reason this seemingly minor thing matters is that it creates an image that a data center is readily comprehensible. Just a lot of “bulked up desktops.” I understand the allure of this analogy, but it’s not true, and it creates the perception that scaling up or down is just a matter of adding or removing these boxes. Sounds easy, right? Just use them more efficiently!

2. Servers are not “bulked up desktops,” and this is critical because infrastructure isn’t just a lot of bricks that you pile on top of each other. There’s no neat dividing line. You can’t just say “use fewer servers” to increase efficiency, which is what the incorrect analogy leads to.

Next!

“This is an industry dirty secret, and no one wants to be the first to say mea culpa,” said a senior industry executive who asked not to be identified to protect his company’s reputation. “If we were a manufacturing industry, we’d be out of business straightaway.”

Say what? That infrastructure is inefficient is a dirty secret? First off, inefficiency is not a secret, and it’s not “dirty.” Maybe Google or Facebook don’t publish papers talking about their utilization, but the distance between not shouting this from the rooftops and this being a “dirty secret” is indeed long.

This statement, from an anonymous source no less, matters because it creates the sense in the article that the industry is operating in the shadows, trying to hide a problem that, “if only people knew,” would create a huge issue. Nothing could be further from the truth. There are hundreds of conferences and gatherings a year, open to the public, where the people that run services get together to discuss these problems, solutions, and advances. Everyone I know talks about how to make things better, and, without divulging company secrets, exchanges information on how to solve issues that we face. There are ACM and IEEE publications (to name just two organizations), papers, and conferences. The reason it appears hidden is that it’s just one of the many areas that only interest the people involved, primarily nerds. It’s arcane, perhaps, but not hidden.

3. This isn’t any kind of “industry dirty secret.” This statement only helps in making this appear to be part of some conspiracy which doesn’t exist and papers over the real issues by shifting attention to the supposed people who keep this “dirty secret.” That is, it seems to identify a group of people at which we can point our proverbial pitchforks, and nothing else.

Next!

Even running electricity at full throttle has not been enough to satisfy the industry. In addition to generators, most large data centers contain banks of huge, spinning flywheels or thousands of lead-acid batteries — many of them similar to automobile batteries — to power the computers in case of a grid failure as brief as a few hundredths of a second, an interruption that could crash the servers.

“It’s a waste,” said Dennis P. Symanski, a senior researcher at the Electric Power Research Institute, a nonprofit industry group. “It’s too many insurance policies.”

The first paragraph in this quote seems to imply that batteries (“lead-acid batteries,” which sounds more ominous than just “batteries”) and “spinning flywheels” are there because the industry is “not satisfied with running electricity at full throttle”.

To begin — how do you run electricity at less than “full throttle”? Is there some kind of special law of physics that I’m not aware of that lets you use half an electron? If you need 2 amps at 110 volts, that’s what you need. If you need half, you use half. There’s no “full throttle.” A system draws the power it needs, no more, no less, to run its components at a particular point in time. Could you build machines that are more efficient in their use of power? Of course, and people are working on that constantly. Google, Facebook, Amazon, all spend hundreds of millions of dollars a year running data centers, and a big chunk of that is power. It’s a cost center that people are always trying to reduce.
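For the record, the arithmetic the “full throttle” phrasing ignores is nothing more than this (the 2 amps and 110 volts are the figures from the paragraph above):

    # Power draw is determined by the load, not by a throttle:
    # watts = volts * amps, at a given point in time.
    volts = 110
    amps = 2
    watts = volts * amps
    print(watts)  # 220 W: the components draw what they need, no more, no less

A machine whose components need 220 W draws 220 W; there is no half-open valve to close.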

Then there’s the money quote from Mr. Symanski, someone I’ve never heard of, saying that “It’s a waste. It’s too many insurance policies.”

Really? Let’s look at what happens when a data center goes offline. How many the-sky-is-falling articles have been written when significant services of the Internet are affected? A ton, of course, many of them in The New York Times itself. Too many insurance policies? Not true. Managing these systems is incredibly complex. Eng/ops people don’t deploy systems for the fun of it. We do this because it’s required, and, if anything, we do less than we know we should because there’s never enough money or people or time to deploy the right solution, so we make do with what we can and make up for the difference with a lot of hard work.

4. It’s not “too many insurance policies.” Redundant systems in data centers aren’t perfect by any stretch of the imagination, but the article strongly implies that this is being done willfully, flying in the face of evidence that says it’s unnecessary. This is flat-out not true.

These next two paragraphs are just incredible in how they distort information:

A few companies say they are using extensively re-engineered software and cooling systems to decrease wasted power. Among them are Facebook and Google, which also have redesigned their hardware. Still, according to recent disclosures, Google’s data centers consume nearly 300 million watts and Facebook’s about 60 million watts.

It quotes essentially unnamed sources at Facebook and Google (or their PR people) in saying that they are using software and cooling systems to decrease wasted power, and then it goes on to say that they still consume a lot of power, creating the impression that they are not really solving the problem. Here’s the thing though: if you built a more efficient rocket to go to the moon, it would still consume a lot of power. No one would be shocked to learn that, would they?

But there’s more!

Many of these solutions are readily available, but in a risk-averse industry, most companies have been reluctant to make wholesale change, according to industry experts.

This is probably the one paragraph in the article that sent my head spinning. It makes it appear as if there are “solutions” that everyone knows about but no one uses because we’re a “risk-averse” industry.

What “solutions”? Who is “risk-averse”? Which companies are “reluctant to make wholesale change”? Who are the “industry experts” that assert this is the case?

This last paragraph is, quite simply, flat-out factually wrong. There are no “solutions” that are “readily available.” There simply aren’t. The article later quotes some efforts around higher utilization, from companies working in the area, implying that if everyone just did whatever these people are doing, everything would be better. And I say “whatever these people are doing” to capture the flavor of the article, since what they are doing is not magic; they just seem to be more efficient at queuing and batching processing. The problem is that not everything can be efficiently batched, certainly not when your usage isn’t dictated by your own schedule but by end users around the world, and each particular use case requires its own solutions. We’re not dealing with one problem but many, but I’ll get to that in a moment.

5. There are no “readily available” solutions to this problem, because there isn’t just one problem to solve. There are multiple overlapping challenges, all different, with sometimes contradictory factors (e.g. trading utilization for failover capability), and no one is “reluctant to make wholesale change” — these are huge, complex systems that can’t just be swapped out for something else.

(And by the way, how ironic is it that an article about “an industry reluctant to make wholesale change” is being run by… a newspaper?)

Finally, just to wrap up, since I’ve gone on way longer than I planned on the quotes and I don’t want to reprint the whole article, I want to touch on the contradictions. One giant contradiction in the article is that it talks throughout about how the “industry” (which, by the way, is never defined clearly, although I’m assuming we are talking about the computer industry in general, since by now everyone pretty much uses data centers) is “risk averse” while also covering the sheer scale of the infrastructure that exists. Most of this infrastructure has been built up over the last ten years. Google, Facebook, Amazon, and everyone else have ramped up their operations by orders of magnitude over the last five years. Most of these companies didn’t even exist 15 years ago! The social web and mobile have exploded in the last 3 years. There’s simply no way the industry can simultaneously build up this massive infrastructure, sustaining exponential growth rates in traffic and usage, and be so hugely risk averse. Maybe that will be true in a couple of decades. It isn’t true now, and whatever “expert” the reporter has talked to has not really been involved in running a modern data center. A lot of quotes in the article seem to come from people at “The Uptime Institute,” an organization I’ve never heard of, so maybe for a followup they should talk to the people who are actually running these systems at Facebook, Google, and others.

Nationwide, data centers used about 76 billion kilowatt-hours in 2010, or roughly 2 percent of all electricity used in the country that year, based on an analysis by Jonathan G. Koomey, a research fellow at Stanford University who has been studying data center energy use for more than a decade. DatacenterDynamics, a London-based firm, derived similar figures.

The industry has long argued that computerizing business transactions and everyday tasks like banking and reading library books has the net effect of saving energy and resources. But the paper industry, which some predicted would be replaced by the computer age, consumed 67 billion kilowatt-hours from the grid in 2010, according to Census Bureau figures reviewed by the Electric Power Research Institute for The Times.

This represents another contradiction which spans the whole article: there are many mentions of how it’s impossible to accurately measure how much power is being used, and yet just as many specific numbers thrown around for the power being used. Which is it? Do we know, or don’t we?

But the clincher here is that the “industry” which “has long argued that computerizing business transactions and everyday tasks like banking and reading library books has the net effect of saving energy and resources” used, apparently, 76 billion kilowatt-hours in 2010, “but” the paper industry used 67 billion kilowatt-hours in 2010. I’m sorry, what? The paper industry? Are we counting Post-Its here? Kitchen paper towels? And assuming these figures are accurate, in what universe can we compare the power used by compute devices as a whole with the power used by paper-making companies and not only pretend that there’s any equivalence, but also imply that the computer industry has failed because the paper industry is still huge? And that’s the implication created by the paragraph, no question about it.

The real issue: why?

So as I said at the beginning, the article goes off in what I can only characterize as sensationalistic attacks on data center technology while avoiding the real question: why?

Why do data centers consume so much power? Why are there multiple redundancies? Why is it that utilization is not 100%?

Power consumption

First — data centers consume a lot of power for the simple reason that we’re doing a lot with them. The article touches upon this very briefly. Massive amounts of data are flowing through these systems, and not just by consumers. The data has to be stored, processed, and transmitted, which requires extremely large footprints and therefore huge power consumption. It’s simple physics.

Contrary to what the article states, data centers have undergone drastic evolution in the last ten years, and continue to do so. There’s an incredible amount of work being done to make data centers and infrastructure work better, be more efficient, cheaper, faster, more reliable, you name it. However, this constant evolution itself leads to some inefficiency. You can’t replace everything at once. New systems and approaches have to be tested and re-tested, then deployed incrementally. There’s no silver bullet.

There’s also the issue of complexity, evolution of requirements, and differences of usage, all of which also lead to inefficiency, which I’ll come back to in the end.

Redundancies

The article strongly implies, more than once, that unnecessary redundancies, fear of failure, and so forth are among the key reasons for inefficiency. This is, as I’ve said above, completely untrue. The redundancies that web services have today are not excessive. They are the best way each company has found to solve the challenges they have faced and continue to face. No doubt if you took any one system within any one company and did a deep analysis you could find elements to optimize, but that doesn’t mean that the solutions are feasible. When you deal with complex systems, unintended consequences are often lurking at every turn, and the obvious way to avoid a catastrophic failure is to have backup systems. Airplanes have multiple redundant systems exactly for this reason.

And before you say that comparing an airplane with web services is a bad analogy, imagine the type of disruption you’d have, worldwide, if Google was down for a day. Or if hundreds of millions of people couldn’t access their email for a day. The planet freaks out when Gmail or Twitter is down even for a couple of hours! Even services that are less obviously critical, like Facebook or Twitter, would generate huge disruptions if they disappeared. Dictators that are constantly trying to shut down access to these services know why: these services, frivolous as they sometimes appear, are part of the fabric of society now and crucial in how it functions and how it communicates. That aside, there’s of course the minor matter that they are run by for-profit companies, and if the service isn’t running they aren’t making any money.

Speaking of money, that’s an important point lost in the article. People don’t just, as a rule, throw tens of millions of dollars at a problem when they can avoid it. You can count on the fact that if there was a way to provide the same reliability with significantly less money, they would do it. A natural thing to do would have been to go to the CFO of any of these companies and ask: “look, I just uncovered this massive inefficiency and waste, why are you wasting money like this?” But that would go against the narrative that these companies are just gripped by fear of change and either they don’t care that they are burning through hundreds of millions of dollars for no reason, or they are so stupid that they don’t even realize it.

Over-capacity

Third — data center utilization is not 100%. This is true. But back to airplanes: it’s also true that on a typical flight there are empty seats, because you need overcapacity. Similarly, no one runs a large web service without at least 25% spare capacity, and in some cases you need more. Why? Many reasons, but, first, there are spikes. Usage of services on the web (whether directly through a website or through cellphone apps, or even backend services) is often hard to predict, and in many cases even if predictable it is incredibly variable. There are spikes related to events (e.g. The Olympics) and micro-spikes within those spikes that push the system even further. To use a recent example, everyone talked about how at the conventions a few weeks ago Mitt Romney’s acceptance speech generated something like 15,000 tweets/second while Obama’s speech peaked at around 45,000 tweets/second. What no one is asking is how it is possible that the same system that handled 15,000 tweets/second could also handle 45,000 only a few days later. Twitter didn’t just triple its capacity for that speech only to tear it down the next day, right? The answer is simple: overcapacity, planned and deployed well in advance. And if it didn’t have the capacity ready, what would have happened? Would we have seen an article congratulating Twitter for saving power and not running a capacity surplus? Or would the web have exploded with “Twitter goes down during convention” commentary? It’s not hard to guess.
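To put some rough arithmetic behind that example (only the 15,000 and 45,000 tweets/second figures come from the convention coverage; the rest is a deliberately naive sketch):

    # Back-of-the-envelope capacity planning around the convention peaks.
    romney_peak = 15_000  # tweets/second during the first speech
    obama_peak = 45_000   # tweets/second a few days later

    # If you had provisioned "efficiently" for the first peak plus the 25%
    # headroom mentioned above, you would have had:
    efficient_capacity = int(romney_peak * 1.25)  # 18,750 tweets/second

    print("needed days later:", obama_peak)
    print("'efficient' capacity:", efficient_capacity)
    print("shortfall:", obama_peak - efficient_capacity, "tweets/second")

You can’t triple a fleet in three days, so the only way to survive the second peak is to have been “wasting” that capacity all along.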

Another cause for over-provisioning is quite simply fast growth. These systems cost millions of dollars to buy, deploy, test and launch. This is not the kind of thing you can do every day. So you plan the best you can and deploy in tranches, to make sure that you have enough for the next step in growth, which means that at any point in time you are running with more capacity than you need, simply because even though it’s more capacity than you need today it’s going to be less than what you need tomorrow.
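A toy model of that tranche effect (all the numbers here are invented; the point is the shape, not the figures): if demand doubles every year and you can only add capacity in chunks sized months ahead, measured utilization stays well below 100% by construction.

    # Toy model: demand doubles yearly; capacity is added in tranches sized
    # for the next six months of growth, keeping 25% headroom at all times.
    monthly_growth = 2 ** (1 / 12)  # ~5.9% per month, doubling yearly

    demand = 100.0                          # arbitrary units
    capacity = demand * monthly_growth**6   # provision six months ahead

    for month in range(1, 13):
        demand *= monthly_growth
        if demand > capacity / 1.25:        # headroom about to be eaten
            capacity = demand * monthly_growth**6  # deploy the next tranche
        print(f"month {month:2d}: utilization {demand / capacity:.0%}")

Utilization oscillates in the 70-80% range and never approaches 100%, not because anyone is careless, but because the alternative is running out of capacity mid-growth.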

There’s also bugs. Code that is deployed sometimes isn’t hyper-efficient or doesn’t work exactly as intended (what? shocking!). When that happens, having extra capacity helps prevent a meltdown.

In the process of deploying, you need to test. Testing requires that you create duplicates of a lot of infrastructure so that you can verify that the site won’t just stop running when you release a new feature. Testing environments are far smaller than production environments, but they can still be sizable.

Another huge area that requires fluctuating (and yet, ever-increasing) capacity is analytics. You need to make sense of all of the data, be able to know when to run ads, and figure out where the bottlenecks are in the system so you can optimize it. That’s right, before you can optimize any system you need to first make it bigger by creating an analytics infrastructure so that you can figure out how to make it smaller.

Then there are attacks on infrastructure that have to be survived, since you can’t prevent all of them. The article makes absolutely no mention of this, but before you can detect that an attack is happening, you have to be able to withstand it. So any half-decent web infrastructure has to have the ability to handle an attack before it can be neutralized.

Being ready for spikes and growth, testing and deployment, data analysis, etc, all of this requires overcapacity.

If the article were talking about the human immune system, it would have said something like “look at all of those white cells in the body, doing nothing most of the time, what a waste.” But the truth is that they’re there for a reason.

The common thread: complexity

One final point, which I mentioned a couple of times I’d get to, is complexity. It’s the common thread among all of these reasons, and it doesn’t make for nice soundbites. Starting with the issue of load profiles that I touched on at the beginning, all the way to the problem created by constant (and sometimes extremely fast) growth at the end, we are facing challenges without precedent, and the solutions are often imperfect.

On top of that, requirements change quickly, today’s popular feature is tomorrow’s unused piece of code, and even within what’s popular you can have usage spikes that are impossible to plan for and that therefore you have to solve on the fly, leading to less-than-perfect solutions.

There are no “well known solutions” because each problem is unique. Even within the same domain (say, social networks) you have a multitude of scaling challenges. Scaling profile pages is drastically different from scaling a page that contains a forum. Even for what superficially appears to be the same challenge (e.g. profile pages), each company has different features and different approaches, which means that, say, Google’s solution for scaling Google+ profiles has very little in common with Facebook’s solution for scaling their profiles. Even when the functionality is similar, there are a multitude of factors, such as the business model, that drive what parameters you need to scale for each case. There’s simply no one-size-fits-all solution.

The people working to build our digital infrastructure are extremely talented, work extremely hard, and are facing problems that no one has faced before. This is not an excuse, it’s just a measure of the challenge, which we take on gladly. Sure, it’s not perfect, but it’s what we can do. It’s humans, flaws and all, running these systems. The alarmist and sensationalist tone in the New York Times article, coupled with the insinuation that the solution exists but no one wants to use it, is doing everyone a disservice. Solving these challenges requires continued work, incremental improvements, and a lot of focus.

Or, as Ben Horowitz quoted (in one of my favorite quotes of all time) Bill Turpin as saying during their days at Netscape: “There is no silver bullet that’s going to fix that. No, we are going to have to use a lot of lead bullets.”

Amen, brother.

Part 1 of a series (Part 2, Part 3)

5.

I wasn’t planning on getting an iPhone 5 right away. I ignored the preorder midnight madness, and wasn’t even paying much attention to the date. Avoiding the IPHONE 5 LAUNCHES TODAY headlines everywhere was impossible though. So yesterday I diverted from my usual run route to go by the Apple Store (a few blocks from my house) and get a first look at the frenzy…

… and… there was no queue. It was only 9 am or so, about an hour after the store opened, but most of the people outside were Apple Store employees (blue shirts), security (black shirts), and reporters (easily identifiable by trying to shove a mike or a camera in front of anyone). The store itself was full but not beyond the usual chattering mob. I asked one of the Apple guys if they still had iPhones in stock, secretly wondering if some kind of calamity had befallen the line earlier (Samsung retaliation? perhaps a well-aimed asteroid?) but no, it was all good. I went back home, got the credit card, came back, got the phone, a bit more than an hour after launch. It felt a bit like cheating for some reason.

It looks familiar, but as soon as you hold it, and touch it, and use it, it feels like a completely new device. Your eyes tell your brain that your hand should expect X and instead it gets Z. And it’s so light, it has me actually looking forward to the new iPod touch, which will be even lighter but share the base design.

It’s fast, the screen is great, and the extra space is welcome and visually seamless when dealing with non-optimized apps. As Gruber speculated, the letterboxing of apps is unnoticeable for everyday use on the black iPhone 5. Again: visually. As with the phone itself, while it looks similar on the surface, once you use it, it’s another thing entirely. Muscle memory gets in the way the most for me with keyboards on letterboxed apps. We’ll see how fast I adapt. It’s not a huge problem, but definitely noticeable, on and off, particularly if you’re mindlessly switching between an app that is letterboxed and one that isn’t.

As for the EarPods, the jury’s still out. Unlike many other people, I’ve never had a problem with Apple headphones and used them as my main headphones. The EarPods didn’t seem immediately as comfortable, but then again, after using one type of headphone for 7-8 years, any change will be an issue. We’ll see.

Finally — no dock so far. Apparently Phil Schiller said that “Most people who use docks use them with speaker or clock systems.” I would point out that there are also, I don’t know, many, many developers for whom docks are a useful tool to connect to your desktop and laptop and hold the device while you’re working, especially when you take into account that there are app components that must be tested on the device (in-app purchases, accelerometer, camera, etc.). I guess developers, even at a few million, would only be 1% of the iOS device market or so, but still, it would be nice if as a segment developers weren’t forgotten quite so easily.

what, despite appearances, is not at all about the new yorker magazine

It all became tricky when trying to read (comfortably, the thought asserted) “The Disappeared,” the piece on Salman Rushdie in the latest issue of The New Yorker. I tell myself I will read it on each device I have and in print, and see what works best. I ignore the Macbook Air as trite and move on to the iPad (“The New iPad”, echoing the relationship between New York and York). The New Yorker on iPad is a hulking beast and doesn’t lend itself to casual use: every issue has to be “downloaded,” all 130 or so megabytes of it, before you can peruse it. So I do. The progress bar starts to move and I can guess that the thing is coming down at half a megabyte a minute, a tenth of what my connection should allow, but that is irrelevant here. The issue is arriving at a speed that makes it feel as if there’s a physical process involved that is heavier, somehow more involved, than what it really is. It strikes me as a stunningly anachronistic concept to hang on to. It’s almost as if someone at the magazine was interpolating the notion that this magazine should be delivered to the reader (that’s the way it’s always been and that’s the way it will be) into a digital world that hasn’t so much discarded the concept as never really seen it as relevant or anything more than a temporary annoyance on the road to a future where everything is instantaneously here, no matter where it comes from and where here is.

At any other time, I would have walked away muttering at the metaphor, but I want to read this article and I am committed to this experiment and so I wait. In the meantime I wonder if I should instead read, first, from the actual magazine which the mailman also delivers, but suddenly I can’t find it. Curses! The issue finally arrives. Tap. Wait. It loads. There’s an ad of some sort. I swipe left into some other article. This always happens to me when reading The New Yorker on iPad. The magazine is organized in some sort of grid, ever-present but invisible, with articles on columns and article pages on rows (shouldn’t it be the other way around? or is it because it’s in landscape mode?). No matter, somehow through swipes and taps performed but unremembered I end up at the main index page, and tap on the Rushdie article. Off we go.

I relax into reading.

It’s only about three pages in that the battery alert pops up. I can’t remember the last time this happened, but it’s happening now, naturally. Fine, ok. Plug in the iPad. Look for the next vessel. How about the Nexus 7? I go find it and after the usual fumbling for the small etch of plastic that will turn it on (a hidden power button being one of the few features that the 7 shares, improbably, with the Kindle Fire HD) I press. Nothing. Did I not do it right? Again. Nothing. Again… but this time I am paying more attention and I notice a tiny tiny TINY block of text at the top left of the screen, block of text in block letters like a 1980s arcade game. Something about the battery. Et tu, 7? I want to exclaim, but decide that I’ll leave that for a more momentous occasion. I just plug it in and move on to the Kindle, also known as the Kindle Fire HD.

At last, I know this one has a full battery, but I also know it doesn’t have, and may not even support, the New Yorker app, and what I need is the app, not the magazine, which is also available as a Kindle subscription but can’t (won’t) acknowledge that I am already a subscriber. I trigger search, which, in trying to allow searching for everything (apps, newsstand, books, web, etc.), has managed to be less convenient by obscuring obviousness with a jumble of choices. The first search fails; trying again, I end up finding the Kindle magazine subscription; another attempt leads, somehow, to a search of books that match the term; and on the fourth try the app appears. I am taken to an Amazon purchase page that says this is free, and I marvel briefly at the idea that even if something is free, we still apparently have to purchase it, which I’m pretty sure is a mangling of the term… no matter. Fine. I purchase it. This forwards me to another screen with app details, the “cloud” view of Amazon apps where I can tap another button to install. Yes, please. Install already. Tap, nothing happens. Delayed response? Tap again. Twice. Suddenly the button grays out and another progress bar is moving along.

The clock tells me that by now it’s been some twenty minutes of this existential void that is fumbling through the physical/digital landscape of tablets. And, I tell myself, I know exactly what I’m looking for; I know exactly what I want to install, from where, and why. And it still takes this long. Not a good sign. Perhaps knowledge puts me at a disadvantage here; perhaps not knowing would just allow me to get frustrated faster, leave the thing be, and go for a walk. How much am I paying for this…? While I’m pondering this, the app has finished installing, and the button now says Open. Helpfully, there is also a notification that says the same and invites me to open the app as well. Thanks, invisible leprechauns! Got it. Open it is.

Inside, the magazine pretends to be all there and ready, like on the iPad, but I know it isn’t. I know that the way to make this work is to look for the tiny “Sign In” text, top-left, really a hyperlink or frameless button if you will, that allows you to access your subscription. Ok. Tap. A keyboard appears, and a pathetically small box with scrawny fields where I am to input a username (really an email) and a password. The first attempt fails, but I typed it properly, I think. Careful inspection, meaning the device nearly pressing against my nose to discern the distance between letters, shows that the system has helpfully split the email address in two, right in the middle of the hostname. The leprechauns are working against me now. There’s a space, imperceptible almost, but there all the same. I try to tap to delete it, but the field doesn’t allow it. Tap-hold pulls up no loupe; instead, there’s a menu of choices that are all wrong for the moment: select all, cut, copy, and such. Tap again, and even though there’s no cursor I decide that delete will probably do what I want. I delete all the way back to the space, eliminate it, type again. Sign in. Success.

Now I must download the magazine again, this time onto the Kindle, all 130 megabytes of it, so: tap, and wait as the bits flow in. This, I’d like to tell whoever thought of it, is why overloading the physical metaphor is not a good idea. While the thing downloads, I go make myself some coffee.

– oOo –

After I’ve read the article, which is, incidentally, extremely good, I have no energy or interest left to waste on tussling with the other devices. They are charging, anyway. Beyond the actual content of the article, what I’ve learned isn’t much, and I resist the easy path of thinking that the publishing and technology choices the New Yorker people have made are the main culprit. A different version of the experiment has been ongoing with various publications in various forms, including The New York Times and The Atlantic… discovery, browsing, and navigation all work differently in each medium… more notes on that another time.

At the moment though, the most interesting thing seems to be that the mechanics of the software and devices broke in loudly at every step of the way, and, if I had wanted to just read the article, experiment be damned, I would have had to go through pretty much the same thing, or else jump on the computer. An equal amount of time fussing about, wasted really, orthogonal to the actual purpose: reading. It’s not just a matter of this magazine or that device, because to truly disappear and minimize its mediation, the medium would have to have a humility that few of our modern media seem to have. Who has it? HBO comes to mind. But a lot of the web seems to be incredibly insecure in its loud, overbearing jumble of advertising, cross-promotion and social-network-plugin obsession. I recognize that in both the example and the counterexample, HBO and much of the web, this security or insecurity is rooted in revenue streams and business models and the demand for revenue and profit. It’s there nevertheless. That its source sits on, or right next to, that third rail of online businesses can’t be an excuse. Something has to change, no?

– oOo –

And this is really what I’m wondering about: how, in our zeal for enshrining a winning ecosystem for the digital world, the ultimate purpose of all of this technology, all this effort, is somehow getting lost. To some degree, making the devices and software truly disappear and make way for the content is something that just isn’t all that common. Neither Android nor iOS, stand-ins for Google and Apple, does this well; both have put too much effort into exposing and making known the boundaries of their respective boxes, boxes in a Kantian sense (with apologies to Kant for stretching the metaphor). An ongoing battle requires battle lines to be drawn, and this isn’t really helping.

I’m not sure what the answer is, but I keep coming back to “something has to change.” If what we build is so obsessed with getting in the way of what we build it for, and constantly congratulates itself on how good a job it is doing in ways big and small, we will make slow progress. A hammer that advertises itself as such every time you pick it up gets old quickly.

kindle fire hd review

A surprise today: my Kindle Fire HD pre-order arrived much earlier than expected. Amazon said October (“early October,” in that by now trite marketing-speak). I don’t know if Amazon is trying to under-promise and over-deliver, but if that’s the case they’re doing a fine job of it. That, or they’re trying to counter-program against Apple’s iPhone pre-orders, predictably the talk of the town, or whatever our modern equivalent of “the town” and “the talk” is. If the latter, they are failing miserably, at least judging from the top news around the web, which at the moment seems concerned with the fact that ship dates for pre-orders of iPhones have now “slipped to two weeks.” Not that I’m skeptical (in principle, I’m not), but this is exactly the kind of thing that could be triggered on purpose with some careful inventory management.

Back to the Kindle. The package arrived like all Kindles do: a box that must be torn apart to be opened, as if wanting to induce the feeling of a wrapped present. The fact that Amazon chose to pre-label the thing as “Diego’s 7th Kindle” did not, in fact, make me feel good. It made me feel guilty. By enumerating my gadget obsession it somehow exposed something that David Foster Wallace would have, probably, classified as an addiction. I quickly navigate Amazon’s site to rename it “Kindle Fire HD” and so be able to conveniently, perhaps too conveniently (DFW, echoing again), sidestep this minor shame.

Once on that “Kindle Manage” page, or whatever it’s called, I pay another $15 to get rid of the ads on the home screen, all the more visible since there’s simply no way a human can figure out where the wake button is. This seems like a minor inconvenience (at most, you just have to turn the device on its side four times, right?) but then you realize the physical action makes you hyper-aware of what pops up onscreen when you do find that button. I find myself wondering if buttons are this hidden on purpose. But of course they aren’t. It’s just an oversight. The usual. Then it strikes me: the real price of this thing is $214, not $199. Somewhere, someone has figured out that the wake screen is worth $15, and that by pretending it’s $199 they will sell more. A concept to contemplate, at least briefly.

Once inside, I’m struck by how ordinary it all seems. Minor lack of polish here and there (is there such a thing as minor lack of polish, I wonder?). A first, hurried look reveals no obvious way to conjure up a web browser. What is the world coming to?

I find it, eventually. There’s the web. And there’s the books. And there’s Amazon Prime videos, and…

There’s something missing, a Fire, I dare say, from this device. It’s a fine piece of silicon and plastic and rubber all around, don’t get me wrong, but somehow it’s lacking conceit, and even though we often accuse Apple (however silently) of having too much of it, there’s also something to be said for having too little.

The Fire HD is a fine 7-inch tablet, even if both bulkier and heavier than the Google Nexus 7, at the moment its only worthy competitor. Not a general-purpose tablet, as I’ve said before, but another window into Amazon’s content. We will see if October brings with it an iPad mini, or iPad Air (as Gruber calls it), but in the meantime the Nexus 7, sans content, rules the roost. Content included, it doesn’t. Somewhat, somehow, disappointing; but then again the tablet form factor is less revolution than evolution, even if it was “invented” (read: done properly for the first time) barely two years ago, with the original iPad. It feels like more than that, doesn’t it?

new kindle – a quick review, and other thoughts

You may be thinking: New Kindle? The new tablet?

No, not that one. That’s out in November.

The Kindle with the “Paperwhite” display?

No, not that one either. That one comes out next week, but my order won’t ship until October.

This is perhaps the biggest problem I see with the new Kindle(s) Amazon announced last week: confusion. But more on that in a second. My answer to “should I get it?” also follows.

The new Kindle I’m referring to is the keyboard-less version of what Amazon now calls “Kindle e-Readers.” It’s the cheapest Kindle but it’s also the best suited, in my opinion, for long-form reading.

The best ebook reader

This new Kindle is a bit lighter than the old one, jet-black replacing the dark but dull gray of yesteryear. It is a bit faster. To my eyes, the screen looks a bit better. In other words, it screams INCREMENTAL IMPROVEMENT all around.

That’s fine with me, really. For a sub-$100 device that does exactly what it’s supposed to do, and that remains, thankfully, focused primarily on a single function, reading, this isn’t that far from the perfect device. The perfect device would be lighter but probably not much thinner, since if it were thinner it might get in the way of holding it properly.

This model is the primary device on which I read long-form writing. The news-reading king is the iPad, but for books the Kindle wins hands-down. It can be used under any light conditions, it’s lighter, and if the iPad’s battery lasts (seemingly) for an infinite amount of time, this Kindle’s battery is infinity plus one.

If you do a lot of long-form reading, this is the device to get, no question.

But.

Versions, versions, versions

That’s all well and good, but what about all the others? As I mentioned above, this is where the problem starts.

This is the list of the Kindles Amazon is selling at the moment. It’s important to note that these are the names that Amazon uses:

  • Kindle. Keyboardless e-reader with eInk display. WiFi only.
  • Kindle Paperwhite. Keyboardless touch-sensitive e-reader with a backlit eInk display. WiFi only; replaces the 2011 Kindle Touch (which was, to be succinct, terrible: heavier, bigger, and with a touch model and responsiveness that were all over the place).
  • Kindle Paperwhite 3G. Same as Kindle Paperwhite, but with global 3G connectivity.
  • Kindle Keyboard 3G. This is the 2010 Kindle, the only one of the current line to include a physical keyboard.
  • Kindle Fire. The original Kindle Fire released last year. “You mean, the one that is bulky, slow, heavy, etc?” Yeah, that one.
  • Kindle Fire HD. The new Kindle Fire with a higher resolution screen, faster processor, more storage, and MIMO (better WiFi, although that depends on your router).
  • Kindle Fire HD 8.9″. Same as the Kindle Fire HD but with a bigger screen (still smaller than the iPad’s 9.7″).
  • Kindle Fire HD 8.9″ 4G LTE Wireless. Same as the Kindle Fire HD 8.9″ but with 4G LTE (which requires a yearly subscription of $50 for a 250 MB monthly data plan).

You may be tempted to think that Amazon just decided to splay Kindle versions by feature or options, but that’s not the case. Every one of those models has options to choose from: ad-supported (what Amazon calls “With Special Offers”) or not for the e-readers, which lowers their prices, and different amounts of storage for the Kindle Fire models.

For any buyer (not just the “average” buyer who tends to appear, unicorn-like, in many reviews) this is incredibly confusing. Each of these products has its own product page touting the device’s advantages. The “3G” or “4G” monikers in the product name force you to make a choice upfront about whether you want cellular wireless or not. Each product page references the other products, and the descriptions are very similar. It’s easy to click on the wrong product page by mistake and think you’re getting one thing while actually buying another. Price may offset confusion somewhat, and the fact that Kindles are less expensive than the alternative, the iPad in particular, may entice some buyers to power through all the marketing nonsense and figure out what to buy anyway.

Amazon seems to have decided that strength is in numbers, Paradox of Choice be damned. I think that this is a mistake. It prevents simple, verbal recommendations from happening. “Should I get a Kindle?” is a common question I hear. There is no way to answer that quickly and with accuracy. The best you can do is extrapolate, assume they want an ebook reader, and say: “Yes, just get the cheapest one, less than $100.”

Compare that to the other typical question: “Should I get an iPad?” You can easily answer this with yes or no. The iPad is the iPad. True, there are choices to be made once you have decided to get one: price, memory, 3G or not. But the basic product choice has already been made. I, like most others, expect that if Apple announces a smaller iPad at some point it will be clearly distinguished, like, say, “iPad Mini,” but it won’t go beyond that.

What would be simpler?

My own preference for the Kindle line would be something like this. Three products: Kindle, Kindle Fire Mini, and Kindle Fire. (Since Apple hasn’t announced an “iPad Mini” yet, that would work, and it would make sure that stories about an iPad Mini, if it’s released, would also mention the Kindle Fire Mini as a competitor.) Drop the original Kindle Fire and the Kindle Keyboard (I have no doubt that Amazon keeps the Kindle Keyboard around because it sells, but my argument would be that at some point whatever you’re making from it is undercutting clarity of choice, and therefore sales of the newer devices). Make everything else an option. Forget about adding “HD” to the name: it will not age well.

Obviously this won’t happen now, but who knows, maybe next year. As for whether the new Kindle Fire models will be worthy iPad challengers, that remains to be seen, but the initial reviews suggest the same lack-of-polish software problems that the original Kindle Fire has. This is somewhat beside the point, however: Amazon’s beachhead into the tablet space is to make the Kindle Fire a window into all the Amazon content you own, rather than to match the iPad as a more general computing device. For that purpose, it’s a good tablet.

I’m a huge fan of Amazon as a company and the Kindle e-reader as a product. I also think that the Kindle Fire is important to provide a credible competitor to Apple, so I’m looking forward to seeing it evolve.

In the meantime, though, my recommendation remains the same: if you do a lot of long-form reading, get the cheapest Kindle. For everything else, there’s iPad. :-)

the process matters more than the outcome

“In the present situation of the United States, divided as they are between two parties, which mutually accuse each other of perfidy and treason…this exalted station [the presidency] is surrounded with dangerous rocks, and the most eminent abilities will not be sufficient to steer clear of them all.” Whereas Washington had been able to levitate above the partisan factions, “the next president of the United States will only be the president of a party.”

–Thomas Jefferson

The quote, from “Founding Brothers: The Revolutionary Generation” by Joseph J. Ellis, sounds like something that would fit well in today’s “highly polarized” politics. Only the grammatical structure and vocabulary (e.g., “perfidy”, “exalted”, “eminent abilities”) make it stand out from the more, um, succinct versions we would be likely to hear today (e.g. “You lie!”).

The obvious decay in the use of complex, rich language to convey and argue about complex, rich ideas is one thing that has definitely changed in the last two hundred years, and something I wish could be reversed; and yet this particular decay is not restricted to politics. When longhand and carrier mail are replaced by 140 characters and texting, these things are bound to happen.

What has not changed is the emotionally charged, fiercely fought nature of political campaigns in America. If you take as a yardstick the fact that Burr shot and killed Alexander Hamilton in a duel in 1804 as a result of a kerfuffle over remarks made in the middle of a campaign, I’d argue we still have a ways to go before we reach “extreme partisanship.” And let’s not forget the kinds of statements made on both sides of the line before, during, and even after the Civil War. Or the 60s, which saw Vietnam, riots, and the assassinations of three leaders, or… you get the point.

But surely, you’d say, when one party freely flings at another accusations of, say, fascism, we have crossed a line? After all, the 19th century didn’t yet know the suffering and horror of two world wars, the second driven by this ideology bent on domination. Surely the strange concept of accusing President Obama of being both a Fascist and a Socialist (two ideologies that, at their base, flatly contradict each other) is new?

Not quite:

“In Central Europe the march of Socialist or Fascist dictatorships and their destruction of liberty did not set out with guns and armies. Dictators began their ascent to the seats of power through the elections provided by liberal institutions. Their weapons were promise and hate. They offered the mirage of Utopia to those in distress. They flung the poison of class hatred. They may not have maimed the bodies of men, but they maimed their souls.

“[Roosevelt's] 1932 campaign was a pretty good imitation of this first stage of European tactics. You may recall the promises of the abundant life, the propaganda of hate.”

–Herbert Hoover, in a speech to the Republican National Convention, June 10, 1936

Hoover wasn’t alone in making this comparison. Other opponents of the New Deal were similarly apoplectic in their pronouncements. Imagine the power of this comparison right at the moment when these ideologies, now broken, were ascendant. While I am guessing that many people today may not have in mind the full context of the contradiction when they simultaneously accuse Obama of being an appeaser (with Iran), a Socialist, and a Fascist, you can bet that politicians using these analogies in the 1930s were well aware of the incongruity of their argument.

This wasn’t a temporary situation; these arguments resurface, time and again:

“Fascism was really the basis for the New Deal. It was Mussolini’s success in Italy, with his government-directed economy, that led the early New Dealers to say ‘But Mussolini keeps the trains running on time.’”

–Ronald Reagan, in Time magazine, May 17, 1976

And what about the flip side? Republicans now regularly praise President Truman (quite vocally during the 2008 election, during which they compared Sarah Palin to him, and Mitt Romney recently invoked him during a speech at the NRA) while decrying the charges that Democrats level at them as “class warfare”. But:

“I would like to say a word or two now on what I think the Republican philosophy is; and I will speak from actions and from history and from experience.

“The situation in 1932 was due to the policies of the Republican Party control of the Government of the United States. The Republican Party, as I said a while ago, favors the privileged few and not the common everyday man. Ever since its inception, that party has been under the control of special privilege; and they have completely proved it in the 80th Congress. They proved it by the things they did to the people, and not for them. They proved it by the things they failed to do.”

–Harry S. Truman, in his speech to the Democratic National Convention, July 15, 1948

Presumably, Republicans from 1948 would find it strange that current-day praise of Truman seems to ignore these types of pronouncements.

In all, most if not all campaigns wrap themselves in the language of near-life-and-death struggles. I personally remember clearly that since 2004, every single presidential election has been defined by both parties as “the most important election in our lifetimes.” And this isn’t new: you can find quotes stating the same for most, if not all, presidential elections.

Why? Because it’s part of the process.

– oOo –

True democratic process is one of passionate, sometimes even extreme, arguments and grassroots efforts culminating in an election. If you lose, you regroup and prepare for the next. It’s true, I think, that the 24-hour news cycle and the Internet have increased the feeling of a “permanent campaign,” but we’ve lived with it for a couple of decades now and it hasn’t destroyed the process — if anything, it has supercharged it.

I don’t mean to imply that wildly exaggerated remarks, or extreme, false accusations, are good. I don’t think they are. What I’m saying is that American democracy is vibrant precisely because of the energy, the tug and pull of politics in this country, and that this inevitably leads to some extremes. Even at the peak of demagoguery, every election, more or less, ends up becoming a fight of ideas and visions for the future. Do we need more or less regulation? Do we need a stronger safety net? What do we think about social issues?

What’s the alternative? There are actually many, of course: a dictatorship on one extreme, anarchy on the other. And the US is nowhere near either of these extremes (fears of some people on the right notwithstanding). But to look at the more realistic ones, consider how democracies are working (or rather, not quite working) in other countries, for example Russia or Argentina. Russia is perhaps a difficult example because of the structural shock of the fall of the Soviet Union. Argentina, on the other hand, emerged from dictatorship almost three decades ago with a fairly strong two-party system. (Sidenote: whereas the US skews center-right, Argentina’s political system skews center-left. Someone like Obama would be firmly center-right in Argentina, and there’s really no equivalent to the Republican party in Argentinian politics.) In the last three decades, for various reasons (too many to go into detail on in this post), the two-party system in Argentina has been eviscerated. There is now, in effect, only one fully functioning party, the Peronist party, and the (nominal) opposition is mostly composed of factions of that party. The result is a lack of argument and political discourse that is slowly but surely eroding democratic principles. What does this look like in practice? Consider this: President Cristina Fernández de Kirchner was re-elected without having to debate her opponents even once. This is no doubt expedient for the candidate who can get away with it, but it’s not good for the process, and in the end it’s arguably not good for the candidate either. Candidates benefit from having to present and defend competing visions of the future, and the political process in the US gives them, or rather requires of them, plenty of that.

Additionally, the lack of a passionately vibrant political process and of clear choice (even if the choices seem to be worlds apart) leads whoever is in power to seek re-election just for the sake of retaining power itself, not to advance their ideas. It is no longer a matter of offering competing visions for the future of the country, since there’s no one to compete with, but of retaining power to maintain the status quo. Leaders become insulated and reluctant to change. Compromise, however feeble or minor it may seem in a “highly polarized” political environment like the US, simply disappears.

But isn’t compromise dead in the US? Republican stonewalling during the last few years would seem to prove that this is the case. But consider that George W. Bush emerged from his re-election with control of both houses of Congress, as did Obama from his election in 2008. And yet, when they tried to pass legislation that the other party fiercely opposed (Social Security reform in Bush’s case, Health Care reform in Obama’s), the results showed that even under one-party control, within a functioning two-party system, some form of compromise must exist. Bush’s Social Security reform failed to pass. Obama’s Health Care reform passed, but significantly diluted from the progressive ideal of universal coverage that was the goal all along.

This is not an argument to ignore the very real differences between the alternatives, or to engage in false equivalencies in which denying plain facts becomes, somehow, “a point of view.” So, like many others, I do worry when I see some extremes getting closer to the mainstream. I worry, for example, that many people today, mostly on the Republican side, deny the evidence that climate change is happening and, more alarming still, seem to be somewhat gleefully anti-science. Do I wish this weren’t the case? Absolutely. I worry, yes, but I don’t despair, because I consider the alternative: not necessarily the extremes of anarchy or dictatorship, but one in which the process by which democracy takes place has been subdued, even subverted, and becomes only superficial theater, where “elections” are won without debate or with 80 or 90% of the vote. In this sense, the laws intended to suppress turnout (now a point of contention), along with the perennial redistricting to make seats “safe,” are to me more concerning than whether a group of people, however large, temporarily decides that climate change is somehow a well-orchestrated hoax among a bunch of scientists. They are more concerning because they strike at the heart of the process, making accountability harder and edging closer to the pursuit of power for power’s own sake, at any cost.

Today’s “50/50 nation” can be frustrating, and even scary at times, for all sides, but in that delicate balance lies the energy and vitality of modern democracy, and it is through the process of hard-fought campaigns ending in elections that ideas are advanced and evolve, rather than in one particular result. Results matter a great deal, but they are not definitive if the process works (however imperfectly) because it extends beyond that one result to affect all that follow.

In other words: in any one election, the process matters more than the outcome.
