Migration complete (part 1)

Alright – migration part 1 is complete. There’s a bunch of things I want to add later, including integrations, verifying all the feeds and some of the forwards, and so on.

For the moment, though, blog.diegodoval.com is back online, with both HTTP and HTTPS variants (HTTP always forwards to HTTPS) and a shiny new setup.

It’s a fully static site, but what software am I using? I was pretty convinced I’d end up using Jekyll, but it turns out I found Hugo a smidgen simpler and a little easier to customize. The fact that it’s written in Go is not necessarily a plus in and of itself, but it does make some things a bit better (no need to tinker with Ruby gems).

I had an ulterior motive for embarking on the mess that is a blog migration: I’ve wanted to settle on a new solution to run blogs/static sites for other purposes, like n3xt. Once I start releasing software I don’t want to have to revisit this for a while. I’ll tackle some of the remaining upgrades soon, probably this weekend, but for now, it’s back to n3xt and its upcoming beta…

Migrating... please stand by...

I’m currently migrating from WordPress to a fully static site, something I’ve wanted to do for a while now. Things will be weird for a bit, hopefully for no more than a few days!

Settling on a final version of the site’s new styles will definitely take longer than that though. :)

A16Z AI Playbook: TensorFlow iOS example quickstart

There’s another dimension to the iOS TensorFlow example in the A16Z AI Playbook: it provides a working example that can be used directly with Xcode 8 and Swift 3, which isn’t yet common. What follows is a set of pointers that (hopefully) make it easier for iOS+Swift developers to jump right into it.

1: Clone & Run The Sample App Unmodified

First, clone or download the playbook repository from GitHub and then open the Xcode project found under _ai/ios/CueCard_. Keep in mind that the project requires Xcode 8 and uses Swift 3.

When you open the project, you’ll notice that there’s a file missing:

graph.pb file missing!


Don’t panic! The contents of the file are there, just compressed, to get around GitHub’s file size limits without using Git LFS. The TensorFlow static library is compressed in the same way.

To get these files in place, just build the project. A build phase has been added that checks for the files and extracts them from the archives if they’re not in the correct location:

additional build phase to extract files


Even after you’ve built the project, the file will still appear in red, because Xcode doesn’t update group contents dynamically. To have Xcode “see” the file you’ll need to close the project and open it back up again.
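To make the check-and-extract step concrete, here’s a minimal Python sketch of the same logic (the actual build phase lives in the Xcode project and may well be a shell script; the function name and zip format here are illustrative assumptions):

```python
import os
import zipfile

def ensure_extracted(archive_path, target_path):
    """Extract a file from a zip archive, but only if it isn't already in place."""
    if os.path.exists(target_path):
        return False  # already extracted; nothing to do
    target_dir = os.path.dirname(target_path) or "."
    with zipfile.ZipFile(archive_path) as archive:
        # Extract just the one member we need into the target directory.
        archive.extract(os.path.basename(target_path), target_dir)
    return True
```

Running this twice is safe: the second call sees the file on disk and does nothing, which is why rebuilding the project doesn’t re-extract anything.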

Run The Project on a Device

With the project built, you’re ready to run it. Because the example uses the device’s camera, you’ll need an actual iOS device to run it on.

After launch you should see a screen like this:

After pressing the button to switch to ‘scan mode’ you should see the bars change according to the prediction as you scan different items:
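The bars are just a visualization of the network’s per-label scores. As a hedged sketch of how raw outputs typically become those bar heights — a plain softmax over the logits, which is the standard approach, though the app’s actual code may differ — consider:

```python
import math

def softmax(scores):
    """Convert raw network outputs (logits) into probabilities that sum to 1."""
    m = max(scores)                          # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def top_prediction(labels, scores):
    """Pair each label with its probability and return the most likely one."""
    probs = softmax(scores)
    best = max(range(len(labels)), key=lambda i: probs[i])
    return labels[best], probs[best]
```

Each probability maps directly to a bar’s height, and the top prediction is the label whose bar dominates while you scan.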

2: Train & Use Your Own Model

Now that you’ve got the app running, you can train your own model to detect different things.

TensorFlow Setup

The TensorFlow setup we use, and how to get it up and running, are covered in the section “Code Part 1: (Re-)Training a Model”. Even if you already know TensorFlow and have a setup running on your machine, I suggest you take a look at that section to adjust for any idiosyncrasies of the tutorial.

If you _haven’t_ yet installed TensorFlow, I will repeat what’s said in this section of the playbook: as counterintuitive as it may be, you might have an easier time compiling TensorFlow from source than installing a prebuilt package.

_Finally_, you might be tempted to install and use the GPU extensions (e.g. CUDA) right away. The playbook includes instructions for this, but if you are not familiar with TensorFlow and/or GPU extensions, I don’t recommend spending time on them until you’ve got the whole setup working.

TensorFlow Model Retraining

Once you’ve got a working setup, the section “Code Part 2: Adding AI to Your Mobile App” takes you through retraining the model. That process produces label files and a network graph, which you can use to replace the ones bundled with the iOS app.
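TensorFlow’s standard retrain tooling typically writes the labels as a plain text file, one label per line, in the same order as the network’s outputs — an assumption based on that tooling, not something specific to this project. Loading such a file is trivial; a minimal sketch:

```python
def load_labels(path):
    """Read a labels file: one non-empty label per line, in the network's output order."""
    with open(path, encoding="utf-8") as f:
        return [line.strip() for line in f if line.strip()]
```

The index of each label in the returned list lines up with the corresponding output of the graph, which is what lets the app map scores back to human-readable names.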

Modify the iOS App Code

Once you have a retrained model and have placed it in the Xcode project, you are close to done. In general, you shouldn’t need to modify code beyond what’s in _CueCardViewController.swift_. The code in that file is written primarily for simplicity and readability, and changes to the interface, labels, and model files are handled there.

Additionally, there’s the _TensorFlowProcessor_ class (written in Objective-C), which wraps the TensorFlow static library. Modifications that have to do with lower-level parameters, like image dimensions, belong there.
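Resizing a camera frame to the network’s expected input dimensions and normalizing pixel values are exactly the kind of lower-level steps that live in such a wrapper. A rough, illustrative Python sketch of both — nearest-neighbor resize plus mean/std scaling; the real wrapper does this in Objective-C against the TensorFlow library, and the mean/std values here are placeholders that must match whatever the graph was trained with:

```python
def resize_nearest(pixels, src_w, src_h, dst_w, dst_h):
    """Nearest-neighbor resize of a flat, row-major, single-channel image."""
    out = []
    for y in range(dst_h):
        sy = y * src_h // dst_h            # nearest source row
        for x in range(dst_w):
            sx = x * src_w // dst_w        # nearest source column
            out.append(pixels[sy * src_w + sx])
    return out

def normalize(pixels, input_mean=117.0, input_std=1.0):
    """Scale raw 0-255 pixel values; mean/std defaults are illustrative only."""
    return [(p - input_mean) / input_std for p in pixels]
```

If your retrained graph expects a different input size or scaling, this is the layer where those numbers change — not the view controller.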


By now you should be up and running with both a TensorFlow setup that you can retrain and an iOS app that can process images using your own trained network. Hope this is useful! And as always, please feel free to send questions, comments, or fixes my way.

Happy hacking! :)

(cross-posted on medium)

The A16Z AI Playbook is here!

Today is launch day for the A16Z AI Playbook! Thanks to everyone who worked on it, in particular Frank Chen and Michael Wee. It was a privilege to contribute and to work with them on this project.

It’s an awesome contribution on the part of A16Z, and a bit unusual: different from just a website, or a book, or a set of samples, it’s all of those combined and more, since the entire thing is also available on GitHub under the MIT License.

Late last year Frank asked me to propose a followup to a post+video he had authored: AI, Deep Learning, and Machine Learning: A Primer. The post and related talk have been very successful, for good reason: they’re concise, clear, and to the point.

To expand on that, we focused on providing a clearer view of where and how to get started for non-experts, along with enough background knowledge and references to know what the next step should be, mixing in live code, data, and a more complex example in the form of a full iPhone app. As explained on the site:

We’ve met with hundreds of Fortune 500 / Global 2000 companies, startups, and government policy makers asking: “How do I get started with artificial intelligence?” and “What can I do with AI in my own product or company?”

This site is designed as a resource for anyone asking those questions, complete with examples and sample code to help you get started.

While there are dozens of excellent tutorials available on the web (once you’ve figured out what library or API you want to use — we’ve listed a few of our favorites in the Reference section), we felt a pre-tutorial — a “Chapter 0,” if you will — was missing: something that would help you survey the landscape broadly, give you a sense of what’s possible, and help you think about how you might use artificial intelligence techniques to make your software smarter, your users happier, and your business better.

Go read it! I hope it’s useful and that it inspires other similar efforts. And if you find issues, errors, or things that can be improved please let us know.

(cross-posted on medium)

Irony.

A different way to think about what’s going on in the world today and what each of us can do about it.

"America was, Wallace now knew, a nation of addicts, unable to see that what looked like love freely given was really need neurotically and chronically unsatisfied. The effect of Leyner's fictional approach to life — mutated, roving, uncommitted — like that of Letterman and Saturday Night Live — was to make our addiction seem clever, deliberate, entered into voluntarily. Wallace knew better. And now he was far clearer on why we were all so hooked. It was not TV as a medium that had rendered us addicts, powerful though it was. It was, far more dangerously, an attitude toward life that TV had learned from fiction, especially from postmodern fiction, and then had reinforced among its viewers, and that attitude was irony. Irony, as Wallace defined it, was not in and of itself bad. Indeed, irony was the traditional stance of the weak against the strong; there was power in implying what was too dangerous to say. Postmodern fiction's original ironists — writers like Pynchon and sometimes Barth — were telling important truths that could only be told obliquely, he felt. But irony got dangerous when it became a habit. Wallace quoted Lewis Hyde, whose pamphlet on John Berryman and alcohol he had read in his early months at Granada House: 'Irony has only emergency use. Carried over time, it is the voice of the trapped who have come to enjoy the cage.' Then he continued: 'This is because irony, entertaining as it is, serves an almost exclusively negative function. It's critical and destructive, a ground-clearing….[I]rony's singularly unuseful when it comes to constructing anything to replace the hypocrisies it debunks.' That was it exactly — irony was defeatist, timid, the telltale of a generation too afraid to say what it meant, and so in danger of forgetting it had anything to say." — D.T. Max, _Every Love Story Is a Ghost Story: A Life of David Foster Wallace_

Knowledge and information break down barriers and make dogma of almost any sort untenable. Religions and systems of thought, the “-isms,” are all deeply vulnerable to knowledge and information.

This is one of the fundamental reasons why the world is currently in crisis.

The Information Age has given everyone (at least nominally) access to knowledge and information. In the process dogma, particularly but not exclusively religious, was badly damaged. Dogmatic beliefs used to provide an invisible pillar on which people could rely to construct the narrative of their lives. No longer. Absent that, many of us have been left scrambling, confused, and angry.

Once a civilization reaches the point we have, in which information moves with no context or clear purpose, we have to learn to cope with it, dismiss it, or, to put it bluntly, go mad.

This is not a conspiracy, or something planned by anyone at any level; it’s evolution at species scale, and this process will determine whether we can survive what Carl Sagan called our “technological adolescence.” At the time, Sagan was, for good reason, most concerned with nuclear weapons. Since then, though, we have added a few more doomsday devices to our arsenal, some of them even scarier, like biological weapons. To add to our neurosis, we have also learned about a few more that we didn’t even know about before Sagan died in the 90s. Giant asteroid? Old news, plus we have Bruce Willis for that. How about the Yellowstone Caldera, aka a supervolcano? How about a gamma-ray burst?

Faced with too much information you can’t do anything about, along with a deluge of changes both large and small, trying to hide and retreat from it is not entirely unreasonable. The so-called opioid epidemic in the US is not only a supply problem. The demand is there for a reason, beyond the pure addictiveness of a drug.

XKCD: Irony

A Wave That Started Long Ago

We live at a time when postmodernism (for lack of a better term) has burrowed deep into the back recesses of our brains and taken over, leading to a widespread embrace of irony, cynicism, and eventually nihilism as the basis of thought and expression.

This isn’t new; it was recognized early and clearly as it was emerging. The ‘cage’ quote from Lewis Hyde was mentioned by Wallace in a 1993 interview:

The irony, self-pity, self-hatred are now conscious, celebrated. […] If I have a real enemy, a patriarch for my patricide, it's probably Barth and Coover and Burroughs, even Nabokov and Pynchon. Because, even though their self-consciousness and irony and anarchism served valuable purposes, were indispensable for their times, their aesthetic's absorption by U.S. commercial culture has had appalling consequences for writers and everyone else.
Irony and cynicism were just what the U.S. hypocrisy of the fifties and sixties called for. That's what made the early postmodernists great artists. The great thing about irony is that it splits things apart, gets us up above them so we can see the flaws and hypocrisies and duplicities. […] Sarcasm, parody, absurdism, and irony are great ways to strip off stuff's mask and show the unpleasant reality behind it. The problem is that once the rules for art are debunked, and once the unpleasant realities the irony diagnoses are revealed and diagnosed, then what do we do? Irony's useful for debunking illusions, but most of the illusion-debunking in the U.S. has now been done and redone. All we seem to want to do is keep ridiculing the stuff. Postmodern irony and cynicism's become an end in itself, a measure of hip sophistication and literary savvy. Few artists dare to try to talk about ways of working toward redeeming what's wrong, because they'll look sentimental and naive to all the weary ironists. Irony's gone from liberating to enslaving. There's some great essay somewhere that has a line about irony being the song of the prisoner who's come to love his cage.
— Burn, Stephen J., Conversations with David Foster Wallace (Literary Conversations Series) (p. 48). University Press of Mississippi.

What DFW and others identified decades ago has now metastasized: Well-followed Twitter morons “argue” in less than 140 characters that the earth is flat. Basic, plainly observable facts are being disputed with no rationale, and the scientific method has been appropriated and twisted.

The scientific method, a process designed to lead to high, but never absolute, certainty of ideas or a way of looking at the world, has been turned on its head to focus people on the possibility that what science says is not true rather than the high probability that it is.

_Ironically_, the increasingly widespread notion that there are no objective facts and that therefore everyone has a right to say or think whatever they want under any circumstances has led to reactionary behavior from people of all ages and creeds in which every group dismisses and tries to silence every other group. We can see everywhere the retrenchment of ideologies, a distrust and astonishing lack of curiosity (never mind discussion or acceptance) for things that challenge what we think or believe.

Past Peak Irony

Irony is powerful and perhaps, in some cases and when properly deployed, indispensable.

And yet sometimes it is something that must be actively, consciously fought. Not just when it is explicit, but when we are confronted with its consequences, and with those of other primarily postmodernist constructs: a pervasive lack of conviction or grounded opinion. It is maddening to me to be in a conversation in which the other person constantly trails off, attaching “like, you know”s and question marks to the end of sentences. We are, apparently, not supposed to have conviction anymore, and language tinted with this construct communicates that clearly. It says: I have nothing invested in this statement.

I wrote about _Every Love Story Is a Ghost Story_ a few years ago, saying that it could also have been subtitled “DFW’s Battles With Irony and Addiction,” although it didn’t deal exclusively with that, of course. I used the word “with” carefully, since it doesn’t univocally mean against. Irony and addiction are somewhat intertwined, I think; both can be used as weapons, and both can turn, or be turned, against ourselves.

In losing dogma we have lost certainty, but in these early years of the information flood we have embraced irony and replaced certainty largely with lack of conviction.

What we need instead is certainty that emerges from thought and reason instead of dogmatic repetition. We need to regain some measure of strength in our convictions so we are not constantly trying to just shout down whoever says something we don’t like. Each of us can choose whether to turn on the faucet of news and notifications and social networks and drown in it. There’s no easy fix; it’s a learning process. That’s the demand side.

On the supply side there are many factors, but one of them is technology in general and software in particular. We should build consciously and with conscience, so that we aren’t simply creating addiction machines but are focusing on people, and create mechanisms that prevent trolls and morons from dominating the conversation without requiring that they be “blocked” or just silenced. In the real world anyone can be a nut with all sorts of crazy ideas. You are free to run around your own living room with your body covered in jelly, but you are not free to bring that into _my_ living room, yet today’s software tools and architectures usually allow you to do just that. There are ways to fix this; we just haven’t tried them seriously, because in many cases they affect “the bottom line.” Which is why turning away from ridiculous models of “engagement” is critical.

That’s our challenge today, and whether we succeed or not will determine more than just who or what dominates the next wave of technology. It will also probably determine our survival, since it’s hard to fix problems if we can’t agree on what the problems are — or whether there is a problem at all.

The same people who built the atomic bomb were integral to creating a world order that would prove successful at containing its ultimate risk. We are seeing a new world emerge out of the information age, and it’s up to us to do the same.


Note: In Impossible I mentioned that the “[…] topic of humanity’s postmodernist funk and our seeming embrace of a zero-sum mentality deserves to be discussed in more detail.” This is the result.

(cross-post from medium)