Long before we called it the metaverse, I was deeply interested in augmented and virtual reality. Specifically, I've spent a lot of time thinking about what it will mean to have a computing device in front of our eyes when we are out and about in the world, i.e., some sort of smart glasses. In 2017, I built a simple AR prototype to try it out.

Here's a thread that captured some of my thinking at the time.

Today, we use our computers when we are stationary and looking to accomplish larger, more immersive tasks (like creating a deck for work). We use our phones when we want to quickly dip in and out of software without losing context about our physical surroundings (like answering a text).

I see VR and AR roughly* as descendants of these. VR will be used for the sorts of things we might use a computer for today, and AR for times when we would use a phone today. Unlike computers and phones, these might end up being the same device in two different modes, but no one knows exactly what it will look like yet. I wrote a bit about that here: The last screen you'll ever need.

*I say roughly because they will not be 1:1 replacements, and they will be able to do things well beyond what is possible today, but it's a useful framework to start with.

For AR to work, the digital elements on the screen need to blend seamlessly with the physical world. Perhaps one day this will be done with holograms or brain implants, but likely it will start with passthrough devices, meaning that you will see a screen that displays physical reality through a series of cameras and sensors.

You can try it right now by opening the camera on your phone and holding it up in front of your face. You can still see the world, but it's through the camera lens. Smart glasses will be like taping your phone to your face.

That's exactly what I wanted to try out five years ago, as I was starting to put together a unit for my PM course at Columbia. (A precursor to the metaverse course I will hopefully be teaching in the fall...) I wish I could find other pics from the day, but I recently came across this gem, which reminded me of it.

I had downloaded a VR camera app on my iPhone that basically splits the camera feed into a stereoscopic view. Then I ordered one of those "make your phone a VR headset" things. And voilà, I had a passthrough AR device ready to go.
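For the technically curious, here's roughly what an app like that is doing under the hood. This is my own minimal sketch in Swift with AVFoundation, not the actual app's code; the class name and layout are invented for illustration, and a real version would add lens-distortion correction for the headset optics and a camera-permission string (NSCameraUsageDescription) in Info.plist.

```swift
import UIKit
import AVFoundation

// Hypothetical reconstruction: one camera session rendered twice, side by
// side, so a phone in a cardboard-style holder shows a copy of the feed
// to each eye.
final class StereoPassthroughViewController: UIViewController {
    private let session = AVCaptureSession()
    private let leftEye = AVCaptureVideoPreviewLayer()
    private let rightEye = AVCaptureVideoPreviewLayer()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Use the rear (world-facing) camera as the passthrough source.
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Multiple preview layers can share a single capture session;
        // each layer becomes one "eye" of the stereoscopic view.
        for eye in [leftEye, rightEye] {
            eye.session = session
            eye.videoGravity = .resizeAspectFill
            view.layer.addSublayer(eye)
        }

        // startRunning() blocks, so kick it off the main thread.
        DispatchQueue.global(qos: .userInitiated).async { [session] in
            session.startRunning()
        }
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        // Split the landscape screen in half, one copy of the feed per eye.
        let half = view.bounds.width / 2
        leftEye.frame = CGRect(x: 0, y: 0,
                               width: half, height: view.bounds.height)
        rightEye.frame = CGRect(x: half, y: 0,
                                width: half, height: view.bounds.height)
    }
}
```

That duplicated feed is all you need for crude passthrough; real headsets add per-eye cameras, depth sensing, and far lower latency.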

I decided to take a stroll through NYC. And yes, I had others (my siblings and frequent partners in crime), with their zero-latency natural eyeballs, join me so that I wouldn't die. We walked down a busy sidewalk, crossed the street, and went into a coffee shop where I ordered a coffee from a perplexed barista. This photo is of me on my way back to the office afterwards.

Of course, everyone I passed on the street thought I was a jackass. Which was... technically accurate. But I was used to those stares: it was the same when I wore Google Glass or any of the other early AR glasses I've tried over the years. Obviously real smart glasses will look a lot better than this, but I do think they will look dumb at first. Early adopters won't care (or, like me, will like the attention), and then eventually everyone will get used to it and join in. AirPods are a good analogue: a lot of early reviewers said no one would ever want those white things sticking out of their ears. 🤔

Back at the office after successfully getting my coffee, we all took turns with the headset, trying different tasks like catching a ball, pouring liquid from one cup to another, etc. I vaguely remember trying to play ping pong? The visibility and latency were terrible. Hilarity ensued.

Now skipping ahead to the present, these sorts of devices are getting closer and closer to being ready for primetime. When I saw that Meta/Oculus had introduced its Passthrough API last year, my head basically exploded. It's really happening. These are the building blocks. The days of the smartphone are numbered, and soon, we'll all happily look like jackasses.