Tagged: AI-Glasses XR Even-Realities Opinion

AI-enabled XR glasses are having their Pebble moment

I had the opportunity to try Even Realities’ new G2 glasses this week. I went in half-expecting an iPhone moment, given the form factor and the marketing, and walked out with something different. It felt familiar, but not in the way I’d predicted. It felt like a Pebble moment.

I mean that as the highest compliment.

Why Pebble

The Pebble smartwatch landed in 2013, two years ahead of the Apple Watch, with a black-and-white e-paper display at 144×168 resolution, a battery that lasted a week, and a peculiar charging connector. By the time Apple shipped, Pebble had built a community, an app store, a developer culture and a fiercely loyal audience.

The product was rudimentary. Things people would later call essential were missing: no proper health tracking, no real fitness story, no LTE, no app gating. The display couldn’t render anything richer than a notification or a glanceable workout stat. The hardware quietly broadcast that these were the early days of a new category.

And yet, Pebble shipped early enough to define what people wanted next: notifications on your wrist, music control without pulling your phone out, a timer you could glance at without looking rude in a meeting. It pioneered the idea that a watch could be a peripheral to your phone rather than a replacement for it. By the time the Apple Watch launched, the consumer use cases were already understood, the third-party developer instinct was already trained, and the basic shape of a “smart” watch was already in collective memory. Pebble lost the market and shaped the category.

That’s worth dwelling on: Pebble was a forerunner, not a winner. It didn’t survive as a business in its original form. It still sells today (now rebooted) to an audience that knows exactly what it wants from it, but the market crown went to incumbents who arrived late with more capital, more scale, and more polish. The forerunner role is unglamorous, but it is the thing that makes the category legible in the first place.

That’s where AI-enabled XR glasses are now. Clearly early but at the start of something real.

The form factor breakthrough

Here’s the part that surprised me most: the G2 is the first pair of XR glasses that actually look and feel like glasses.

That sentence sounds modest. It isn’t. The category has been trying to clear this bar for a decade and, until now, it has failed.

Not ski goggles, not a visor, not a developer kit strapped to your face. Pebble looked like a watch; the G2 looks like eyewear. That’s not a small thing; it’s the difference between something you’d wear and something you’d demo. It’s the bar that decides whether a product gets shown off in a shop window or zipped back into a backpack the moment the press demo ends.

Once the form factor is right, everything else gets to be evaluated on its merits. Until then, every feature is competing with the social cost of looking like you’ve borrowed your goggles from a chemistry lab.

What I liked

No camera

The Even Realities products ship without a camera, and I think that’s the right call.

I don’t want one on my face and I don’t want one on anyone else’s. That’s the social argument, and it’s a real one. Cameras on glasses are perceived as intrusive in a way that cameras on phones are not, because phones telegraph their intent (you have to lift it, point it, frame the shot) and glasses don’t. Google Glass discovered this, expensively, in 2013. Snap Spectacles discovered it less expensively but no less clearly across multiple generations. Meta Ray-Ban has rehabilitated the camera-on-glasses idea somewhat but mostly because the camera is conspicuously visible and the form factor is so familiar that people read the device as a known quantity.

There’s a pragmatic point too: a camera on glasses is never going to rival the one already in your pocket. Phone cameras are now multi-lens, computationally enhanced systems backed by gigabytes of model weights and thousands of milliamp-hours of dedicated battery. A glasses camera is, given the obvious physical constraints, smaller, with worse optics, less compute, and less power. It’s a worse camera than the one you already own, attached to a more socially uncomfortable form factor. It’s the wrong place to put a camera.

The smartwatch market told us this story already. The Apple Watch shipped cameraless while early Samsung rivals had cameras built in, and the market quietly settled on no-camera as the norm. Discretion won. Same gravity applies here.

The R1 ring is part of the product

You can buy the glasses on their own. You shouldn’t. The R1 ring is what turns them from glasses-with-AI into a usable input loop. I want to flag this because it’s the right answer to a question the category has been getting wrong for years: how do you control glasses without looking strange?

The candidate input paradigms are all problematic on their own: voice means talking into thin air in public, and a phone in your hand defeats the point of wearing glasses at all.

A ring is a different answer. The input is invisible to anyone watching and the gestures map to glanceable actions rather than full keyboard equivalents, which is what glasses need. Together, the glasses and ring give you a control surface that doesn’t need a phone in your hand or a voice in thin air. The combination is what makes the experience work.

Treating the ring as part of the product (rather than an upsell accessory) is a quietly important design decision.

The AI on-device experience

The AI is on point: dictation, live conversation summaries, quick lookups. Not demoware, but the things I’d reach for daily.

This is the part that separates the G2 from the Pebble analogy slightly. Pebble’s killer feature was the notification, which is a passive surface. The G2’s killer feature is the AI, which is an active one. The product is at its best when it’s giving you a useful response to something you’ve just done: captured a thought, finished a meeting, glanced at a sign in a foreign language. The use cases that work are the ones where you’d otherwise have pulled out your phone, opened an app, waited, and broken the moment. The glasses collapse that loop.

Where the AI doesn’t work YET is anywhere it would on a laptop: long-form reasoning, multi-turn conversations, deep research. That’s fine. That’s not what glasses are for. The trap most XR products fall into is trying to be a full computer on your face. The G2 doesn’t try, and that’s part of why it works.

What I didn’t like

There were two things, both structural rather than product flaws.

This is going to be a constraint on how fast Even can scale, and it’s also going to be a moat against competitors who don’t think about distribution this way. Whether that trade-off survives contact with mass-market demand is one of the more interesting open questions about the company.

What comes next

Two predictions.

1. AI-enabled XR glasses are coming. Not in five years. The loop is closing now.

The G2 is not the device that wins this category. The device that wins this category will be made by a company with the scale to manufacture at consumer-electronics volume, the brand to clear retail shelves and the AI infrastructure to deliver the on-device experience without lag. The candidates are obvious: Apple, Meta, Google, possibly Samsung. None of them has shipped this product yet. Apple is strong on hardware but light on shipping pressure and on AI. Meta has Ray-Ban scale but a camera trade-off the market may eventually punish. Google is on its third or fourth attempt at this category and may finally have learned. Samsung has the supply chain but rarely the product taste.

The wildcard is whether a smaller player like Even can ride the early-adopter market long enough to scale into the mainstream before an incumbent copies the form factor. Pebble couldn’t, and that’s the cautionary tale. But the form factor advantage Even has is real, and the optician distribution channel is harder to replicate than it looks. They have a runway. Whether it’s long enough is the question.

2. The winning form factor will look more like the G2 than like anything with a camera array bolted to it.

This is a longer-term prediction/wish/hope: discretion will matter more than capability. The history of consumer wearables is the history of devices that quietly faded into the background of daily life: wristbands became watches, watches became phones-on-the-wrist, earphones became invisible. The form factors evolved differently, but the same thing happened underneath each one: more capability, more software, more sensors, all hidden inside a device the wearer wanted to keep wearing.

The G2 isn’t the endpoint of that trajectory. But it’s the first product in this category that’s pointed in the right direction. That’s the Pebble moment. Not the device that wins, but the device that makes everyone else realise what winning looks like.