Meta has a major opportunity to win the AI hardware race

Source: The Verge

AI wearables have had a cruddy year.

Just a few short months ago, the tech world was convinced AI hardware could be the next big thing. It was a heady vision, bolstered by futuristic demos and sleek hardware. At the center of the buzz were the Humane AI Pin and the Rabbit R1. Both promised a grandiose future. Neither delivered the goods.

It’s an old story in the gadget world. Smart glasses and augmented reality headsets went through a similar hype cycle a decade ago. Google Glass infamously promised a future where reality was overlaid with helpful information. In the years since, Magic Leap, Focals By North, Microsoft’s HoloLens, Apple’s Vision Pro, and most recently, the new Snapchat Spectacles have tried to keep the vision alive but to no real commercial success.

So, all things considered, it’s a bit ironic that the best shot at a workable AI wearable is a pair of smart glasses — specifically, the Ray-Ban Meta smart glasses.

AI is but ONE feature on the Ray-Ban Meta smart glasses. That turned out to be clutch.

The funny thing about the Meta smart glasses is nobody expected them to be as successful as they are. Partly because the first iteration, the Ray-Ban Stories, categorically flopped. Partly because they weren’t smart glasses offering up new ideas. Bose had already made stylish audio sunglasses and then shuttered the whole operation. Snap Spectacles already tried recording short videos for social, and that clearly wasn’t good enough, either. On paper, there was no compelling reason why the Ray-Ban Meta smart glasses ought to resonate with people.

And yet, they have succeeded where other AI wearables and smart glasses haven't. Notably, they've outstripped even Meta's own expectations.

A lot of that boils down to Meta finally nailing style and execution. The Meta glasses come in a ton of different styles and colors compared to the Stories. You’re almost guaranteed to find something that looks snazzy on you. In this respect, Meta was savvy enough to understand that the average person doesn’t want to look like they just walked out of a sci-fi film. They want to look cool by today’s standards.

At $299, they're not cheap, but they're affordable compared to a $3,500 Vision Pro or a $699 Humane AI Pin. Audio quality is good. Call quality is surprisingly excellent thanks to a well-positioned mic in the nose bridge. Unlike the Stories or Snap's earlier Spectacles, video and photo quality is good enough to post to Instagram without feeling embarrassed — especially in the era of content creators, where POV-style Instagram Reels and TikToks do numbers.

Meta’s AI is sometimes finicky and inelegant, but it works on the device in a natural way.
Screenshot by Victoria Song / The Verge

This is a device that can easily slot into people’s lives now. There’s no future software update to wait for. It’s not a solution looking for a problem to solve. And this, more than anything else, is exactly why the Ray-Bans have a shot at successfully figuring out AI.

That’s because AI is already on it — it’s just a feature, not the whole schtick. You can use it to identify objects you come across or tell you more about a landmark. You can ask Meta AI to write dubious captions for your Instagram post or translate a menu. You can video call a friend, and they’ll be able to see what you see. All of these use cases make sense for the device and how you’d use it.

In practice, these features are a bit wonky and inelegant. Meta AI has yet to write me a good Instagram caption and often it can’t hear me well in loud environments. But unlike the Rabbit R1, it works. Unlike Humane, it doesn’t overheat, and there’s no latency because it uses your phone for processing. Crucially, unlike either of these devices, if the AI shits the bed, it can still do other things very well.

These look cool and normal.

This is good enough. For now. Going forward, the pressure is on. Meta's gambit is that if people can get on board with simpler smart glasses, they'll be more comfortable with face computers when AI — and eventually AR — is ready for prime time.

Meta has proved the first part of that equation. But if the second is going to come true, the AI can't just be okay or serviceable. It has to be genuinely good. It has to make the jump from "Oh, this is kind of convenient when it works" to "I wear smart glasses all day because my life is so much easier with them than without." Right now, a lot of the Meta glasses' AI features are neat but essentially party tricks.

It's a tall order, but of everyone out there right now, Meta seems the best positioned to succeed. Style and wearability aren't a problem. It just inked a deal with EssilorLuxottica to extend its smart glasses partnership beyond 2030. Now that it has a general blueprint for the hardware, iterative improvements like better battery life and lighter frames are achievable. All that's left to see is whether Meta can make good on the rest of it.

It'll get the chance to prove it can next week at its Meta Connect event, and the timing couldn't be better. Humane's daily returns are outpacing its sales. Critics accuse Rabbit of being little more than a scam. Experts aren't convinced Apple's big AI-inspired "supercycle" with the iPhone 16 will even happen. A win here wouldn't just solidify Meta's lead — it'd help keep the dream of AI hardware alive.


