Over the past several months, Meta has been revealing glimpses of a new virtual reality headset and some of the technology that powers it, part of the company’s strategy to “never surprise people” when they encounter the next wave of innovation. And today at its annual developers conference, Meta officially unveiled this new headset.
It’s still surprising.
The headset, called the Meta Quest Pro, is Meta’s next big (and bigly-funded) step into the metaverse—a computing future so important, Mark Zuckerberg said today, that he renamed the company after it. Formerly known as Project Cambria, this new VR headset has a slimmer profile than the previous Meta Quest 2. And it includes new optical technology that’s designed to make immersive computing more realistic and social.
The Meta Quest Pro apps shown off during press demos last week were a mixed bag. The color pass-through imagery—the information about the real world being taken in by the cameras mounted on the outside of the device—sometimes appeared aberrated at the edges. Using Horizon Workrooms, Meta’s app for conducting business in VR, felt awkward. (Some of Meta’s own employees are reportedly skeptical of chief executive Mark Zuckerberg’s broad vision for the metaverse, and are using Meta’s Horizon software less than expected.) Zuckerberg, in his keynote today, tried to position the metaverse as people-centered, rather than app-centric, because of the potential for social interactions. But social experiences rely on broader adoption of these virtual worlds.
The new Meta Quest Pro also costs $1,499, which might come as the biggest surprise. This is not a headset that’s accessible to most consumers, nor is VR in general far enough along to compel them to spend that much on a headset. The Meta Quest Pro is Meta’s attempt to prove that it can build this next generation of virtual reality computers, and that real-time social interactions are possible in VR.
The result is a paradoxical computing platform, one that is technologically advanced and has the ability to catapult users into the virtual reality future, but still may not be the device to make VR totally mainstream. It is both a virtual reality and “mixed reality” headset. It’s a great escape from reality, but a good reminder that physical presence is better. Its apps are fun, but sometimes glitchy. The headset, which looks like a pair of high-end ski goggles, is comfortable at first; it also leaves a deep impression on your forehead after extended use.
Meta executives have touted the Meta Quest Pro’s next-generation optics. Because the device uses pancake lenses, it has a 40 percent thinner optical module, while the displays have greater pixel density. The headset also has continuous lens spacing, which adjusts the lenses to accommodate a range of interpupillary distances. All of this allows wearers to actually read text in VR, such as emails and messages, a crucial part of Meta’s pitch that we should all be wearing these headsets for work.
Another feature that Meta has emphasized is color pass-through imagery. On the Meta Quest 2, the real-time imagery captured by the headset’s external cameras—say, of your living room as you’re drawing a virtual boundary in VR—comes through in black and white. On the Meta Quest Pro, it’s in color.
The Meta Quest Pro allows gaps of light to come in through the sides of the headset, unlike the Meta Quest 2, which fully envelops the upper part of your face. Magnetic sidelight blockers, which ship in the kit, will turn it into a fully immersive headset. This open periphery, combined with the image sensors on the headset, makes the Meta Quest Pro a mixed-reality device. You could be playing a game but still have eyes on the pets running into the room or the full coffee cup on the table. Or, with pass-through technology, digital objects or anchors might exist as overlays or portals to “the real world.”
In addition to new outward-facing image sensors, the Meta Quest Pro has inward-facing sensors that detect eye movements and other facial expressions. This means the face of your cartoonish avatar, as it interacts with other people in the metaverse, will mimic the expressions your actual face is making. It also means that Meta theoretically has the ability to capture emotional expressions from your headset.
Rupa Rao, a product management leader at Meta who presented the new headset to press last week, said that this feature enables users to be “your true authentic self, and use all of those vulnerable communication skills that we normally do, such as raising eyebrows, smiling, frowning, everything else.” These facial expression tools will also be available in a software development kit for app makers to use when building their apps.
When asked whether Meta or people developing apps for Meta VR are able to track emotions, another Meta product manager, Nick Ontiveros, said that Meta doesn’t infer emotions from the Application Programming Interface, the tool developers use to channel information into and out of their apps. “We’re just focused on movement. And when any app wants to make use of this API, they need to state clearly to the users how they plan to use it, and users always have the opportunity to, I guess, revoke permission or provide permission depending on the use case.”
Rao also pointed out that eye-tracking and facial expression features are turned off by default. If a Meta Quest Pro wearer does opt in to using them, the raw images are processed on the device itself and deleted after processing, she said.
The Meta Quest Pro runs on Qualcomm’s Snapdragon XR2+ chipset, which Meta claims gives it 50 percent more computing power than the Meta Quest 2, along with better thermal dissipation (meaning your face might stay cooler). And the Meta Quest Pro’s hand controllers have a new design. Unlike the Meta Quest 2’s controllers, these don’t have hardware rings around them, they don’t rely on the headset for positioning, and you can use them in activities and games that require precision pinching, like poking at a virtual Jenga tower.
Jenga was just one of many games and experiences I tried at Meta’s Reality Labs headquarters in Burlingame, California, last week. I stretched my face in ways that would put Jim Carrey to shame as I tested the eye-tracking and facial expression features. It was surreal seeing a green elfin character, my avatar, mimic these expressions. I intentionally broke virtual toys. I scribbled notes on an imaginary notepad.
I got lost in painting a messy masterpiece, though I fumbled the paintbrushes. Then I hung the virtual painting on a real-life wall. In theory, pinching your fingers to pick up objects is a great thing to be able to do in VR. In practice, it takes … practice. Also, when I tried the painting app, I had to try it on three different headsets, because of what was described as the earthquake effect: The software would glitch and shake, and virtual paint cans would scatter around the room.
I took a live DJ lesson from a real-life DJ, although that person presented as an avatar (just like I did) and was somewhere else entirely, spinning turntables on what might as well have been a different planet. Florida? London? Who knows. I used awkward precision-pinching to turn the knobs and push some faders on my own virtual DJ mixer. The point of the demo was not to test my DJ abilities or even my interest, but to showcase how social presence would feel in a live VR tutorial. Similarly, in an app called Wooorld—two Os is a typo and three Os is an app name, its creator told me—I stood next to a friendly avatar named Paul while we played a game based on Google Maps. The app would drop us somewhere, anywhere, in Europe. By using context clues and virtually traversing the Google Street View map, we would have to guess where we were. I really enjoyed this.
The final demo of the day was of Meta’s own app, Horizon Workrooms, which is currently in beta. This felt the most forced of all the VR apps I’d tried that day, in the sense that it tried to recreate common workplace interactions in VR and leaned heavily on that concept of social presence, though once again, everyone presented as a cartoon. Navigating an airy virtual conference room—even if the background is Aspen-like—and slapping a virtual Post-it on a virtual whiteboard so my virtual pal Jordan can remark on it doesn’t feel like much of an improvement over sharing a Google Doc on 2D screens.
Horizon Workrooms does let you cast three virtual monitors in front of your eyes, which is great if you don’t have the bucks or space to use three physical monitors at your desk. But its solution for a keyboard is to lay a virtual keyboard over a real-life one, which didn’t line up perfectly in my experience; or to have you peek below your headset to just use the real-life keyboard. At that point, I was relieved to take the Meta Quest Pro off.
Meta has managed to grab the overwhelming share of the VR headset market, thanks in no small part to its extremely capable $400 Meta Quest 2 headset (which was $300 at one point before a price increase). Analysts estimate that Meta has sold more than 15 million units of the Quest 2 headset.
The Quest 2 currently supports more than 400 apps, all of which will run on the Meta Quest Pro, and third-party app makers are starting to generate revenue from Meta’s VR platform. Meta said that 33 titles have made over $10 million in gross revenue, and 55 titles have made over $5 million. Hundreds of thousands of people use Meta’s Horizon Worlds app in VR.
Still, by Meta’s own measurements, this is a small footprint. And at nearly $1,500, the Meta Quest Pro might not be the headset that expands that footprint greatly. As with a lot of futuristic computing experiences, several truths can be held at once: that something could be mind-numbingly cool and still not have a place in real life; that Mark Zuckerberg’s metaverse vision is worth paying attention to, while also being premature; and that it’s all literal fun and games until you take the headset off, and remember why it’s easy to feel uneasy about where the future is headed, and who might own it.
Update, October 11 at 3:20 pm: This story was updated to correct an error in attribution. The statement about how Meta is tracking movement and not emotions was given by Nick Ontiveros, Meta's product manager for Social Presence.