June 5, 2023 will go down as the day Apple started something big. It’s different from what the world expected. It’s more expensive, it’s more ambitious, and it has a much longer runway.
But while the Apple Vision Pro looks like a set of magic ski goggles, it’s actually a computing platform that might eventually take over a lot of what we do today on smartphones, tablets, and computers. That’s because Apple made augmented reality the core of the product, rather than virtual reality.
At WWDC 2023, when Tim Cook announced that the Vision Pro was an AR headset rather than the expected VR headset, the live audience of developers and journalists at Apple Park fell into a shocked silence — and for good reason.
Here are my first impressions from on the ground at the event.
Also: Every hardware product Apple announced at WWDC 2023: Vision Pro headset, Mac Pro, more
AR is a much bigger deal than VR
Most of the expectations swirling around the launch of an Apple headset centered on it being a VR device with a touch of AR thrown in. The reality was exactly the opposite: Vision Pro is an AR headset that includes some VR-like capabilities.
VR is naturally constrained by the fact that you are largely cut off from the world around you when you put a VR headset on. That makes for immersive experiences that can transport you to another place, but it also limits the amount of time most people will spend in a headset to 30 minutes or less per day.
On the other hand, AR glasses could shrink down considerably over the next decade and become a digital display that is overlaid on top of the majority of your daily experiences.
Tim Cook called it “the first Apple product you look through, and not at.”
It unites the digital world and the real world
Vision Pro is actually a mixed reality headset. It combines AR and VR. But since the world already barely understands AR and VR — even though we’ve been talking about them for over a decade — it’s helpful that Apple didn’t confuse people by introducing a whole new term.
Instead, Apple talked about new ways that Vision Pro can unite the online world, where so many of us now spend so much of our time, with our everyday life. Cook characterized it as “seamlessly blending the real world with the digital world.”
Again, because AR overlays digital information on top of the real world, that opens up entire categories of content and experiences where developers can build on existing activities, professions, hobbies, and passions rather than having to digitally recreate them in VR.
“Vision Pro blends digital content into the space around us,” Cook summed up.
Also: The 4 best AR glasses: Pro-level AR and XR headsets
It’s ‘a new kind of computer’
One of the biggest surprises for me was the way Apple demonstrated different interfaces for Vision Pro that mimicked the iPad, the Apple TV, and the Mac. Honestly, the iPad and Apple TV interfaces didn’t surprise me, but the Mac did. Apple showed users creating the equivalent of giant multi-monitor Mac setups within Vision Pro.
Cook made no attempt to downplay the significance of this, even calling Vision Pro "a new kind of computer" and saying, "In the same way that Mac introduced us to personal computing and iPhone introduced us to mobile computing, Apple Vision Pro will introduce us to spatial computing."
He added, “With Vision Pro, you’re no longer limited by a display. Your surroundings become an infinite canvas.”
I’ve got serious questions about how this will work (which I’ll get to in a moment), but the fact that Apple is even exploring the Vision Pro as a work and productivity tool was the most unexpected and delightful surprise of the event. This gives the device a much broader set of possibilities than any AR or VR device we’ve seen so far — and will make it a lot more interesting to a lot more ZDNET readers.
Also: Apple just announced a ton of software features at WWDC. Here’s everything new
A couple of big questions
Again, I have a lot of questions about how the Apple Vision Pro will work in the real world when it arrives next year for $3,499. I'll unpack more of those questions in the coming months as we learn more about the device and think through the possibilities.
For me, the most intense questions swirl around the concept of the Vision Pro as a virtual Mac desktop computer. I could run down a whole list of those questions, and eventually I'll make that its own article. But for now, I'll focus on the biggest one: how practical will it be to operate a virtual mouse and keyboard in that environment?
Ergonomics aside, moving your hands through the air to mimic mouse and keyboard motions might work for simple gestures like opening a website in Safari, but it's not likely to be very useful for extended or more complex tasks. Being able to pair a physical Magic Keyboard and Magic Trackpad and use them within the virtual space is a big plus. Using an iPhone or an iPad as a keyboard could be a helpful option, too.
The other question that looms largest for me is the extent to which the Vision Pro is meant to be a sedentary experience in your living room or den versus something you can move around with in the real world. All of the WWDC demos looked very sedentary and limited to indoor spaces.
One of the strongest long-term appeals of AR glasses is the ability to take them out into the real world and overlay digital information on experiences like a hike to Half Dome in Yosemite National Park. It feels like that kind of experience is still many years away, but I'd love to hear more of Apple's vision for how this product will be the first step on the journey to those kinds of experiences.