Why is Apple using the term ‘spatial computing’ with its new Vision Pro headset?

February 2, 2024

SAN FRANCISCO (AP) — With Apple’s highly anticipated Vision Pro headset hitting store shelves Friday, you’ll likely start seeing more people wearing the futuristic goggles that will supposedly usher in the era of “spatial computing.”

It’s an esoteric concept that Apple executives and their marketing gurus are trying to bring into the mainstream, sidestepping more commonly used terms such as “augmented reality” and “virtual reality” to describe a product they believe could be as transformative as the iPhone, released in 2007.

“We can’t wait for people to experience this magic,” Apple CEO Tim Cook said while discussing the Vision Pro with analysts on Thursday.

The Vision Pro will also be among Apple’s most expensive products at $3,500, a price point that leads most analysts to predict the company will sell no more than 1 million devices in the headset’s first year. But Apple sold only 4 million iPhones in that device’s first year on the market and now sells more than 200 million a year, a history in which what initially seemed like a niche product evolved into something intertwined with the way people live and work.

If that happens with the Vision Pro, references to spatial computing could become as ingrained in today’s vernacular as mobile and personal computing, two previous technological revolutions that Apple played an integral role in creating.

So what is spatial computing? It’s a way of describing the intersection between the physical world around us and the virtual world produced by technology, enabling people and machines to manipulate objects and spaces in harmony. Cathy Hackl, a longtime industry consultant who now works at a startup, said performing these tasks often involves elements of augmented reality, or AR, and artificial intelligence, or AI, two subsets of technology that help power spatial computing in applications built for the Vision Pro.

“This is a very important moment,” Hackl said. “Spatial computing will enable devices to understand the world in ways they never could before. It will transform human-computer interaction, and eventually every interface (whether a car or a watch) will become a spatial computing device.”

In a sign of the excitement surrounding the Vision Pro, more than 600 newly designed apps will be available on the headset immediately, according to Apple. The lineup will include a wide range of television networks, video streaming services (although Netflix and Google’s YouTube are not on the list), video games and a variety of educational options. On the business side, the video conferencing service Zoom and other companies that provide online meeting tools have also developed apps for the Vision Pro.

But the Vision Pro could also reveal a troubling aspect of the technology: if spatial computing becomes so compelling that people start to see the world differently when they’re not wearing the headset, and begin to believe that life is far more interesting when viewed from behind the goggles. That scenario could worsen the screen addictions that have become common since the iPhone’s launch and deepen the isolation that digital fixation tends to foster.

Apple isn’t the only leading tech company working on spatial computing products. For the past few years, Google has been developing a three-dimensional video conferencing service called “Project Starline” that uses “photorealistic” imagery and a “magic window” to make two people in different cities feel as if they are in the same room together. But Starline still hasn’t been widely released. Facebook’s corporate parent, Meta Platforms, has been selling its Quest headset, which can be seen as a spatial computing platform, for years, but the company hasn’t positioned the device that way until now.

The Vision Pro, by contrast, is backed by a company whose marketing prowess and customer loyalty tend to set trends.

While the Vision Pro is being heralded as a breakthrough, the concept of spatial computing has been around for at least 20 years. In a 132-page research paper on the subject published at the Massachusetts Institute of Technology in 2003, Simon Greenwold suggested that the automatic flushing of toilets was a primitive form of spatial computing. Greenwold supported his reasoning by noting that the toilet “senses the user’s receding motion to trigger the flush” and that “the interaction space of the system is real human space.”

The Vision Pro is, of course, far more advanced than a toilet. One of its most intriguing features is its high-resolution screens, which can play back three-dimensional video recordings of events and people, making the encounters feel as if they are being relived. Apple has already laid the groundwork for selling the Vision Pro by including the ability to record what it calls “spatial video” in the premium iPhone 15 models released in September.

Apple’s headset also responds to users’ hand gestures and eye movements, making the device feel almost like another piece of human physiology. While wearing the headset, users can use just their hands to pull up and arrange a series of virtual computer screens, much like the scene featuring Tom Cruise in the 2002 film “Minority Report.”

“Spatial computing is a technology that begins to adapt to the user, rather than requiring the user to adapt to the technology,” Hackl said. “Everything needs to be very natural.”

We’ll see over time how natural it seems to sit down to dinner with someone wearing the goggles, instead of intermittently staring at their smartphone.
