On December 22, people familiar with the matter said Apple is ramping up production of its head-mounted spatial computing device, the Vision Pro, ahead of a planned February release. According to these people, who spoke on condition of anonymity, factories in China have been producing the headsets at full speed for weeks. The goal is to have customer-ordered units ready by the end of January, with the official launch and retail availability the following month.
According to Apple's announcement at WWDC23, the Vision Pro will launch first in the US market in early 2024, followed by other countries and regions. Since the device has already been officially announced, Bloomberg reporter Mark Gurman believes Apple is unlikely to hold another press event for it. But as a new product, and one priced at $3,499, the Vision Pro will still require considerable promotional effort. Given the device's complexity and special features, Apple Store employees may face a heavier workload: not only guiding new users, adjusting the fit, and handling setup, but also a key detail. Apple and Zeiss have created custom optical lenses for the Vision Pro, and helping each user choose the right lenses will add to store staff's burden.
To help consumers adopt the Vision Pro more quickly, Apple has convened the heads of its retail stores across the country for urgent training on Vision Pro services, and some Apple retail stores have begun setting up Vision Pro demo areas. Apple has even applied for patents related to displaying the Vision Pro in stores.
To make the most of the new hardware, the Vision Pro ships with a dedicated operating system, visionOS, unveiled alongside the device. Gurman noted that the next version of visionOS should arrive later in 2024, alongside the usual Mac and iPhone software updates. And with the recently released iOS 17.2 update, iPhone 15 Pro models can record spatial video at 1080p resolution and 30 frames per second, footage that is meant to be viewed on the Vision Pro.
Developers are also preparing for the launch. A few days ago, Apple sent developers a notice telling them to "get ready," inviting them to use the Xcode 15.2 beta, released last week, with the visionOS SDK and Reality Composer Pro for development. In the email, Apple suggested that developers test with the visionOS simulator, or request feedback from Apple.
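For readers curious what developing against the visionOS SDK actually looks like, here is a minimal sketch of a visionOS app target. It assumes the Xcode 15.2 beta with the visionOS SDK installed; the app name `HelloSpatialApp` and the window contents are hypothetical examples, not from Apple's email.

```swift
import SwiftUI

// Hypothetical minimal visionOS app: this is the kind of project the
// visionOS SDK's app template generates, runnable in the visionOS
// simulator that Apple recommends for testing.
@main
struct HelloSpatialApp: App {
    var body: some Scene {
        // On visionOS, a WindowGroup is presented as a floating
        // window placed in the user's space.
        WindowGroup {
            VStack(spacing: 12) {
                Text("Hello, visionOS")
                    .font(.largeTitle)
                Text("Built with the visionOS SDK")
            }
            .padding()
        }
    }
}
```

From here, developers would typically add 3D content with RealityKit assets authored in Reality Composer Pro, per the tooling named in Apple's notice.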
At this year's WWDC23, Apple already proposed gradually bringing iPhone and iPad App Store content to the Vision Pro, and in developer materials released in July it provided a detailed tutorial. One of the simplest and most direct approaches is to enable the built-in "Migrate to Vision Pro" option and publish the app directly on the Vision Pro App Store. However, not all applications can be directly adapted to the Vision Pro's new hardware; many features require targeted debugging by developers, so the process will take time.
All signs suggest that Apple's Vision Pro is getting closer and closer to users. With its mission to "redefine spatial computing," the Vision Pro seems to hold a magic that keeps attracting the attention of developers and users across Apple's vast ecosystem. iOS 17.2's early support for spatial video capture is just one of many pieces of pre-launch groundwork, and it has whetted consumers' appetites. For ordinary users, however, it will likely take some time to experience the Vision Pro: it will first be sold offline in the United States and only gradually expand to the global market later in 2024, and offline queues may well be measured in days. But none of that will deter Apple fans and lovers of cutting-edge technology, and we too look forward to the Vision Pro's release next February, to see whether this groundbreaking spatial computing device can deliver what Cook called an "epiphany moment."