Author: Zhou Yuan, Wall Street News.
If nothing else, Apple is about to create a new era of smart-terminal interaction. Just as on January 9, 2007, when Steve Jobs unveiled the first-generation iPhone, a new form of interaction is on its way.
Apple's current CEO, Tim Cook, recently told ** that the Apple Vision Pro (an MR headset) "has the potential to usher in a whole new era of spatial computing, with the same profound impact on how we work, communicate and experience the world as the iPhone had 16 years ago."
Hardware without software could not have taken hold in the PC era, nor in the mobile-internet era of smart-terminal interaction. The Apple Vision Pro therefore needs application software just as the iPhone did.
As the successor personally chosen by Jobs, Cook has spared no effort in carrying forward the hardware-software integration Jobs pioneered: under Apple's rules, developing application software for the Apple Vision Pro requires a Mac with Apple Silicon.
According to public information, Apple's Vision Pro sales target for 2024 exceeds 1 million units, and its planned production capacity for the year exceeds 4 million units.
If the target of more than one million units is met, the new interaction revolution brought by the Vision Pro will rewrite the current competitive landscape of consumer electronics.
Given its high price, limited portability and unsatisfactory battery life, the Vision Pro will not match the original iPhone of 16 years ago, which sold more than 1.5 million units in its launch quarter.
In fact, the original iPhone was not perfect when it was first released.
For example, the device had not been fully tested, the integration of software and hardware had flaws, there were too few applications (apps), and the cellular signal was poor. Even so, the original iPhone carried epoch-making technical and practical significance and was a genuinely groundbreaking product.
Therefore, despite its many shortcomings, and despite early Apple executive Jean-Louis Gassée's verdict that "the first iPhone was a waste," sales of the original iPhone still exploded: over the following eight years it sold more than 500 million units, and the phone "changed the rules of the game in the mobile phone industry forever."
It may not be appropriate to compare the Vision Pro with the sales of the original iPhone in the first quarter of its launch, because the Vision Pro has not yet been sold.
Wall Street News has noted that on this front the Vision Pro is moving faster than the original iPhone did: alongside the Vision Pro's unveiling, Apple released the Xcode 15 beta developer toolkit, and four months later shipped an iteration, the Xcode 15.1 beta. By contrast, it was only ten months after the launch of the original iPhone that Steve Jobs revealed a companion software development kit would be made available for it.
Of course, when the first iPhone launched, the App Store was not yet live. Jobs's strategy of integrating software and hardware had, at the level of the original iPhone, only just begun; it was far from being pushed to its limit.
What is hardware-software integration? And what is it worth to Apple?
Steve Jobs often said that Apple's core competitive strength lies in shipping complete devices, in which the integration of hardware and software gives customers an unparalleled experience. In plain terms, hardware-software integration means fusing the two seamlessly, with the aim of delivering an unprecedented "Apple experience."
Jobs believed that every interaction a user has with Apple, whether browsing, ordering, payment, logistics, delivery, apps, after-sales service or security, shapes the user's perception of the company, so every one of those interactions must be impeccable.
To do this, Apple has to control every link in the chain and in the ecosystem, in pursuit of Jobs's vision that the only reason for Apple to exist is to bring consumers products that can change the world.
For users, the significance of such a product also lies in the full span of the experience: from technology development, product definition and industrial design through to physical and online channels, everything is arranged to deliver an excellent experience.
Guided by this thinking, on October 4 Apple notified Vision Pro app developers that building apps for visionOS and the Vision Pro headset requires a Mac equipped with Apple Silicon.
Steve Jobs liked to be in control, which is why iOS is a closed ecosystem.
Implicit here is a question: under what circumstances must a user own, or actively want, a particular terminal? The answer is simple, and it has been proven before.
For example, anyone who wanted to buy and listen to digital music legally in the United States between 2003 and 2007 effectively had to have an iPod.
Now, developers who want to build applications for visionOS or the Vision Pro, and to do business around this new operating system and the new interactive smart device that runs on it, must have a Mac with Apple Silicon. This is the latest step in Apple's hardware-software integration strategy.
Apple believes that only by controlling every user-facing interaction can it guarantee that its uniquely excellent experience is actually delivered.
Xcode 15.1 beta is a development package for Apple devices that includes SDKs for iOS 17, iPadOS 17, tvOS 17, watchOS 10, macOS Sonoma 14 and visionOS.
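For a sense of what building against that visionOS SDK looks like, here is a minimal sketch of a visionOS app in SwiftUI. The app and view names are illustrative assumptions; the code relies only on standard SwiftUI types (App, WindowGroup, Text) that the visionOS SDK shares with Apple's other platforms.

```swift
import SwiftUI

// Minimal visionOS app skeleton. "HelloSpatialApp" and "ContentView"
// are illustrative names; the APIs used are standard SwiftUI.
@main
struct HelloSpatialApp: App {
    var body: some Scene {
        // On visionOS, a WindowGroup appears as a window the wearer
        // can place anywhere in the room.
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    var body: some View {
        VStack(spacing: 16) {
            Text("Hello, spatial computing")
                .font(.largeTitle)
            Text("Built with the visionOS SDK in Xcode 15.1 beta")
                .foregroundStyle(.secondary)
        }
        .padding()
    }
}
```

Per the requirement described above, a project like this can only be compiled on a Mac with Apple Silicon.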
Although Apple is optimistic about the Apple Vision Pro's future value, its steep retail price of $3,499 may leave those expectations unfulfilled.
Recent reports put Apple's 2024 sales guidance for the Vision Pro at more than 1 million units; earlier reports put planned shipments at over 4 million units in 2024 and more than 10 million in 2025.
Judging from Cook's expectations for the Apple Vision Pro, sales are not at all the focus for this "Attila the Hun of inventory." His expectation is staggering: the Vision Pro will replace the iPhone within the next 10 years.
In fact, the iPhone upended the traditional mobile phone industry with a highly innovative interaction method, multi-touch. If the Vision Pro is to replace the iPhone, its breakthrough must likewise come from an innovation in how we interact.
In fact, the Apple Vision Pro supports multiple forms of interaction: eyes, hands and voice. Users browse an app simply by looking at it, tap their fingers together in mid-air to select, flick a wrist to scroll, or dictate by voice; the headset also works with Apple's Magic Keyboard and Magic Mouse for conventional input.
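In code, much of this comes for free: visionOS maps the look-and-pinch gesture onto ordinary SwiftUI controls. The sketch below is an illustration under that assumption, not Apple sample code; it shows a standard Button the wearer activates by looking at it and pinching, with no gaze-specific code required.

```swift
import SwiftUI

// Hedged sketch: a plain SwiftUI Button on visionOS responds to
// "look + pinch" automatically, because the system translates eye
// targeting plus a finger pinch into a standard tap.
struct InteractionDemoView: View {
    @State private var selectionCount = 0

    var body: some View {
        VStack(spacing: 20) {
            Text("Selections: \(selectionCount)")
                .font(.title)

            // The wearer looks at the button and pinches to activate it;
            // the same control also works with a connected keyboard,
            // trackpad or voice control.
            Button("Select") {
                selectionCount += 1
            }
        }
        .padding()
    }
}
```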
In Steve Jobs's words, through Apple's continuous technological innovation, the interaction between users and technology can always be "simple and magical."
Compared with existing smart devices, especially smartphones, the Vision Pro is positioned as a spatial computing device. It runs the new visionOS and presents a 3D interface: users can freely adjust the size and position of applications, breaking the physical boundaries of a traditional display.
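As a rough sketch of how that 3D interface is expressed to developers, the snippet below uses the visionOS SDK's volumetric window style; the scene identifiers, sizes and sphere content are illustrative assumptions.

```swift
import SwiftUI
import RealityKit

// Hedged sketch of a visionOS app whose UI is not confined to a flat
// rectangle: one ordinary window plus one volumetric window with depth.
@main
struct SpatialWindowsApp: App {
    var body: some Scene {
        // A conventional 2D window the wearer can move and resize
        // anywhere in the room.
        WindowGroup(id: "main") {
            Text("A resizable window floating in space")
                .padding()
        }

        // A volumetric window: RealityKit content with real depth.
        WindowGroup(id: "volume") {
            RealityView { content in
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.2),
                    materials: [SimpleMaterial(color: .white, isMetallic: false)]
                )
                content.add(sphere)
            }
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)
    }
}
```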
Until you see and try the Vision Pro for yourself, it is hard to imagine what kind of new experience this new form of interaction brings.
Developers have repeatedly stressed that on Apple's Vision Pro, the immersive, surround-display headset due in early 2024, apps like Night Sky will be very impressive.
"We have developer labs in London and Munich, and we're seeing some incredible work. ”
Andy Weeks developed the Night Sky app, which uses augmented reality: when users point their iPhone at the sky, the stars, planets and even the position of the International Space Station appear on screen. Night Sky can also send reminders and messages so that users can share with friends and family what they have seen, or even what they cannot see.
"The first time I experienced the Vision Pro headset, it was an 'aha moment,'" he said. "I'm now more convinced of the far-reaching implications of spatial computing. When you try it for yourself, you feel a sudden realization. In a person's lifetime, such moments are rare."
"Aha moment" is also rendered as "epiphany moment" or the "eureka effect," the instant a great idea is born.
The term was first proposed by the German psychologist Karl Bühler more than 100 years ago; he defined it as a special, pleasurable experience in which a person suddenly gains clear insight into a previously murky situation.
Like Steve Jobs, Cook is a practitioner of the idea of distilling simplicity out of complexity. Jobs believed that a good product should be simple to use and that its design should carry artistic and emotional power. At the software level, beyond that artistic and emotional soul, Cook holds that the core of the idea is the simplicity and ease of use of the tool.
At the technical and industrial level, Cook sees the Vision Pro headset as having the potential to usher in a new era of computing. As with the iPhone, Apple has built a platform for developers to help them create world-changing apps. The Vision Pro is expected to have a revolutionary impact on work, communication, gaming and more, and Apple intends to use it to further cement its industry leadership.
Even if the Vision Pro really does hit its target of more than 1 million units in 2024, that is nothing like the sales frenzy the original iPhone set off.
The significance of the Vision Pro, however, lies not in sales but in the market education, user feedback, ecosystem build-out and new industry opportunities that its new interaction methods bring.
Cook says he believes in the far-reaching impact of the spatial computing that the Vision Pro brings to life. "Just as the Mac brought us into the era of personal computing and the iPhone brought us into the era of mobile computing, the Apple Vision Pro will take us into the era of spatial computing," Cook said.
What is spatial computing? At present, the concept is discussed mainly in terms of the fine-grained properties of MR.
Zhao Xing, president of the Joint Research Institute of Metaverse and Virtual-Reality Interaction, told Wall Street News: "Spatial computing covers a series of theories, technologies and tools for collecting, acquiring, processing and interacting with three-dimensional (3D) multimodal data."
The key technologies of spatial computing include 3D reconstruction, spatial perception, user perception and spatial data management. Viewed by where the computing power sits, spatial computing can also be divided into on-device computing, cloud computing, and cloud-edge-device collaborative computing, with 5G as an important transport medium.
Through spatial computing, people, things, terminals and virtual space can be seamlessly connected, building digital twins, creating a new economic form that fuses the virtual and the real, and ultimately driving a new round of industrial transformation that pushes MR and other metaverse-related industries toward development and adoption.
Unlike traditional desktop and mobile computing, spatial computing is not confined to the rectangular box of a physical screen; it flows freely through the environment around the user. More natural inputs such as voice, vision and gesture will further enrich traditional interaction modes like the mouse, keyboard and touchscreen, letting people access and interact with computing in whatever way best fits their current scene and workflow.
Describing the technology in words is still too abstract. Put differently: what technical principles and processes are at work when someone experiences the Vision Pro?
The Vision Pro is Apple's first 3D camera. It carries one LiDAR unit and two depth sensors, which handle functions such as SLAM-based spatial perception, gesture recognition and 3D modeling, and it can capture scenes in 3D. Afterwards, paired with immersive spatial audio, the user can relive the emotional moment of the scene just captured, while the user's real physical surroundings are mapped in 3D and merged into that captured space, becoming part of it.
It is an immersive, interactive experience unlike anything seen before. It builds on the work of 3D sensors, including the LiDAR and depth sensors, which capture, fuse and accurately represent the user's surroundings so that the Vision Pro can render digital content precisely in space.
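As a rough sketch of how an app might consume that sensor-derived scene understanding, the snippet below uses the ARKit types the visionOS SDK exposes for scene reconstruction (ARKitSession, SceneReconstructionProvider); the surrounding class and the handling of updates are illustrative assumptions rather than Apple's reference code.

```swift
import ARKit

// Hedged sketch: subscribe to the mesh of the reconstructed room that
// the Vision Pro builds from its LiDAR and depth sensors.
final class SceneMapper {
    private let session = ARKitSession()
    private let reconstruction = SceneReconstructionProvider()

    func start() async {
        // Scene reconstruction is only available on supported hardware.
        guard SceneReconstructionProvider.isSupported else { return }
        do {
            try await session.run([reconstruction])
            // Each update carries a mesh anchor describing a patch of
            // the surrounding room geometry.
            for await update in reconstruction.anchorUpdates {
                print("Mesh anchor \(update.anchor.id), event: \(update.event)")
            }
        } catch {
            print("Failed to start scene reconstruction: \(error)")
        }
    }
}
```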
Once the market has been educated and enough early user feedback has been gathered, spatial computing will reshape everything that involves visual or virtual-real interaction, such as office work, entertainment, e-commerce, healthcare, engineering and education.
An executive at a VR technology company in Shanghai told Wall Street News, "Apple's Vision Pro is too expensive; there will be substitute products in China." That, objectively, will accelerate the market education of MR.