Apple Vision Pro detailed disassembly and technical interpretation

Mondo Technology Updated on 2024-03-03

The Apple Vision Pro is a mixed reality headset that combines virtual reality (VR) and augmented reality (AR) technologies to give users an immersive, interactive experience. The device performs well in terms of display quality and interactive experience, but battery life and weight still leave room for improvement. This time, we will disassemble the Vision Pro and analyze its structure and construction, as well as some of the technologies it uses.

The front glass plate is glued in place, and removing it took a lot of time and effort, but we got it off without breaking it. It didn't come out entirely unscathed, though: the protective plastic film on the glass peeled off a little and probably melted slightly. Apple's retail repair technicians may be faster than we are, but they'll charge you $799 for a replacement.

1. Heat the glass shell.

2. Pry open the corners.

3. Separate the glass shell.

4. The glass shell in full. At 34 grams, the glass may not be heavy on its own, but with the battery, the Vision Pro weighs more than a kilogram.

The Vision Pro battery pack alone weighs 353 grams and consists of three iPhone-sized batteries with a total capacity of 35.9 Wh, more than twice the iPhone 15 Pro's 17.3 Wh. The cells themselves weigh 184 grams, surprisingly only about half the weight of the entire battery pack. To get in, we had to soften some of the perimeter adhesive, release a set of single-use metal clips, and remove a large number of Torx screws.

The aluminum battery pack holds three cells connected in series. Each cell is about 3.8 V and 3,166 mAh, for a total pack voltage of roughly 11.3 V.
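As a quick sanity check on those figures, here is a back-of-the-envelope sketch in Python. The 3.8 V nominal cell voltage is our assumption based on typical lithium-ion chemistry, not a number from the teardown:

```python
# Back-of-the-envelope check of the Vision Pro battery pack figures.
# Assumes three identical lithium-ion cells wired in series.
cell_voltage_v = 3.8       # nominal per-cell voltage (typical Li-ion, assumed)
cell_capacity_mah = 3166   # per-cell capacity reported in the teardown
num_cells = 3

pack_voltage_v = cell_voltage_v * num_cells       # series: voltages add
pack_capacity_mah = cell_capacity_mah             # series: capacity stays per-cell
pack_energy_wh = pack_voltage_v * pack_capacity_mah / 1000

print(f"Pack voltage: {pack_voltage_v:.1f} V")    # ~11.4 V
print(f"Pack energy:  {pack_energy_wh:.1f} Wh")   # ~36 Wh, close to the ~35.9 Wh rating
```

The numbers line up: three cells in series at roughly 3.8 V each gives about 11.4 V, and multiplying by the 3,166 mAh capacity lands near the 35.9 Wh total.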

The speakers are secured by two rigid bands attached to the main headset. To release them, you can use our old friend, the SIM card removal tool. The release holes are located inside the side supports of the headset, and the speaker pods detach at a row of electrical contacts, much like Lightning connectors. Parts that are easy to disassemble, using a tool you may already have? We love to see it. It gives us hope that opening up the headset may not be as intimidating as we initially thought.

This modular design is similar to the AirPods Max, which we like so much. Wearables take a beating, so easily replaceable speaker modules make sense. We tried to take it a step further and pry a speaker out of its silicone frame, and immediately snapped the molded cable inside. No matter: you don't need to pry the speaker module open anyway.

Speakers – not as hard to get into as AirPods Pro, but almost.

The speaker itself points at your ear, a very clear sign that you should not wear this anywhere noisy. You can wear AirPods Pro instead if you want, but they must be the latest USB-C version if you want lossless, low-latency audio.

On the left side is the proprietary battery cable connection, which snaps into place with a magnet and then twists to lock. We understand why Apple uses a non-standard connector here, even if we don't like it: at least it won't get yanked out by a passing child, or when the cable inevitably snags on your chair. But the plug on the other end of the cable is unforgivable. Instead of terminating in a USB-C plug, it connects to the battery pack with what looks like a proprietary, oversized Lightning connector, which you can release with a paperclip or a SIM card removal tool.

Light seals and face cushions.

Every face is different, and Apple sells 28 different Light Seal parts to cover the range of face sizes and shapes. If you need ZEISS lens inserts, your seal size changes too. That's because the seal and cushion also position your eyes correctly relative to the stereoscopic screens and eye-tracking sensors. It's why Apple hand-assembles every Vision Pro order: there is no "standard" configuration.

The seal attaches to the headset with magnets, which is peak Apple: it stays put, yet is very easy to replace. This modularity is a brute-force attempt to achieve an ideal facial fit. It will be interesting to see whether this is needed in the long term, or whether future devices will find a simpler way to do it. For now, magnets beat Velcro because they let the seal align precisely. Think of how MagSafe snaps the charger into perfect alignment with your iPhone's inductive charging coil.

EyeSight display.

The front glass display is the defining feature of the Vision Pro and, now that reviews are pouring in, also one of its most controversial.

EyeSight's patent describes three display modes: "Internal Focus", "External Engagement", and "Do Not Disturb". The patent includes pages and pages of imagery that might be shown on the screen: the eyes of various animals, biometric readouts captured by other sensors, even the user's heartbeat as they talk to a loved one. The internal cameras can read emotional states and project imagery based on them.

Calm down, though. In practice, the EyeSight display is so dark and low resolution that reviewers say it's hard to see much on it at all. The Wall Street Journal's Joanna Stern called it "hard to see," and Marques Brownlee (aka MKBHD) said, "When I'm wearing the headset, you can barely see my eyes."

It turns out that when EyeSight shows your eyes, it doesn't show a single image of them; it shows a whole set of images of your eyes. Digging inside the glass enclosure, we found a three-layer front-facing display stack: a widening layer, a lenticular lens layer, and the OLED display itself.

Why do the eyes look so strange?

Apple wanted to achieve something very specific: a 3D animated rendering of the wearer's eyes. To get there, they had to make some strategic design choices and compromises.

The human brain is very sensitive to faces and expressions, which is why the uncanny valley is a thing, and part of that sensitivity is depth perception. Apple needed to create a believable 3D effect. One reason 3D renderings don't look truly 3D is that they lack a stereoscopic effect: for something to look 3D, each eye needs to see a subtly different image. The Vision Pro solves this with a lenticular lens.

Viewed from different angles, a lenticular lens shows different images. You can use this effect to flip between two frames of an action, or to create a stereoscopic 3D effect from images of the same subject taken at slightly different angles.

The Vision Pro puts a lenticular lens layer on top of the external OLED panel. visionOS renders multiple images of the face, call them A and B, slices them into strips, and shows A at the angle your left eye sees and B at the angle your right eye sees. The result is a face with genuine stereoscopic depth. The angles are small and numerous, and it took a look under a microscope for us to really understand what is going on.
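To make the slicing concrete, here is a minimal toy sketch in Python. It is our own illustration, not Apple's rendering pipeline: two renders of the face from slightly different angles are interleaved column by column, so the lenticular ridges can steer alternating columns toward different eyes.

```python
import numpy as np

def interleave_views(view_a: np.ndarray, view_b: np.ndarray) -> np.ndarray:
    """Interleave two equal-sized images column by column.

    Even pixel columns come from view A, odd columns from view B;
    a lenticular lens then directs each set of columns toward a different eye.
    """
    assert view_a.shape == view_b.shape
    out = view_a.copy()
    out[:, 1::2] = view_b[:, 1::2]   # replace odd columns with view B
    return out

# Toy example: two 4x8 single-channel "renders" of the same face
# from slightly different camera angles.
view_a = np.full((4, 8), 10)   # left-eye render
view_b = np.full((4, 8), 20)   # right-eye render
panel = interleave_views(view_a, view_b)
print(panel[0])   # [10 20 10 20 10 20 10 20] -> alternating columns on the panel
```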

The curved ridges of the lenticular lens layer.

This approach comes with a compromise: horizontal resolution drops significantly, because the panel's columns are divided among the interleaved images. For example, if two images share a 2,000-pixel-wide panel, each image gets only 1,000 horizontal pixels. We don't know the display's resolution or how many images are interleaved, but the resolution is bound to decrease. This is the main reason EyeSight's eyes look blurry.
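That arithmetic generalizes to any number of interleaved views. A tiny sketch, where the 2,000-pixel panel width is just the hypothetical figure used above:

```python
def effective_width(panel_width_px: int, num_views: int) -> int:
    """Horizontal pixels left for each view when num_views images
    are interleaved across a single panel."""
    return panel_width_px // num_views

print(effective_width(2000, 2))   # 1000 px per view, as in the example above
print(effective_width(2000, 4))   # 500 px per view if more angles are interleaved
```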

In front of the lenticular layer is another plastic lens layer with similar ridges. This layer appears to stretch the projected image wide enough to span the full width of the Vision Pro. Removing it and booting up the Vision Pro produces some very strange-looking, stretched eyes.

In addition, the lenses may limit the effective viewing angle. Restricting the effect to the area directly in front of the Vision Pro limits the artifacts you would see at extreme angles, a bit like a privacy filter. The downside is that an already complex, blurry image passes through yet another layer of lenses, making it even blurrier and darker.

Vision Pro X-ray view.

When you put on the Vision Pro for the first time, it automatically measures your IPD (interpupillary distance) and motors adjust the position of the lenses. Some users also need prescription lens inserts.

The Apple Store has a machine that, when you come in for a demo, can determine the approximate strength of your prescription glasses. For users with eye conditions that can interfere with eye tracking, such as strabismus, the Vision Pro provides alternative interaction controls under accessibility settings. However, we have heard that people with astigmatism (about 40% of the population) cannot use the prescription lens inserts.

The prescription lens inserts themselves need to be "paired" with the headset. This decision has already caused a terrible user experience: John Gruber's review unit shipped with miscalibrated lenses, which resulted in poor eye-tracking performance. On principle, we hate parts pairing; there has to be a way to calibrate while still allowing third-party lenses.

The motherboard of the Vision Pro.

R1 and M2. The headset runs on the same M2 chip found in Macs, working in tandem with the new R1 chip, which is dedicated to handling input from 12 cameras, the LiDAR sensor, and the TrueDepth camera, all with minimal latency. With AR, you need to get the real-world camera view to the user's eyes as quickly as possible; otherwise their perceived motion won't match what they see, which is the fast track to Vomitsville.

To keep up, the R1 runs a real-time operating system, meaning tasks are guaranteed to complete within a fixed amount of time. Most of our computers run time-sharing operating systems, which schedule tasks dynamically and can cause slowdowns; think of a jittery mouse cursor or a spinning beach ball and you get the idea. That won't do for something as critical as pass-through and object rendering. Any glitch there is like a glitch in the Matrix: jarring at best, nauseating at worst. It could even cause you to fall.
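To make the scheduling difference concrete, here is a minimal fixed-deadline frame loop in Python. It is purely illustrative: visionOS is not written like this, and the 12 ms frame budget is our assumption, not a published figure. The point is that every frame must finish inside a fixed time slot, and an overrun is treated as a hard fault rather than a silent slowdown.

```python
import time

FRAME_BUDGET_S = 0.012   # assumed ~12 ms budget per pass-through frame (~90 Hz)

def process_frame() -> None:
    """Placeholder for camera capture, sensor fusion, and rendering work."""
    time.sleep(0.005)     # pretend the work takes 5 ms

def realtime_loop(num_frames: int = 5) -> None:
    for frame in range(num_frames):
        deadline = time.monotonic() + FRAME_BUDGET_S
        process_frame()
        remaining = deadline - time.monotonic()
        if remaining < 0:
            # In a hard real-time system a missed deadline is a fault,
            # not just a dropped frame the user shrugs off.
            raise RuntimeError(
                f"frame {frame} missed its deadline by {-remaining * 1000:.1f} ms")
        time.sleep(remaining)   # wait out the rest of the fixed time slot

realtime_loop()
```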

The Vision Pro is heavy, the glass is fragile, and the external battery can be annoying. But Apple has managed to pack the power of the Mac, coupled with the performance of a new dedicated AR chip, into a computer that can be worn on your face.

Repairability isn't great, but on the bright side, some of the connections are well done. You should have seen our teardown team jump when they realized the side arms could be ejected with a SIM card removal tool, and the magnetic pads are even more user-friendly.

It feels like Apple fell short of its own standards with the EyeSight screen. It is dim, low resolution, and adds considerable volume, weight, complexity, and expense to the most weight-sensitive part of the headset.

In any case, we are sure it was a difficult call to bring it to market in this form, and perhaps a future, updated Vision Pro will be more satisfying.
