Facial tracking on Quest Pro now includes tongue movements.
In last year's test of the Quest Pro, sticking out your tongue broke the illusion when checking your facial expressions in a virtual mirror, and the lack of tongue movement also limited expressiveness in social VR.
In v60 of its SDK for Unity and native apps, Meta has released a new version of its face-tracking OpenXR extension that detects how far the user's tongue is sticking out.
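Face-tracking APIs like this one typically report each expression as a normalized weight that an app then maps onto an avatar blendshape. The sketch below illustrates that general pattern only; the function, parameter, and key names are hypothetical and are not taken from Meta's SDK.

```python
# Illustrative sketch: consuming a per-frame tongue-out weight from a
# face tracker and mapping it to an avatar blendshape. All identifiers
# here are hypothetical, not actual Meta SDK names.

def apply_tongue_weight(raw_weight: float, max_extension: float = 1.0) -> float:
    """Clamp a raw tracker weight to [0, 1], then scale it to the
    avatar's maximum tongue extension (a hypothetical avatar parameter)."""
    clamped = min(max(raw_weight, 0.0), 1.0)
    return clamped * max_extension

# Example frame: the tracker reports the tongue 40% extended.
frame_weights = {"tongue_out": 0.4}  # illustrative key name
avatar_value = apply_tongue_weight(frame_weights["tongue_out"])
print(avatar_value)  # 0.4
```

Clamping matters in practice: trackers can briefly report out-of-range values, and passing those straight to a blendshape produces visible glitches.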
The Meta Avatars SDK does not yet support tongue detection, but third-party avatar solutions can support it once they update to SDK v60.
ALVR developer Korejan recently demonstrated the new feature using SDK v60 and VRChat.
Steam Link and Virtual Desktop support Quest Pro facial tracking, which can be passed through to VRChat, but both apps need to update to SDK v60 to support tongue tracking.
Don't expect to see other people's tongues in many standalone apps, though. Quest Pro facial tracking has seen hardly any third-party adoption on the Quest Store. Meta's Horizon suite fully supports it, but VRChat's standalone app only supports eye tracking, while Rec Room and Bigscreen still don't support face or eye tracking.
According to reports, this limited adoption may stem from the Quest Pro's poor sales performance. The headset's price dropped from $1,500 to $1,000 just four months after launch, and it has even been given away for free in recent months. Still, the Quest Pro continues to receive software support, including recent mixed reality performance improvements and now tongue tracking.