Meta has finally unveiled the much-anticipated Passthrough Camera API for Quest, a significant step that grants developers direct access to the headset’s passthrough RGB cameras for the first time. The change opens the door to far more immersive and interactive mixed reality experiences on Quest.
Before this release, developers were limited to the passthrough functionality Meta built into Quest itself: apps could display the camera view as a background but couldn’t access or process the feed. Meta first announced the Passthrough Camera API at its Connect event in September, though it didn’t give a timeline at the time.
Now, with v74 of the Meta XR Core SDK, the API is available in Public Experimental form. It grants access specifically to the forward-facing RGB cameras of Quest 3 and Quest 3S.
The potential of this API is vast. With direct access to the passthrough camera feed, developers can better match lighting and visual effects in their mixed reality applications to the user’s real surroundings. They can also run machine learning and computer vision on the feed, enabling object recognition that gives mixed reality content a much clearer picture of the user’s environment.
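To make the lighting use case concrete, here’s a minimal sketch (not Meta’s API; the frame below is simulated) of estimating average scene luminance from an RGB camera frame, a value an app could use to dim or brighten virtual objects so they blend with the room:

```python
def estimate_scene_luminance(frame):
    """Average relative luminance of an RGB frame, normalized to 0.0-1.0.

    Uses the standard Rec. 709 luma weights. `frame` is a list of
    (r, g, b) tuples with 0-255 channels; a real app would read these
    pixels from the passthrough camera feed instead.
    """
    if not frame:
        return 0.0
    total = sum(0.2126 * r + 0.7152 * g + 0.0722 * b for r, g, b in frame)
    return total / (len(frame) * 255.0)

# Simulated 2x2 "frame": two black pixels, two white pixels.
frame = [(0, 0, 0), (0, 0, 0), (255, 255, 255), (255, 255, 255)]
brightness = estimate_scene_luminance(frame)  # 0.5: half the pixels are fully bright
```

In practice an app would sample a downscaled frame rather than every pixel, but the principle is the same: the raw feed turns ambient lighting from a guess into a measurement.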
When the API was first announced, then-Meta VP of VR/AR Mark Rabkin highlighted its potential, saying it would pave the way for “cutting-edge MR experiences,” including advanced object tracking, AI capabilities, intricate overlays, and deeper scene understanding.
This is the first time the API is widely accessible, though early versions were shared with select partners like Niantic Labs, Creature, and Resolution Games. These partners are presenting today at GDC 2025, where they’ll discuss their experiences and insights in a Meta session titled ‘Merge Realities, Multiply Wonder: Expert Guidance on Mixed Reality Development.’
It’s important to note that because this feature is experimental, developers can’t yet publish apps that use the Passthrough Camera API. Nonetheless, Meta appears to be continuing its gradual rollout approach, with iterative updates likely leading up to a full release.
In addition to the new API, the v74 update brings several other notable features. New microgestures enable thumb-based controls such as taps and swipes, an Immersive Debugger lets developers inspect the Scene Hierarchy directly from inside the headset, and new foundational elements add options for friends matchmaking and local matchmaking, rounding out a more robust developer toolkit.