QuestCameraKit is a collection of template and reference projects demonstrating how to use Meta Quest’s new Passthrough Camera API (PCA)
for advanced AR/VR vision, tracking, and shader effects.
- Overview
- Getting Started with PCA
- Running the Samples
- General Troubleshooting & Known Issues
- Acknowledgements & Credits
- Community Contributions
- License
- Contact
- Purpose: Convert a 3D point in space to its corresponding 2D image pixel.
- Description: This sample shows the mapping between 3D space and 2D image coordinates using the Passthrough Camera API. We use MRUK's EnvironmentRaycastManager to determine a 3D point in our environment and map it to the corresponding location on our WebCamTexture. We then sample the pixel at that point to determine the color of a real-world object.
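At its core, this mapping is a standard pinhole projection using the camera intrinsics that PCA exposes. Below is a minimal, language-agnostic Python sketch of the idea; the actual sample is Unity C#, and the intrinsic values here are made up for illustration:

```python
def world_to_pixel(point_cam, fx, fy, cx, cy):
    """Project a 3D point in camera space (meters, +Z forward) to pixel coordinates
    using pinhole intrinsics: focal lengths (fx, fy) and principal point (cx, cy)."""
    x, y, z = point_cam
    if z <= 0:
        return None  # point is behind the camera, no valid projection
    u = fx * x / z + cx
    v = fy * y / z + cy
    return (u, v)

# Hypothetical intrinsics for a 1280x960 frame (not the real PCA values)
pixel = world_to_pixel((0.1, -0.05, 1.0), fx=1000.0, fy=1000.0, cx=640.0, cy=480.0)
# pixel is the (u, v) location where you would sample the WebCamTexture
```

In the sample itself, the hit point from the EnvironmentRaycastManager plays the role of `point_cam` after being transformed into the camera's coordinate frame.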
- Purpose: Convert 2D screen coordinates into their corresponding 3D points in space.
- Description: Use the Unity Sentis framework to run inference with different ML models to detect and track objects. Learn how to convert detected image coordinates (e.g. bounding boxes) back into 3D points for dynamic interaction within your scenes. This sample also shows how to filter labels; for example, you can detect only humans and pets to create a safer play area for your VR game. The sample video below is filtered to monitor, person, and laptop. The sample runs at around 60 fps.
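Going from a detected pixel back to a 3D point is the inverse of the pinhole projection once a depth value is available (in this project, depth comes from raycasting against the environment). A Python sketch of the math, with hypothetical intrinsics and bounding-box values:

```python
import numpy as np

def pixel_to_point(u, v, depth, fx, fy, cx, cy):
    """Invert the pinhole projection: a pixel plus a depth (meters along +Z)
    yields a 3D point in camera space."""
    x = (u - cx) / fx * depth
    y = (v - cy) / fy * depth
    return np.array([x, y, depth])

# Center of a detector bounding box (x_min, y_min, x_max, y_max); values made up
x_min, y_min, x_max, y_max = 600.0, 380.0, 880.0, 480.0
u, v = (x_min + x_max) / 2, (y_min + y_max) / 2

point = pixel_to_point(u, v, depth=1.0, fx=1000.0, fy=1000.0, cx=640.0, cy=480.0)
```

The resulting camera-space point would then be transformed by the camera pose to place the bounding box in world space.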
1. 🎨 Color Picker | 2. 🍎 Object Detection |
---|---|
- Purpose: Detect and track QR codes in real time. Open webviews or log-in to 3rd party services with ease.
- Description: Similar to the object detection sample, this sample gets QR code coordinates and projects them into 3D space. Detect QR codes and call their URLs. You can select between a multiple or single QR code mode. The sample runs at around 70 fps for multiple QR codes and a stable 72 fps for a single code.
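Once the four QR corner pixels have been projected into 3D (the same unprojection idea used for object detection), a marker transform can be derived from the corner points. A sketch of one simple approach, computing the center and plane normal of the corner quad; this illustrates the geometry, not necessarily how the sample implements it:

```python
import numpy as np

def marker_pose(corners3d):
    """Given four (roughly planar) 3D corner points in order around the quad,
    return the marker center and a unit normal of its plane."""
    center = np.mean(corners3d, axis=0)
    # Two edges sharing corner 0 span the plane; their cross product is normal to it
    normal = np.cross(corners3d[1] - corners3d[0], corners3d[3] - corners3d[0])
    return center, normal / np.linalg.norm(normal)

# Hypothetical 10 cm QR code, 1 m in front of the camera, facing it
corners = np.array([[0.0, 0.0, 1.0],
                    [0.1, 0.0, 1.0],
                    [0.1, 0.1, 1.0],
                    [0.0, 0.1, 1.0]])
center, normal = marker_pose(corners)
```

The center anchors the 3D marker; the normal orients the label so it faces out of the code's plane.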
- Purpose: Apply a custom frosted glass shader effect to virtual surfaces.
- Description: A shader which takes our camera feed as input to blur the content behind it.
Todo: We have a shader that correctly maps the camera texture onto a quad, plus one vertical blur shader and one horizontal blur shader. Ideally, these would be combined into a single shader effect that can easily be applied to meshes or UI elements.
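The reason for a horizontal pass plus a vertical pass is that a Gaussian-style blur is separable: two 1D convolutions produce the same result as one full 2D convolution at far lower cost. A small NumPy sketch demonstrating the equivalence (the actual effect lives in shaders, not Python):

```python
import numpy as np

kernel = np.array([0.25, 0.5, 0.25])  # small symmetric 1D blur kernel

def blur_h(img, k):
    """Blur each row with a 1D kernel (zero padding at the edges)."""
    return np.array([np.convolve(row, k, mode="same") for row in img])

def blur_v(img, k):
    """Blur each column by transposing, blurring rows, transposing back."""
    return blur_h(img.T, k).T

def blur_2d(img, k):
    """Reference: one full 2D convolution with the outer-product kernel."""
    k2d = np.outer(k, k)
    pad = len(k) // 2
    padded = np.pad(img, pad)  # zero padding, matching mode="same" above
    out = np.zeros_like(img, dtype=float)
    h, w = img.shape
    for dy in range(k2d.shape[0]):
        for dx in range(k2d.shape[1]):
            out += k2d[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return out

img = np.random.rand(16, 16)
separable = blur_v(blur_h(img, kernel), kernel)  # horizontal pass, then vertical
assert np.allclose(separable, blur_2d(img, kernel))
```

For an NxN kernel, the separable form needs 2N texture samples per pixel instead of N², which is why blur shaders are almost always written as two passes.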
3. 📱 QR Code Tracking | 4. 🪟 Frosted Glass |
---|---|
- Purpose: Ask OpenAI's vision model (or any other multi-modal LLM) for context of your current scene.
- Description: We use the OpenAI Speech-to-Text API to create a command. We then send this command, together with a screenshot, to the vision model. Lastly, we take the response and use the Text-to-Speech API to turn the response text into an audio file in Unity, which speaks the response. The user can select different speakers, models, and speeds. For the command, we can add additional instructions for the model, as well as select image, image & text, or text-only mode. The whole loop takes anywhere from 2-6 seconds, depending on the internet connection.
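The vision step boils down to one multimodal chat request containing the transcribed command and a base64-encoded screenshot. A Python sketch of assembling such a payload in the Chat Completions format; the model name, helper function, and byte values are illustrative, not taken from the sample:

```python
import base64

def build_vision_request(instruction, command_text, image_bytes=None, model="gpt-4o"):
    """Assemble a Chat Completions payload mixing text and an optional image,
    encoded as a data URL in an image_url content part."""
    content = [{"type": "text", "text": command_text}]
    if image_bytes is not None:
        b64 = base64.b64encode(image_bytes).decode("ascii")
        content.append({
            "type": "image_url",
            "image_url": {"url": f"data:image/jpeg;base64,{b64}"},
        })
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": instruction},   # extra instructions for the model
            {"role": "user", "content": content},          # command text (+ screenshot)
        ],
    }

req = build_vision_request("Describe the scene briefly.",
                           "What am I looking at?",
                           image_bytes=b"\xff\xd8not-a-real-jpeg")
```

Omitting `image_bytes` gives the text-only mode described above; the sample's image-only mode would use a fixed prompt with the screenshot.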
- Purpose: Stream the Passthrough Camera stream over WebRTC to another client using WebSockets.
- Description: This sample uses SimpleWebRTC, a Unity-based WebRTC wrapper that facilitates peer-to-peer audio, video, and data communication over WebRTC using Unity's WebRTC package. It leverages NativeWebSocket for signaling and supports both video and audio streaming. You will need to set up your own websocket signaling server beforehand, either online or on your LAN. You can find more information about the necessary steps here.
6. 🎥 WebRTC video streaming |
---|
Information | Details |
---|---|
Device Requirements | - Meta Quest 3 and 3S only - HorizonOS v74 or later |
Unity WebcamTexture | - Access through Unity's `WebCamTexture` - Only one camera at a time (left or right), a Unity limitation |
Android Camera2 API | - Unobstructed forward-facing RGB cameras - Provides camera intrinsics (camera ID, height, width, lens translation & rotation) - Android Manifest permission: `horizonos.permission.HEADSET_CAMERA` |
Public Experimental | Apps using PCA cannot yet be submitted to the Meta Horizon Store. |
Specifications | - Frame rate: 30 fps - Image latency: 40-60 ms - Available resolutions per eye: `320x240`, `640x480`, `800x600`, `1280x960` |
- Meta Quest Device: Ensure you are running on a `Quest 3` or `Quest 3S` and your device is updated to `HorizonOS v74` or later.
- Unity: `Unity 6` is recommended. Also runs on Unity `2022.3 LTS`.
- The Camera Passthrough API does not work in the Editor or XR Simulator.
- Get more information from the Meta Quest Developer Documentation
Caution
Every feature that accesses the cameras has a significant impact on your application's performance. Be aware of this, and ask yourself whether the feature you are trying to implement could be achieved any other way besides using the cameras.
1. Clone the Repository: `git clone https://github.com/xrdevrob/QuestCameraKit.git`
2. Open the Project in Unity: Launch Unity and open the cloned project folder.
3. Configure Dependencies: Follow the instructions in the section below to run one of the samples.
1. Color Picker
- Open the `ColorPicker` scene.
- Build the scene and run the APK on your headset.
- Aim the ray at a surface in your real space and press the A button or pinch your fingers to watch the cube change its color to the color of your real environment.
- Open the `ObjectDetection` scene.
- You will need Unity Sentis for this project to run ([email protected]).
- Select the labels you would like to track. No label means all objects will be tracked.
Show all available labels
person bicycle car motorbike aeroplane bus train truck boat traffic light fire hydrant stop sign parking meter bench bird cat dog horse sheep cow elephant bear zebra giraffe backpack umbrella handbag tie suitcase frisbee skis snowboard sports ball kite baseball bat baseball glove skateboard surfboard tennis racket bottle wine glass cup fork knife spoon bowl banana apple sandwich orange broccoli carrot hot dog pizza donut cake chair sofa pottedplant bed diningtable toilet tvmonitor laptop mouse remote keyboard cell phone microwave oven toaster sink refrigerator book clock vase scissors teddy bear hair drier toothbrush
- Build the scene and run the APK on your headset. Look around your room and see how tracked objects receive a bounding box in accurate 3D space.
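Label filtering, as configured above, amounts to an allow-list check on each detection before it is drawn. A minimal Python sketch of the logic (field names are hypothetical; the sample itself is Unity C#):

```python
def filter_detections(detections, allowed_labels):
    """Keep only detections whose label is on the allow-list.
    An empty allow-list means no filtering: everything is kept."""
    if not allowed_labels:
        return detections
    allowed = {label.lower() for label in allowed_labels}
    return [d for d in detections if d["label"].lower() in allowed]

# Hypothetical detector output: label plus confidence score
detections = [
    {"label": "person", "score": 0.91},
    {"label": "chair", "score": 0.74},
    {"label": "laptop", "score": 0.66},
]
kept = filter_detections(detections, ["person", "laptop"])  # chair is dropped
```

The same check could also gate gameplay logic, e.g. shrinking the play area when a person or pet is detected.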
- Open the `QRCodeTracking` scene to test real-time QR code detection and tracking.
- Install NuGet for Unity.
- Click on the `NuGet` menu and then on `Manage NuGet Packages`. Search for the ZXing.Net package from Michael Jahn and install it.
- Make sure that in your `Player Settings` under `Scripting Define Symbols` you see `ZXING_ENABLED`. The `ZXingDefineSymbolChecker` class should automatically detect whether `ZXing.Net` is installed and add the symbol.
- In order to see the label of your QR code, you will also need to install TextMeshPro.
- Build the scene and run the APK on your headset. Look at a QR code to see the marker in 3D space and the URL of the QR code.
- Open the `FrostedGlass` scene.
- Make sure the `Opaque Texture` checkbox is checked in your render pipeline asset.
- Build the scene and run the APK on your headset.
- Look at the panel from different angles and observe how objects behind it are blurred.
Warning
The Meta Project Setup Tool (PST) will show a warning telling you to uncheck the `Opaque Texture` option; do not fix this warning.
- Open the `ImageLLM` scene.
- Make sure to create an API key and enter it in the `OpenAI Manager` prefab.
- Select your desired model and optionally give the LLM some instructions.
- Make sure your headset is connected to the internet (the faster the better).
- Build the scene and run the APK on your headset.
Note
File uploads are currently limited to `25 MB`, and the following input file types are supported: `mp3`, `mp4`, `mpeg`, `mpga`, `m4a`, `wav`, and `webm`.
You can send commands and receive results in any of these languages:
Show all supported languages
Afrikaans | Arabic | Armenian | Azerbaijani | Belarusian | Bosnian | Bulgarian | Catalan | Chinese |
Croatian | Czech | Danish | Dutch | English | Estonian | Finnish | French | Galician |
German | Greek | Hebrew | Hindi | Hungarian | Icelandic | Indonesian | Italian | Japanese |
Kannada | Kazakh | Korean | Latvian | Lithuanian | Macedonian | Malay | Marathi | Maori |
Nepali | Norwegian | Persian | Polish | Portuguese | Romanian | Russian | Serbian | Slovak |
Slovenian | Spanish | Swahili | Swedish | Tagalog | Tamil | Thai | Turkish | Ukrainian |
Urdu | Vietnamese | Welsh |
- Open the `WebcamToWebRTC` scene.
- Link up your signaling server on the `Client-STUNConnection` component in the `Web Socket Server Address` field.
- Build and deploy the `WebRTC-Quest` scene to your Quest 3 device.
- Open the `WebRTC-SingleClient` scene in your Editor.
- Build and deploy the `WebRTC-SingleClient` scene on another device, or start it in the Unity Editor. More information can be found here.
- Start the WebRTC app on your Quest and on your other devices. The Quest and the client streaming devices should connect automatically to the websocket signaling server.
- Perform the Start gesture with your left hand, or press the menu button on your left controller, to start streaming from your Quest 3 to your WebRTC client app.
Troubleshooting:
- If there are compiler errors, make sure all packages were imported correctly.
- Open the `Package Manager` and click on the + sign in the upper left/right corner.
- Select "Add package from git URL".
- Enter the URL https://github.com/endel/NativeWebSocket.git#upm and click on Install.
- After the installation has finished, click on the + sign in the upper left/right corner again.
- Enter the URL https://github.com/FireDragonGameStudio/SimpleWebRTC.git?path=/Assets/SimpleWebRTC#upm and click on Install.
- Make sure your own websocket signaling server is up and running. You can find more information about the necessary steps here.
- If you're going to stream over LAN, make sure the `STUN Server Address` field on `[BuildingBlock] Camera Rig/TrackingSpace/CenterEyeAnchor/Client-STUNConnection` is empty; otherwise leave the default value.
- Make sure to enable the `Web Socket Connection active` flag on `[BuildingBlock] Camera Rig/TrackingSpace/CenterEyeAnchor/Client-STUNConnection` to connect to the websocket server automatically on start.
- WebRTC video streaming does NOT work when the Graphics API is set to Vulkan. Make sure to switch to OpenGLES3 under `Project Settings/Player`.
- Make sure to DISABLE the Low Overhead Mode (GLES) setting for Android in `Project Settings/XR Plug-In Management/Oculus`. Otherwise, this optimization will prevent your Quest from sending the video stream to a receiving client.
Warning
The Meta Project Setup Tool (PST) will show two warnings (opaque textures and Low Overhead Mode GLES). Do NOT fix these warnings.
- Some users have reported that the app crashes the second and every following time it is opened. One reported solution is to go to the Quest settings under `Privacy & Security`, toggle the camera permission, then start the app and accept the permission again. If you encounter this problem, please open an issue and send me the crash logs. Thank you!
- If switching between Unity 6 and other versions such as 2023 or 2022, your Android Manifest may get modified and the app won't run anymore. Should this happen to you, go to `Meta > Tools > Update AndroidManifest.xml` or `Meta > Tools > Create store-compatible AndroidManifest.xml`. Afterwards, make sure you manually add the `horizonos.permission.HEADSET_CAMERA` permission back into your manifest file.
- Meta, for the Passthrough Camera API and the Passthrough Camera API Samples.
- Thanks to shader wizard Daniel Ilett for helping me set up the `FrostedGlass` sample.
- Thanks to Michael Jahn for the ZXing.Net library used for the QR code tracking samples.
- Thanks to Julian Triveri for constantly pushing the boundaries of what is possible with Meta Quest hardware and software.
- Tutorials
- XR Dev Rob - XR AI Tutorials, Watch on YouTube
- Dilmer Valecillos, Watch on YouTube
- Skarredghost, Watch on YouTube
- FireDragonGameStudio, Watch on YouTube
- xr masiso, Watch on YouTube
- Urals Technologies, Watch on YouTube
- Object Detection
- Shaders
- Environment Understanding & Mapping
- Environment Sampling
- Image to 3D
- Image to Image, Diffusion & Generation
- Video recording and replay
- OpenCV for Unity
- QR Code Tracking
This project is licensed under the MIT License. See the LICENSE file for details. Feel free to use the samples in your own projects, though I would appreciate it if you credited this repo in your work ❤️
For questions, suggestions, or feedback, please open an issue in the repository or contact me on X, LinkedIn, or at [email protected]. Find all my info here or join our growing XR developer community on Discord.
Happy coding and enjoy exploring the possibilities with QuestCameraKit!