Passthrough experience?

I have a question for the community, or for Cory: what is the experience with passthrough?

Do the assets work on a Pico 4 standalone?
Or do we need to go for a Quest Pro and use it linked? Or is the super-high-priced Varjo headset the solution?

Thanks in advance,
Greetings, Martn

I got passthrough working with the Quest 2 (obviously a bit low-grade due to its black-and-white, grainy cameras).

Hi, @MartnD


We haven’t tested on any Pico devices ourselves, but we have been able to build APKs that run untethered on the Meta Quest 2 and Quest Pro (see above GIF). Because of the hardware constraints of Android-powered XR devices, we recommend constraining the Depthkit asset to 2048x2048 video and using the Depthkit Studio Lite renderer to make the build more performant.

Setting up passthrough for Quest builds is relatively straightforward, and in our experience, the refresh rate of the passthrough layer seems to be uncoupled from Unity scene framerate, ensuring that the passthrough layer is always silky-smooth.

That helps already, thanks @CoryAllen @Psicon_Lab. I'm currently writing an application to a museum that offers an artistic grant. I developed a concept for their main overall topic, and I would love to use Depthkit Studio for the realization, as I already have the hardware in place. It's possible the museum is against Meta/Facebook devices; we will see. I won't specify the headset, and maybe I'll first try building for the Pico 4; if that doesn't work, I'll switch to the Quest Pro, as I believe they will like its color passthrough more. The Varjo headset, together with the VR PC it requires, would eat up a lot of the potential grant, so I hope to avoid it. I expect to know around mid or end of May whether I get the grant, and then I'll share my process here. I hope it works out. I've wanted to make my own artistic museum experience with Depthkit Studio for ages; even though I love being part of teams from time to time, a project of one's own is good for the soul :wink:


@MartnD I just demonstrated the VIVE XR Elite on Sunday, and its color passthrough is quite impressive! It's also a very comfortable and compact device. It may be a good alternative to the Pico and Quest Pro.

I’m sure @PhilipHan who is in our Depthkit community and works at HTC would be excited to see you using it :slight_smile:


@James, since you have tried the XR Elite, perhaps you can already share some basic information.

As I understand from the product page, the XR Elite is a headset that works both standalone and for PCVR via wireless streaming.

Were you able to see a Depthkit clip baked into a ready-built Unity project running on the XR Elite standalone?

My planned project will feature an entire ensemble of recorded characters playing simultaneously, so I guess I have to use the Studio Lite renderer anyway.

Still, there may be a limit to how many full-body Depthkit recordings can run simultaneously on the XR Elite with the Lite renderer.

Potentially this hasn’t been tested yet.

I learned that the potential go-ahead for the grant would come pretty late, in June, but that gives me time to investigate the challenge of the simultaneously running Lite renderer use case.

Maybe @PhilipHan and I should talk as part of my research and preparation for this planned project.

@MartnD Multiple characters being rendered simultaneously requires some performance optimization to run on untethered devices:

  • You may need to scale some or all of your video assets down to 2048x2048 or even 1024x1024, and update each clip’s metadata accordingly.
  • The Studio Lite renderer will likely be necessary on all clips, but you can experiment with applying the standard Studio renderer on one or two featured characters, and applying Studio Lite to the others if the hardware supports it.
  • For the standard Studio renderer, a lower Volume Density will improve performance at the expense of geometry detail.
  • For the Studio Lite renderer, a lower Perspective Limit will improve performance at the expense of coverage.
  • For either renderer, in the Mesh Source component > Advanced, check the Pause Data Generation When Invisible box to free up resources while the character is not in the viewer’s field of view.
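As a rough sketch of the first bullet above (downscaling the video asset itself; the filenames, codec, and quality settings here are placeholders, not Depthkit's official export settings), a resize with ffmpeg might look like:

```shell
# Downscale a Depthkit clip to 2048x2048 for untethered headsets.
# Input/output names are placeholders; adjust the target size
# (e.g. 1024:1024) and CRF quality to taste.
ffmpeg -i depthkit_clip_4096.mp4 \
  -vf scale=2048:2048 \
  -c:v libx264 -pix_fmt yuv420p -crf 18 \
  depthkit_clip_2048.mp4
```

Note that this only handles the video file; as the bullet says, each clip's accompanying Depthkit metadata also needs to be updated to match the new resolution.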

Fantastic @CoryAllen, as always very precise instructions! You are the beating heart of everything Depthkit. How would it be if, budget allowing, I added Arcturus to the pipeline? For example, using the mesh exporter workflow with PLY and texture coordinates, importing into HoloEdit, and then exporting an optimized version with their existing presets. I really want to impress the museum with the artistic project and the achieved quality, to prompt them to ask me later to do more and bigger things for them.