Femto Bolt with Azure Kinect

Would Depthkit Studio still function, without modification, if I synchronised an Azure Kinect in a star configuration alongside several Femto Bolts using the Femto Bolt multi-sensor sync system?

If you think it might, I'll try it, and then start adding Azure Kinects set up in the same way.

I would use the Cat5-to-audio sync cables used for the Azure Kinect.

Also, if I had a PC otherwise conforming to your specs that could cope with more than 10 sensors, would Depthkit Studio support it?

@Terence Currently, Depthkit Studio works with up to 10 sensors of the same type at a time, which means it's not possible to capture with both Femto Bolts and Azure Kinects simultaneously on the same computer, as mentioned in the sensor section of the Depthkit Studio documentation. This is primarily due to the differing sync systems between the two sensor types: Microsoft has published its sync signal specifications (enabling external-sync Depthkit Studio workflows), but the sync protocol of the Femto Bolt sensors is not published and is instead managed by Orbbec's Sync Hub Pro hardware.
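As a rough illustration of the external-sync workflow that the published spec enables, here is a minimal Python sketch of how a multi-Azure-Kinect rig is typically planned: one master, the rest subordinates. The string constants mirror the Azure Kinect SDK's wired-sync modes; the `build_sync_plan` helper itself is hypothetical and not part of any SDK or of Depthkit.

```python
# Sketch of a multi-sensor sync plan. The mode names mirror the Azure
# Kinect SDK's k4a_device_configuration_t wired-sync enum values; the
# helper below is illustrative only.

MASTER = "K4A_WIRED_SYNC_MODE_MASTER"
SUBORDINATE = "K4A_WIRED_SYNC_MODE_SUBORDINATE"

def build_sync_plan(num_devices):
    """Return one config dict per device: one master, the rest subordinates."""
    if not 1 <= num_devices <= 10:
        raise ValueError("Depthkit Studio supports up to 10 sensors of one type")
    plan = []
    for i in range(num_devices):
        plan.append({
            "device_index": i,
            "wired_sync_mode": MASTER if i == 0 else SUBORDINATE,
            # Subordinates receive the master's sync pulse over the 3.5 mm
            # sync cabling; per-device timing offsets are applied separately.
        })
    return plan
```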

Additionally, Depthkit's 10-sensor limit was established by Microsoft due to the physics of time-of-flight sensors: even when coordinated, only so many sensors can fire their emitters within a single frame's slice of time before those emitters start to interfere with the depth readings of other sensors. For more in-depth information, see Microsoft's documentation on the subject. The Femto Bolt sensors use identical time-of-flight depth cameras, so they are subject to the same limit. If you set up two separate Depthkit systems in the same area, each using either type of sensor, you can see this interference manifest as flashing/flickering in the depth data, so we recommend capturing with all sensors synced to avoid it.
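The timing constraint can be sketched numerically. To my understanding of Microsoft's multi-device guidance, the depth laser is on for roughly 125 µs per capture and subordinate captures should be staggered in 160 µs steps; the exact figures used below are assumptions taken from that guidance, and the helper functions are illustrative only:

```python
# Sketch: why only so many time-of-flight sensors fit in one frame
# without interfering. Figures (assumed from Microsoft's multi-device
# sync guidance): laser on ~125 us per capture, stagger in 160 us steps.

LASER_ON_USEC = 125
STAGGER_STEP_USEC = 160

def subordinate_offsets(num_subordinates):
    """Per-subordinate capture delays (us) off the master, in 160 us steps."""
    return [STAGGER_STEP_USEC * (i + 1) for i in range(num_subordinates)]

def slots_overlap(offsets):
    """True if any two laser-on windows overlap (seen as depth flicker)."""
    windows = sorted([(0, LASER_ON_USEC)] +
                     [(o, o + LASER_ON_USEC) for o in offsets])
    return any(a_end > b_start
               for (_, a_end), (b_start, _) in zip(windows, windows[1:]))
```

With nine staggered subordinates plus a master, no laser windows collide, whereas an unsynced sensor firing at an arbitrary offset easily overlaps another sensor's window.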

Thanks Cory. I understand. It helps me explain these limitations in my dissertation.


Cory has explained it more than well, as always! Anyway, I have experience with 12 sensors running over two machines. The discontinued software EFEVE had a master/slave-computer function: you could run 10 Azure Kinects on each system and set an "off-time sync" between the machines. After recording on the two systems, the files needed to be copied to one machine, and then EFEVE's matching algorithm did its job.
I have to admit the matching was nowhere near as good as in Depthkit, but with the right code and algorithm approach, working with more than 10 sensors would be possible … if this ever becomes important enough for Depthkit's roadmap.
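The matching step described above can be sketched as a nearest-timestamp pairing: after copying both machines' recordings to one PC, each frame from rig A is paired with the closest-in-time frame from rig B. This is a guess at the general idea, not EFEVE's actual algorithm; the function, timestamps, and tolerance are all illustrative.

```python
# Hypothetical sketch of cross-machine frame matching by timestamp.
# Not EFEVE's actual algorithm; tolerance default is ~half a 30 fps
# frame period, which suits an "off-time sync" between two rigs.

from bisect import bisect_left

def match_frames(ts_a, ts_b, tolerance_usec=17000):
    """For each timestamp in ts_a, find the nearest in sorted ts_b,
    or None if nothing falls within the tolerance."""
    pairs = []
    for t in ts_a:
        i = bisect_left(ts_b, t)
        candidates = [c for c in (i - 1, i) if 0 <= c < len(ts_b)]
        best = min(candidates, key=lambda c: abs(ts_b[c] - t), default=None)
        if best is not None and abs(ts_b[best] - t) <= tolerance_usec:
            pairs.append((t, ts_b[best]))
        else:
            pairs.append((t, None))
    return pairs
```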

Greetings Martin

Thanks Martin. I will contact you.
Best wishes

Do you not get the interference Cory referred to in his response?