We had some problems with the shine of some musical instruments on our last production. For example, the double bass had holes because the color of the green screen was reflected there. Is this problem solved by not needing a green screen anymore? What is your experience with the varnish on the double bass in the example you - James - sent me?
The gloss on the trombone also caused some problems, regardless of the green screen. I had the impression that the Kinect depth sensor could not work correctly because of the glossy surface.
Thin surfaces with small details and complex occlusion are difficult to reconstruct. Instruments often have very thin details, like the neck of the bass or a trumpet's woven tubing. These details are often much smaller than the rated error levels of the Azure Kinect's sensors, or than the density of the reconstruction volume once captured, so they turn into noise or disappear entirely. If these details are tightly packed, they will often occlude one another from the camera's view, leading to further difficulties.
Shiny surfaces reflect the infrared lasers used for depth sensing. The Azure Kinect uses active infrared lasers emitted from the cameras to sense the surface. These light signals work best when hitting matte surfaces, and get disrupted when hitting reflective or transparent surfaces that scatter the light away, or very dark surfaces that absorb it before it can return to the camera. In these cases, the depth data streams will simply read as blank, further challenging the reconstruction.
More cameras are better with props like instruments. When shooting subjects with instruments, we strongly recommend increasing the number of cameras from 6 to 10. This helps mitigate both issues above. First, shiny surfaces often read differently when seen from different angles, so capturing the subject from a greater variety of angles gives a better chance of recovering the instrument. Second, more cameras help avoid occlusions, allowing smaller details to be captured.
Experiment with surface treatments. We've had a lot of success using matte sprays like dry shampoo on difficult materials such as instruments and human hair. We touch on this in our wardrobe recommendations.
We recommend doing some low-stakes practical tests with the instruments prior to capturing a critical performance. Having a chance to try different camera placements, matte treatments, etc. can significantly increase the quality of the final production.
Hope this is helpful! We'd love to hear from any community members who have also worked with musicians and instruments about what has worked for them.
Like always, very detailed… very nice… especially the point about the infrared light issue in combination with shiny objects (I had terrible results with that in the past during intensive tests), and the matte spray solution is a film classic but always good to get reminded of…
Did you ever consider doing tests with two Depthkit Studio setups, each computer with ten cameras…
as you mentioned gearing up in some cases from 6 to 10,
and then having a chance to load the files from two different machines to get an even higher density… they could be linked and synced over the network in a master/slave setup, or synced with an external sync device…
would be interesting to see results of 20 cams or 30 cams …
Maybe the slave computer doesn't even need the high-end graphics card, since for example the USB speed is enough to record without visualization on the slave machine… it could all run over a 2.5 Gb network connection, or a 10 Gb connection if needed… this could maybe be something for the future roadmap as a new crazy rare extra feature.
I am aware that here again the infrared light can become a problem, but it would at least be interesting to test… and maybe a solution could be to de-sync the sensors minimally, using the gaps in the firing frequency as a useful helper to avoid one sensor blinding another…
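For what it's worth, that de-sync idea is essentially what the Azure Kinect SDK exposes through its subordinate delay setting: each subordinate device can fire its depth laser slightly after the master, so the sensors interleave rather than blind each other. Here's a rough sketch of scheduling those offsets. The 160 µs minimum stagger reflects Microsoft's guidance for synced Azure Kinects; the helper function and names are just my own illustration, not Depthkit's API:

```python
# Sketch: stagger depth-laser firings across synced Azure Kinects so the
# IR pulses don't interfere. Each subordinate's offset would be written to
# subordinate_delay_off_master_usec in its device configuration.

MIN_STAGGER_USEC = 160       # assumed minimum spacing between laser firings
FRAME_PERIOD_USEC = 33_333   # one frame period at 30 fps

def subordinate_delays(num_cameras, stagger=MIN_STAGGER_USEC):
    """Return a per-camera delay in microseconds; camera 0 is the master."""
    delays = [i * stagger for i in range(num_cameras)]
    if delays[-1] >= FRAME_PERIOD_USEC:
        raise ValueError("offsets exceed one frame period; "
                         "reduce cameras or stagger")
    return delays

print(subordinate_delays(10))
# master fires at 0; subordinates at 160, 320, ... up to 1440 us
```

In a two-machine setup like the one described above, both machines' cameras would need distinct offsets from the one shared master clock, whether that clock is daisy-chained over the sync cables or driven by an external sync generator.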