We had some problems with the shine of some musical instruments on our last production. For example, the double bass had holes in the reconstruction because the green of the screen reflected off its surface. Is this problem solved by not needing a green screen anymore? What is your experience with the varnish on the double bass in the example you - James - sent me?
The gloss on the trombone also caused some problems, regardless of the green screen. I had the impression that the Kinect depth sensor could not work correctly because of the glossy surface.
The dynamic range of the Azure sensors is not very wide.
I mean your darkest shadow and your brightest highlight can't be too far apart.
Of course you can go the route of pairing a cinema camera with each sensor, but that is more work and more budget.
What you can do is:
- NOT shoot in a green room, or light it in a very special VFX way so the bounce doesn't hit the performers, though even then you may get green reflections
- and NOT shoot in a totally white room, as the white is then reflected everywhere and doesn't look nice in skin areas either
=> … a darker recording area with a lot of indirect light from above, at enough distance that the light has the same brightness from head to toe,
would help you illuminate the recording area with NO highlight reflections in your shiny instruments
but like always this mostly needs a bit more budget …
Always good: even though volumetric shooting is somewhat different from traditional cinema shooting, having an experienced DOP with you still always helps.
→ Surprisingly, I am a German "Diplom"-trained DOP who has been focusing exclusively on volumetric recording for the past three years.
And your name is maybe from Germany … ? …
It would be nice to finally meet some other volumetric creators who are not from the monster studios …
Adding a little to what @MartnD shared, with specific considerations and guidance on shooting with instruments:
First, for the sake of other readers on the forums, here is the double-bass example you are referencing. This was captured with 6 cameras during our Museum of the Moving Image "Tales of the Holoverse" project, then processed to WebXR using Arcturus HoloEdit and HoloStream.
The Problems
Thin surfaces with small details and complex occlusion are difficult to reconstruct. Instruments often have very thin details, like the neck of the bass or a trumpet's woven tubes. These details are often much smaller than the rated error levels of the Azure Kinect's sensors, or finer than the density of the reconstruction volume once captured, leading to them turning into noise or disappearing entirely. If these details are tightly packed, they will often occlude one another from the camera's view, leading to further difficulties.
Shiny surfaces reflect the infrared lasers used for depth sensing. The Azure Kinect uses active lasers emitted from the cameras to sense the surface. These light signals work best when hitting matte surfaces, and get disrupted by reflective or transparent surfaces, or by very dark surfaces that absorb the light and prevent it from returning to the camera. In these cases, the depth data will simply read as blank, further challenging the reconstruction.
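As a concrete illustration of that "blank" depth data (a hypothetical numpy sketch with a synthetic frame, not Depthkit's actual pipeline): Azure Kinect depth frames report lost IR returns as a depth value of 0, so you can quantify the dropout a glossy surface causes directly:

```python
import numpy as np

# Hypothetical 576x640 depth frame in millimeters (NFOV-unbinned resolution);
# in a real pipeline this array would come from the sensor SDK.
depth_mm = np.full((576, 640), 1500, dtype=np.uint16)

# Simulate a glossy region where the IR return was lost: the sensor reports 0.
depth_mm[200:260, 300:400] = 0

# Invalid pixels read as exactly 0 depth.
invalid = depth_mm == 0
dropout_pct = 100.0 * invalid.mean()
print(f"depth dropout: {dropout_pct:.1f}% of pixels")  # prints "depth dropout: 1.6% of pixels"
```

In practice you would run this on real captures of the instrument during a lighting test; if the dropout percentage spikes when a shiny bell rotates toward a camera, that is the interference MartnD and the post above describe.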
The Solutions
More cameras are better with props like instruments. When shooting subjects with instruments, we strongly recommend increasing the number of cameras from 6 to 10. This helps mitigate both issues above. First, shiny surfaces often read differently when seen from different angles, so capturing the subject from a greater variety of angles gives a better chance of recovering the instrument. Second, more cameras help avoid occlusions, allowing smaller details to be captured.
Experiment with surface treatments. We've had a lot of success using a matte spray, like dry shampoo, on difficult materials such as instruments and human hair. We touch on this in our wardrobe recommendations.
Test!!
We recommend doing some low-stakes practical tests with the instruments prior to capturing a critical performance. Having a chance to try different camera placements, matte treatments, etc., can significantly increase the quality of the final production.
Hope this is helpful! We'd love to hear from any community members who have also worked with musicians and instruments about what has worked for them.
Like always, very detailed … very nice … especially the point about the infrared light issue in combination with shiny objects (I had terrible results during intensive tests in the past), and the matte spray solution is a film classic, but it's always good to be reminded …
Did you ever consider doing tests with Depthkit Studio running twice, each computer with ten cameras?
As you mentioned gearing up in some cases from 6 to 10,
you would then have a chance to load the files from two different machines for an even higher density … the machines could be linked and synced via a network connection in a master/slave setup, or synced with an external sync device …
It would be interesting to see results from 20 or 30 cams …
Maybe the slave computer doesn't even need the high-end graphics card, if for example the USB speed is enough to record without visualization on the slave machine … it could all run over a 2.5 Gbit network connection or, if needed, a 10 Gbit connection … this could maybe be something for the future roadmap as a new crazy rare extra feature.
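To put some rough numbers on the 2.5 Gbit vs 10 Gbit question (a back-of-envelope sketch with my own assumptions, not measured Depthkit figures; I assume NFOV-unbinned 16-bit depth and worst-case uncompressed BGRA color):

```python
# Back-of-envelope bandwidth estimate for streaming raw sensor data.
# Resolutions and rates are Azure Kinect assumptions (NFOV-unbinned depth,
# 1080p color); uncompressed BGRA color is a worst case, as the sensors
# can also deliver compressed MJPEG color.
DEPTH_W, DEPTH_H, DEPTH_BYTES = 640, 576, 2    # 16-bit depth map
COLOR_W, COLOR_H, COLOR_BYTES = 1920, 1080, 4  # uncompressed BGRA 1080p
FPS = 30
CAMERAS = 10

depth_mbps = DEPTH_W * DEPTH_H * DEPTH_BYTES * FPS * 8 / 1e6
color_mbps = COLOR_W * COLOR_H * COLOR_BYTES * FPS * 8 / 1e6
total_mbps = CAMERAS * (depth_mbps + color_mbps)

print(f"per-camera depth:  {depth_mbps:.0f} Mbit/s")   # ~177 Mbit/s
print(f"per-camera color:  {color_mbps:.0f} Mbit/s")   # ~1991 Mbit/s
print(f"{CAMERAS} cameras total: {total_mbps:.0f} Mbit/s")
# Depth alone for 10 cameras (~1770 Mbit/s) fits under a 2.5 Gbit link,
# but uncompressed color pushes you toward 10 Gbit, or toward recording
# locally on each machine and transferring files afterwards.
```

This suggests the "record locally, merge later" model in the post above is the cheaper path bandwidth-wise, with the network mainly carrying sync and control.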
I am aware that the infrared light can again become a problem here, but it would be interesting to test at least … and maybe a solution could be to de-sync the sensors minimally, using the gaps in the pulse timing as a useful helper to keep one sensor from blinding another …
Hi @MartnD - Regarding your side question about more than 10 cameras, check out this post from a while back.
If you still have questions or ideas after reading that through, feel free to post on that thread, and we'll continue the discussion there to keep this one focused on capturing instruments.