Am I alone in this?
Has anyone used this in a Unity 2019.1.x HDRP project?
Need help with this Git issue:
“This project uses Git support on Package Manager to import external packages. To enable the functionality, Git must be installed on the system. See the forum thread for further details.”
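(In case it helps anyone hitting the same message: the usual fix is simply to install Git from https://git-scm.com, make sure it is on the system PATH, and then restart Unity Hub and the editor. A quick way to check from a regular terminal or Command Prompt:)

```bash
# Verify Git is installed and visible on the PATH;
# Unity's Package Manager needs this to fetch Git-based packages.
git --version
```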
Hello @NickVenden and @kkukshtel, hope you’re well in this strange time!
New here; I just started experimenting with Depthkit (Kinect v2 + Windows) last month, so excuse my lack of experience.
I’m a video installation artist working in VR for a practice-based PhD project, in collaboration with a sound artist and a circus performer. We would love to build on Keijiro Takahashi’s DKvfx template as it looks great for our project.
QUESTION: I have the DKvfx sample project open and working in Unity 2019.3.4, but I can’t figure out whether my Kinect v2 clips can be played through Keijiro’s KlakHap player in the existing ‘Test’ scene, or whether I need to import something else into the project for this to work. His clip is already converted to a .mov file, and (I think) the two other Depth files are converted to render textures. Again, I could be missing something very basic here, so excuse this!
Looking forward to hearing about the upcoming Unity HDRP Depthkit plugin as well; I saw several people have been asking about it.
Hi @OliviaMcGilchrist! It sounds like you may just need to re-encode your Depthkit clips. I recommend checking out this resource on transcoding to HAP. I hope this is helpful!
Hi @jillianmorrow, it worked, so thanks for the tip!
For those who might need this: Depthkit .MP4 clip conversion to HAP works with Adobe Media Encoder.
For those without Adobe access, free HAP conversion is possible on Mac with ‘AVF Batch Exporter’; the application is available here: https://github.com/Vidvox/hap-in-avfoundation/releases
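If you’re comfortable with the command line, ffmpeg also ships with a HAP encoder, so something along these lines should work as well (filenames below are just placeholders; check the ffmpeg documentation for the exact options your build supports):

```bash
# Re-encode a Depthkit .mp4 to plain HAP in a QuickTime (.mov) container.
# Use "-format hap_q" instead for higher quality at a larger file size.
ffmpeg -i depthkit_clip.mp4 -c:v hap -format hap depthkit_clip_hap.mov
```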
Hope this helps others!
Hi again @NickVenden and @jillianmorrow, hope things are ok for you atm.
Still exploring the Dkvfx project, and I’m having a challenge creating more than one instance of the Hap player, as I’m not a coder yet. Any suggestions would be great if you have some leads.
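In case it’s useful to anyone trying the same thing: one simple, not-very-coder-y approach is to duplicate the whole working player object, either by copy-pasting it in the Hierarchy or with a small script like this rough sketch (the class and field names here are just placeholders, not anything from Keijiro’s packages):

```csharp
using UnityEngine;

// Rough sketch only: spawns extra copies of an already-working Hap player
// object so several clips can sit side by side in the scene.
// Drag your working player object (or a prefab of it) onto "playerTemplate".
public class HapPlayerSpawner : MonoBehaviour
{
    [SerializeField] GameObject playerTemplate; // the object that already plays one clip
    [SerializeField] int copies = 2;            // how many extra instances to create
    [SerializeField] float spacing = 2f;        // distance between instances, in metres

    void Start()
    {
        for (var i = 1; i <= copies; i++)
        {
            var clone = Instantiate(playerTemplate, transform);
            clone.transform.localPosition = new Vector3(i * spacing, 0f, 0f);
            clone.name = playerTemplate.name + " (copy " + i + ")";
        }
    }
}
```

One caveat: if the player writes its colour/depth into shared render texture assets that the VFX reads, each copy will likely need its own render textures (and its own VFX referencing them), otherwise the copies will overwrite each other.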
I got the HDRP demo working: started a new HDRP project, dropped Keijiro’s files into the Assets folder, and changed the manifest.json file as instructed in his Git repo. I found I had to tweak the file path to the video Test.mov (I’m on Windows) and set it to the local filesystem with an absolute path to the video. I then selected the Vfx object, clicked Edit on the Asset Template, and it worked.
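In case it saves someone a step: the file to edit is <project root>/Packages/manifest.json, and (assuming the scoped-registry setup Keijiro’s packages use) the addition looks roughly like the sketch below. Copy the exact package names and versions from his README, as the entries here are only illustrative placeholders:

```json
{
  "scopedRegistries": [
    {
      "name": "Keijiro",
      "url": "https://registry.npmjs.com",
      "scopes": [ "jp.keijiro" ]
    }
  ],
  "dependencies": {
    "jp.keijiro.klak.hap": "x.x.x"
  }
}
```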
Been tinkering with the Vfx, but I would love to get this working with URP, as @AndrewWhitney has done with native Depthkit clips here.
Hi @BenNeal, excuse my delayed response, and that’s great! I’m open to being in touch about this, as it can be quite challenging. I’m still new to URP and have left it alone for a while as I’m writing a PhD thesis, but it’s really helpful & motivating to hear how other people are figuring this out! I’m doing a few tests this week, so I’ll get back into it soon! Cheers from Montréal! (And I lived in the UK for 10 years.)
Hi everyone, I’ve got to grips with Keijiro’s DKVFX package, which is incredible. I have been using Depthkit and Keijiro’s DKVFX (and the sketches project) to create animated 3D assets that we plan on using in an AR mobile app (iOS and Android), but we have hit a dead end with HDRP / Hap Player and rendering out for mobile. Has anyone used HDRP / Hap Player, or the Universal Render Pipeline, with this template to create assets for mobile, and if so, can you help? Desperate to animate some looks for export; the effects are simply stunning.
Hi there @RobertGraham, and excuse the slow reply! I’m glad to hear this and would love to know more! I don’t have any info that could help, but have you tried contacting Keijiro Takahashi directly? Have you tried this from his GitHub as well: https://github.com/keijiro/KlakHap
If you want to communicate directly, I’m happy to share my email or WhatsApp.
All the best!
Olivia
I began my journey into dkvfx this week. I wanted to test some of my brand new Depthkit Cinema captures. I’m currently running into an issue where Unity crashes after I type in the file name of my clip to insert it into Keijiro’s test project. I’ve done some troubleshooting and realized that if I export smaller-resolution clips, Unity does not crash when I insert the clip into dkvfx, and the file works. However, because I have changed the resolution of my combined-per-pixel clip, the metadata no longer matches.
Have any of you run into issues like this? To provide a bit more context, my combined-per-pixel clips out of Depthkit come in various resolutions, but most are around 1680 x 4112. I’ve re-exported some clips at around 347/376 x 848 and they seem to work, but again the metadata values are off and the clip is distorted.
I’m using a HAP encoder for Adobe Media Encoder that I found on GitHub, and I’m re-encoding my Depthkit clips with plain HAP. I’m going to try a different encoder later today just in case that is the problem, but I really feel like the resolution is the issue, since some clips work and others do not. If I have to make smaller-resolution clips, how do I convert the metadata to reflect these values? I feel like there are far too many values in the metadata to change, but I could be wrong.
Hi All,
Somewhat late to the party here, but I’ve been trying to get this to work with some multi-sensor footage and am wondering just how many things I must be getting wrong! Is there a tutorial on the basics of how to set this up correctly? I’ve converted my video to the HAP codec, but do I also need to convert the Depthkit text file associated with the video?
I’m just now starting my multi-capture journey, so I’m very interested to hear about your progress! As you can see, I did dabble in single-sensor capture with Keijiro’s. I basically just used his example files and plugged in my footage. Anyway, things have changed like crazy since multi-capture, as I’m sure you’ve seen! Have you followed this tutorial yet?
Hey Andrew, thanks for getting in touch and thanks for the link! I managed to get started with the workflow using the short video tutorial provided on YouTube, but this is much better. My multi-sensor escapades have been ongoing and something of a stab in the dark, as I’m not a hardcore dev by any means, so it’s slow going at times. I’ve got a multitude of problems, including edge clean-up, inheriting lighting, and now this! Haha. I’ll work through that link though! Maybe we can swap notes in a “sidebar”.
Cheers
Keith
As @Andrew mentioned, our recommended path to VFX Looks is with our Depthkit Core & Studio Expansion Packages, which seamlessly integrate with Unity’s VFX Graph.
We are currently wrapping up documentation of our newest version (Phase 8) of these packages, which include additional tools for edge blending. The Core packages will be available to anyone who purchased the one-off license and the Studio packages will be available to current Depthkit Studio customers in the coming weeks.
Hi @CoryAllen, good to hear from you,
Thanks for the heads-up on the new material. Having been subscribed to the Summer Start Programme in the past (I’m currently sitting back here in “coach”), does this mean that I will be able to update my Core packages to take advantage of some of these updates?