
How to annotate keypoints in NDDS in the format specified in DREAM? #27

Open
Shashank-Prakash9 opened this issue Apr 5, 2023 · 4 comments

Comments

@Shashank-Prakash9
I have installed NDDS, but there seems to be no way of marking the keypoints. Any help on this matter would be greatly appreciated.

@tabula-rosa
Collaborator

Hi Shashank, thank you for your interest in DREAM!

For generating the DREAM synthetic datasets, we used the NDDS plug-in for Unreal Engine to export the keypoint information. Unfortunately, I no longer have access to this simulator, and it was not open-sourced for this project, so I am afraid that I can't provide support specifically for NDDS.

I would suggest posting your issue to the NDDS repository (https://github.com/NVIDIA/Dataset_Synthesizer) as they may be better equipped to provide support.

If there are questions about DREAM-specific usage, I might be able to help, but for NDDS more generally, I'm sorry I can't provide further support!

@TontonTremblay
Collaborator

Yeah, we could not make the NDDS + robots setup public. I would suggest using nvisii + pybullet; I used that combination to generate the Watch It Move data. I could probably push a script in the next couple of weeks that generates the right data, but I cannot guarantee it.

@Shashank-Prakash9
Author

Thank you so much, Tabitha and Jonathan, for your responses. Jonathan, I will definitely explore nvisii and pybullet for the data generation process. My only further question: was the keypoint annotation a feature your team added to NDDS, or was it already available? I ask because I already have a robot model loaded but haven't been able to annotate keypoints.

@TontonTremblay
Collaborator

It was a feature we added to NDDS. Back then we had experts in UE4 helping us build the features we needed; I'm afraid my own skills are too limited to add such a feature now. The idea is that you label a specific point on an asset and have it exported in image space. You can easily do that with nvisii and the script I wrote.

https://github.com/NVlabs/Deep_Object_Pose/tree/master/scripts/nvisii_data_gen will get you about 80% of the way there.

Things to update:

  1. Load the robot instead of the normal objects: https://github.com/NVlabs/Deep_Object_Pose/blob/master/scripts/nvisii_data_gen/single_video_pybullet.py#L410
  2. Update the robot pose and the corresponding nvisii object state: https://github.com/owl-project/NVISII/blob/master/examples/24.urdf.py is a skeleton showing how to achieve this.
  3. Export the joint position keypoints in image space: https://github.com/NVlabs/Deep_Object_Pose/blob/43b685062e79caae921438a220a133895931261c/scripts/nvisii_data_gen/utils.py#L998 does this for the cuboid around an object, but the way the cuboid is created could simply be hacked to add a keypoint at (0, 0, 0) of the joint in its local frame and then export that single child.

These steps should be fairly easy to hack together. Sorry for the vague directions; this assumes quite a bit of 3D knowledge, and if that's not the case it might be a lot to debug alone. If you decide to go in that direction, I would be happy to answer any questions.
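To illustrate the math behind step 3: a keypoint placed at (0, 0, 0) of a joint's local frame is just the translation part of that joint's world pose, which can then be projected into image space with a pinhole camera model. This is only a minimal NumPy sketch of that projection; the intrinsics and joint poses below are made-up example values, not something read from the DOPE/nvisii scripts, and the actual camera convention used there may differ.

```python
import numpy as np

def joint_keypoint_world(joint_world_pose):
    """Keypoint at the joint's local origin = translation of its 4x4 world pose."""
    return joint_world_pose[:3, 3]

def project_points(points_world, cam_from_world, K):
    """Project Nx3 world points to Nx2 pixel coordinates.

    cam_from_world: 4x4 extrinsic matrix (world -> camera frame)
    K: 3x3 pinhole intrinsics, camera looking along +z (OpenCV convention)
    """
    pts = np.asarray(points_world, dtype=float)
    # World -> camera frame, in homogeneous coordinates
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    pts_cam = (cam_from_world @ pts_h.T).T[:, :3]
    # Perspective divide, then apply intrinsics
    uv = (K @ (pts_cam / pts_cam[:, 2:3]).T).T
    return uv[:, :2]

# Example values (assumed, for illustration only): 640x480 image,
# focal length 500 px, camera at the world origin (identity extrinsics).
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
cam_from_world = np.eye(4)

# Two fake "joint" poses, 2 m in front of the camera
pose_a = np.eye(4); pose_a[:3, 3] = [0.0, 0.0, 2.0]
pose_b = np.eye(4); pose_b[:3, 3] = [0.5, 0.0, 2.0]

keypoints_3d = [joint_keypoint_world(pose_a), joint_keypoint_world(pose_b)]
uv = project_points(keypoints_3d, cam_from_world, K)
# pose_a projects to the principal point (320, 240);
# pose_b is offset by 500 * (0.5 / 2.0) = 125 px in u, giving (445, 240).
```

In the real pipeline, `cam_from_world` would come from the nvisii camera transform and the joint poses from pybullet; the projection step itself stays the same.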
