
Question. Am I crazy or did I see somewhere support for 360 degree video? #404

Open
TheSashmo opened this issue Jul 31, 2024 · 9 comments

@TheSashmo

I am not sure where I saw it, but I could have sworn that I saw that there was stitching of some sort of various inputs. Or am I totally out to lunch (which is very possible).

@alatteri

alatteri commented Jul 31, 2024

They have some level of stereo and VR stitching, but I've not paid much attention.

https://www.facebook.com/UltraGrid/posts/732707365220888?ref=embed_post

@TheSashmo
Author

Thanks @alatteri, lightning fast as usual!

Yeah, I know the 3D/Stereo stuff, but I was looking at something and could have sworn I saw a stitching ability across sources.

@alatteri

There is a configure check for OpenXR VR Display, which I would think is some type of stitching.

@TheSashmo
Author

OpenXR, that might be it. I don't see any documentation on that. Now that you triggered my memory I think it's pano_gl.

I need to check that out.

@mpiatka
Collaborator

mpiatka commented Jul 31, 2024

We do support stitching a 360 degree panorama on nvidia GPUs with the gpustitch capture module and displaying it with the openxr_gl and pano_gl modules.

I'm not sure what state the documentation is in for these modules right now, but let me know what use cases you are interested in, and I'll try to help.

@TheSashmo
Author

Thanks @mpiatka, what methodology does it use to stitch the video? Is there some sort of configuration file that it uses, like ptgui, to create a map?

@TheSashmo
Author

I am able to get one source and use the pano_gl to work, but I don't see a way to have more than one source.

@mpiatka
Collaborator

mpiatka commented Aug 1, 2024

> I am able to get one source and use the pano_gl to work, but I don't see a way to have more than one source.

The pano_gl and openxr_gl modules are used to display an already stitched panorama and as such only take one source.

The stitching is done on the sender using the gpustitch capture module.

> what methodology does it use to stitch the video? Is there some sort of configuration file that it uses like ptgui to create a map?

The capture rig first needs to be calibrated to obtain the precise relative orientation of the cameras and the distortion coefficients of the optics. The UltraGrid gpustitch module then uses those to project all cameras into one 360 degree panorama and uses multi-band blending to make the seams between cameras smoother and less noticeable. Note that this approach works best if there aren't any objects moving close to the camera rig, as parallax will introduce artifacts.
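To give an idea of what multi-band blending does, here is a minimal 1-D Python/NumPy sketch of the technique: fine-detail bands are mixed with a sharp mask and coarse bands with a progressively smoother one, which hides the seam without ghosting detail. This is only an illustration of the general idea, not UltraGrid's actual CUDA implementation; the function names are made up.

```python
import numpy as np

def gaussian_blur(x, sigma=2.0):
    # simple 1-D Gaussian smoothing via convolution
    r = int(3 * sigma)
    k = np.exp(-0.5 * (np.arange(-r, r + 1) / sigma) ** 2)
    k /= k.sum()
    return np.convolve(x, k, mode="same")

def multiband_blend(a, b, mask, levels=4):
    """Blend signals a and b (mask=1 selects a) band by band."""
    out = np.zeros_like(a, dtype=float)
    ga, gb, gm = a.astype(float), b.astype(float), mask.astype(float)
    for _ in range(levels):
        sa, sb, sm = gaussian_blur(ga), gaussian_blur(gb), gaussian_blur(gm)
        # band = detail lost by smoothing at this level; blend it with
        # a mask of matching sharpness
        out += sm * (ga - sa) + (1 - sm) * (gb - sb)
        ga, gb, gm = sa, sb, sm
    # coarsest residual blended with the smoothest mask
    out += gm * ga + (1 - gm) * gb
    return out
```

Summing the blended bands telescopes back to a full-resolution result, so away from the seam each input passes through unchanged.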

Currently only rigs with fisheye type lenses are supported. We had nice results using four Blackmagic Micro Studio Camera 4K with the Laowa 4mm f/2.8 Fisheye lenses mounted on a 360RIZE 360Helios 4.

To obtain the calibration values we used Hugin, but ptgui seems to be based on the same Panotools backend, so that will probably work too. The gpustitch module then uses a configuration file in TOML format, which looks like this:

[[cameras]]
width = 3840
height = 2160
focal_len = 1185.429921
yaw = 0
pitch = 0
roll = 90
distortion = [-0.09994, 0.30684, -0.33116, 1.12426]
x_offset = -33.8
y_offset = 14.5
 
[[cameras]]
width = 3840
height = 2160
focal_len = 1185.429921
yaw = 90.366
pitch = 1.211
roll = 90.398
distortion = [-0.09994, 0.30684, -0.33116, 1.12426]
x_offset = -33.8
y_offset = 14.5
 
[[cameras]]
width = 3840
height = 2160
focal_len = 1185.429921
yaw = 178.857
pitch = 0.675
roll = 89.969
distortion = [-0.09994, 0.30684, -0.33116, 1.12426]
x_offset = -33.8
y_offset = 14.5
 
[[cameras]]
width = 3840
height = 2160
focal_len = 1185.429921
yaw = -90.549
pitch = 0.811
roll = 89.742
distortion = [-0.09994, 0.30684, -0.33116, 1.12426]
x_offset = -33.8
y_offset = 14.5

The focal_len is the focal length of the camera in pixels, calculated as focal length in millimeters * (picture width in px / sensor width in millimeters). The yaw, pitch and roll are the camera rotations as obtained in calibration. The distortion values are reported by Hugin as lens parameters a, b, c, with the last one defined as 1 - (a + b + c). The x and y offsets are reported in Hugin as lens parameters d and e.
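As a quick sanity check of the focal_len formula (the 12.95 mm sensor width below is an assumed illustrative value, not a measured one; real values come from your camera's datasheet and calibration):

```python
def focal_len_px(focal_mm, width_px, sensor_width_mm):
    # focal length in px = focal length in mm * (picture width in px / sensor width in mm)
    return focal_mm * width_px / sensor_width_mm

# hypothetical 4 mm fisheye on a 3840 px wide frame with a ~12.95 mm wide sensor:
print(focal_len_px(4.0, 3840, 12.95))  # roughly 1186, close to the calibrated value above
```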

The gpustitch module then takes the input as either a single tiled image (useful when using a Blackmagic card) or multiple independent sources. E.g.:

-t gpustitch:width=7680:fmt=UYVY:blend_algo=multiband:multiband_levels=4:rig_spec=rig_spec.toml:tiled -t decklink

or

-t gpustitch:width=7680:fmt=UYVY:blend_algo=multiband:multiband_levels=4:rig_spec=rig_spec.toml -t <camera one> -t <camera two> -t <camera three> -t <camera four>

@TheSashmo
Author

Thanks I will try to test this out this week!
