Using the panorama tutorial as a reference:

https://agisoft.freshdesk.com/support/solutions/articles/31000148830

We currently use Hugin to generate spherical panoramas, and it is always able to orient the panorama so that the horizon is level. Occasionally that process fails, so I decided to try Metashape, but I am running into issues where the model is slightly rotated and the horizon is no longer level.

Step 4 in the tutorial above shows how to correct this manually, but I would like to do it programmatically. I have figured out that you can apply a rotation on panorama export (Tasks.ExportPanorama.rotation); my problem is determining the rotation matrix to apply so that the model is aligned correctly. My question is: how can I determine this rotation using the Python API?

Our datasets are captured by drones, so I believe I should at least be able to determine the up/down vector of the camera rig and then rotate the model so that this vector is aligned with one of Metashape's axes. I see two cases:

1. If the drone only took horizontal images (gimbal angle of 0), I would select two camera view vectors that are roughly 90 degrees apart and take their cross product to get a vector orthogonal to both.

2. Most of the time, the drone will have taken horizontal images as well as images with a non-zero gimbal angle (down to -90 degrees). In this case I think I can sum all the view vectors, which should cancel out the horizontal components and leave a vector pointing up/down.
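In case it helps clarify what I'm after, here is a rough sketch of the math I have in mind, using NumPy stand-ins for the view vectors. In Metashape I would expect to pull each view direction from the rotation part of camera.transform, but whether that is the right call is part of my question; the function name here is just illustrative.

```python
import numpy as np

def rotation_aligning(a, b):
    """Return a 3x3 rotation matrix that rotates vector a onto vector b
    (Rodrigues' rotation formula)."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    v = np.cross(a, b)           # rotation axis (unnormalized)
    c = np.dot(a, b)             # cosine of the rotation angle
    s = np.linalg.norm(v)        # sine of the rotation angle
    if s < 1e-12:
        # Vectors already aligned; the exactly-opposite case would need a
        # 180-degree rotation about any perpendicular axis (not handled here).
        return np.eye(3)
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + K + K @ K * ((1.0 - c) / s**2)

# Case 1: two roughly perpendicular horizontal view vectors -> their cross
# product gives an estimate of the up/down direction.
view_a = np.array([1.0, 0.05, 0.0])
view_b = np.array([0.0, 1.0, 0.05])
up_est = np.cross(view_a, view_b)

# Case 2: many view vectors with mixed gimbal angles -> their sum cancels the
# horizontal components and leaves a (down-pointing) vertical estimate.
views = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, -1], [0, -1, -1]], dtype=float)
up_est2 = -views.sum(axis=0)     # negate so the estimate points up

# The rotation that should level the model: align the estimated up vector
# with the +Z axis.
R = rotation_aligning(up_est, np.array([0.0, 0.0, 1.0]))
```

The resulting R (as a Metashape.Matrix) is what I would hope to feed into the export step, but I don't know if this is the intended approach.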

Or is there an easier way to get this? Does the camera station derive its orientation from all the camera positions?

Thanks,

Tristen