Dear community,
I have used an external tool to match two images, and I am trying to assess the accuracy of the tie points that were detected. I have also matched the images in Metashape, from which I obtained an interior and exterior orientation of high accuracy.
I used COLMAP to generate a 3D model (sparse point cloud) from my tie points and imported the model into Metashape. Now I am trying to "force" the previously estimated exterior and interior orientation in order to optimise the sparse point cloud (i.e. obtain better 3D coordinates of the tie points in object space).
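For context, one way I understand this could work (untested sketch, not an official recipe): the Metashape Python API exposes each camera's pose as a 4x4 camera-to-chunk matrix via `camera.transform`, which can be assigned directly, bypassing the reference-based adjustment. The helper below builds that matrix from a rotation matrix `R` and camera centre `C`; `pose_to_transform` and `set_camera_pose` are names I made up for illustration:

```python
def pose_to_transform(R, C):
    """Build a 4x4 camera-to-chunk matrix from a 3x3 rotation R
    (nested lists, camera-to-chunk convention) and camera centre
    C = (x, y, z) in chunk coordinates."""
    rows = [list(R[i]) + [C[i]] for i in range(3)]
    rows.append([0.0, 0.0, 0.0, 1.0])  # homogeneous bottom row
    return rows

def set_camera_pose(camera, R, C):
    """Assign the pose directly to a Metashape camera.
    Only works inside the Metashape Python environment."""
    import Metashape  # assumption: run from Metashape's console/script
    camera.transform = Metashape.Matrix(pose_to_transform(R, C))
```

Note that a directly assigned `camera.transform` will be recomputed if you later run the bundle adjustment (Optimize Cameras), so this only "fixes" the pose as long as you do not re-optimize.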
I managed to fix the interior orientation via the Calibration tool; however, I don't know how to fix the exterior orientation. My first question is: can we do it at all? In the GUI, or preferably in the Python API?
In the GUI, I have managed to manually set X, Y, Z, Yaw, Pitch, Roll values for the cameras (using F2). I set a very high accuracy, and this way I got closer to the exterior orientation I wanted, but not the exact values.
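The scripted equivalent of what I did in the GUI would be something like the sketch below (untested, attribute names as I understand the Metashape Python API): write the known pose into each `camera.reference`, set very tight per-camera accuracies, and re-optimize with the calibration parameters frozen. The `poses` dict and the `ypr_to_matrix` helper are my own illustrative additions, and the Z-Y-X angle convention is an assumption:

```python
import math

def ypr_to_matrix(yaw, pitch, roll):
    """Yaw/pitch/roll in degrees -> 3x3 rotation matrix (nested lists),
    assuming the Z-Y-X order R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    y, p, r = (math.radians(a) for a in (yaw, pitch, roll))
    cy, sy = math.cos(y), math.sin(y)
    cp, sp = math.cos(p), math.sin(p)
    cr, sr = math.cos(r), math.sin(r)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def force_exterior_orientation(chunk, poses):
    """poses: camera label -> (x, y, z, yaw, pitch, roll).
    Writes the values into camera.reference with very tight accuracies,
    then re-optimizes. The adjustment is strongly constrained but may
    still move the cameras slightly (it is a weighted fit, not a hard
    constraint)."""
    import Metashape  # assumption: run from Metashape's console/script
    for camera in chunk.cameras:
        if camera.label not in poses:
            continue
        x, y, z, yaw, pitch, roll = poses[camera.label]
        camera.reference.location = Metashape.Vector([x, y, z])
        camera.reference.rotation = Metashape.Vector([yaw, pitch, roll])
        camera.reference.enabled = True
        camera.reference.rotation_enabled = True
        # per-camera accuracies override the chunk-wide defaults
        camera.reference.location_accuracy = Metashape.Vector([1e-4] * 3)
        camera.reference.rotation_accuracy = Metashape.Vector([1e-4] * 3)
    # freeze the (already fixed) interior orientation during adjustment
    chunk.optimizeCameras(fit_f=False, fit_cx=False, fit_cy=False,
                          fit_k1=False, fit_k2=False, fit_k3=False,
                          fit_p1=False, fit_p2=False)
```

This mirrors my GUI experiment: even with accuracies of 1e-4 the result should be very close to the target pose, but as far as I can tell the reference route never pins the values exactly.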
Thanks for your help