Hello,
I'm working on a project with a sequence of geolocated images (EPSG:4326 coordinates plus elevation). After image alignment, the estimated yaw, pitch and roll angles shown in the "Reference" panel seem to correspond to a rotation from an Earth-centered Cartesian frame (ECEF) to the individual camera frames.
After converting the image coordinates to a projected Cartesian coordinate system (EPSG:2154), I would expect the yaw, pitch and roll angles to be computed relative to the EPSG:2154 frame, but they don't seem to change, even after realigning the images.
I used the Python API to access the transformation between the ECEF Cartesian frame and the chunk, and the transformations between the chunk and the different camera frames, but I didn't manage to integrate the EPSG:2154 information.
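For context, this is roughly the kind of access I am doing (a minimal sketch against the standard Metashape Python module; the EPSG:2154 part is missing because that is exactly where I am stuck):

```python
import Metashape

chunk = Metashape.app.document.chunk

# chunk.transform.matrix maps the chunk's internal frame to the geocentric (ECEF) frame
T = chunk.transform.matrix

for camera in chunk.cameras:
    if camera.transform is None:
        continue  # camera was not aligned
    # camera.transform maps the camera frame into the chunk frame,
    # so T * camera.transform maps the camera frame into ECEF
    cam_to_ecef = T * camera.transform
    print(camera.label, cam_to_ecef)
```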
How can I obtain yaw, pitch and roll values that correspond to a rotation expressed in the EPSG:2154 coordinate system, such that when all angles are 0 the camera frame is aligned with the West→East, South→North and bottom→up axes?
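To make the convention I'm after explicit, this is the decomposition I have in mind once I have a 3x3 rotation R from the camera frame to that local East/North/Up frame (plain math, no Metashape calls; the yaw here is measured about the up axis from the east axis, which is just to pin down what "all angles 0" means and not necessarily the convention used in the Reference panel):

```python
import math

def ypr_from_rotation(R):
    """Decompose a 3x3 rotation R (camera axes -> East/North/Up axes,
    given as a list of rows) as R = Rz(yaw) * Ry(pitch) * Rx(roll),
    returning the angles in degrees.

    With this convention all three angles are 0 exactly when the camera
    axes coincide with the West->East, South->North and bottom->up axes.
    """
    yaw = math.degrees(math.atan2(R[1][0], R[0][0]))
    pitch = math.degrees(-math.asin(max(-1.0, min(1.0, R[2][0]))))
    roll = math.degrees(math.atan2(R[2][1], R[2][2]))
    return yaw, pitch, roll
```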
Thank you